CN114055501A - Robot grabbing system and control method thereof - Google Patents


Info

Publication number
CN114055501A
CN114055501A (application CN202111358870.2A)
Authority
CN
China
Prior art keywords
workpiece
industrial robot
camera
calibration
axis industrial
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111358870.2A
Other languages
Chinese (zh)
Inventor
马国庆 (Ma Guoqing)
刘丽 (Liu Li)
刘贵军 (Liu Guijun)
吕源辉 (Lyu Yuanhui)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Changchun University of Science and Technology
Wuhu Hit Robot Technology Research Institute Co Ltd
Original Assignee
Changchun University of Science and Technology
Wuhu Hit Robot Technology Research Institute Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Changchun University of Science and Technology, Wuhu Hit Robot Technology Research Institute Co Ltd filed Critical Changchun University of Science and Technology
Priority to CN202111358870.2A
Publication of CN114055501A
Legal status: Pending

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1694 Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
    • B25J9/1697 Vision controlled systems
    • B25J15/00 Gripping heads and other end effectors
    • B25J19/00 Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
    • B25J19/02 Sensing devices
    • B25J19/021 Optical sensing devices
    • B25J19/023 Optical sensing devices including video camera means

Abstract

The invention discloses a robot grabbing system and a control method thereof. A robot, a binocular camera, and an object recognition and positioning module pre-installed in an upper computer controller are combined: the object recognition and positioning module uses a deep learning algorithm to extract and train on various postures of a workpiece; a machine vision system performs visual analysis to quickly find the position of the workpiece; the binocular camera is then used for three-dimensional reconstruction to obtain the three-dimensional coordinates and pose data of the workpiece; and a six-axis industrial robot is guided to grab it. On the one hand, the speed and accuracy of finding the workpiece are guaranteed; on the other hand, the six-axis industrial robot replaces manual labor, achieving unmanned operation, improving the efficiency of workpiece recognition and grabbing, and meeting the requirements of intelligence, efficiency, and reliability for future industrial production lines.

Description

Robot grabbing system and control method thereof
Technical Field
The invention relates to a robot grabbing system and a control method thereof, which can identify and position objects with complex structures or stacked objects, and belongs to the field of industrial automated production.
Background
At present, industrial robots are used in China's large-scale production workshops to replace workers in repetitive and tedious work, but the robots installed in factories suffer from a low degree of intelligence and cannot independently complete complex tasks. Realizing unmanned automatic grabbing of objects is therefore a key focus of current research in the industrial robot field.
The key to automatic grabbing is accurate identification and positioning of the object. Using an industrial camera in place of human eyes saves labor and improves production efficiency. A search of the prior art shows that existing industrial robots have weak object identification and positioning capabilities when grabbing: they cannot obtain the depth information of an object, and stacked objects cause positioning errors. The invention patent CN201811284359.0, published on March 12, 2019 and entitled "Robot grabbing system based on visual guidance", uses a visual guidance system comprising a CCD industrial camera that communicates with a computer over gigabit Ethernet and is mounted directly above a conveyor belt. The lens is a fixed-focus lens. Image information acquired by the CCD camera is converted into a robot control signal after workpiece recognition by an image processing algorithm, so as to control the actual position of the robot end effector. The six-axis industrial robot is driven by AC servo motors and performs grabbing operations on a production line. Although that mechanism realizes automatic grabbing of workpieces and identification of objects and improves production efficiency, the traditional template matching algorithm it uses is slow at finding objects, its monocular camera cannot acquire the depth information of a workpiece, and accuracy is lost when objects are stacked.
Disclosure of Invention
Aiming at the problem that existing robot grabbing systems cannot accurately identify and grab industrial objects with complex structures or objects that are difficult to identify, the invention provides a robot grabbing system and a control method thereof with a simple structure, high integration, high speed, high accuracy, and complete functions, which can accurately identify and position hard-to-identify objects such as those with complex structures or in stacks.
The purpose of the invention is realized by the following technical scheme:
A robot grabbing system comprises a six-axis industrial robot, a clamping jaw, an air pump, two identical monocular cameras, a working platform, and an upper computer controller;
the clamping jaw is mounted at the end of the six-axis industrial robot; the air pump is connected to the six-axis industrial robot and controls the opening and closing of the clamping jaw; the two identical monocular cameras are mounted in parallel on a flange of the six-axis industrial robot by a clamp; the working platform is used for placing the workpiece to be grabbed; the two monocular cameras are each in communication with the upper computer controller over gigabit Ethernet, and the control box of the six-axis industrial robot communicates with the upper computer controller over Ethernet;
an object identification and positioning module is pre-installed in the upper computer controller; the object recognition and positioning module uses a deep learning algorithm to extract and train on various postures of the workpiece, establishing a training model; it receives the images shot by the two monocular cameras, recognizes and positions the workpiece in the images through the training model, converts the pixel coordinates of the workpiece into three-dimensional coordinates in the robot base coordinate system by combining the internal and external parameter calibration results and hand-eye calibration results of the two monocular cameras, and finally sends an instruction to the six-axis industrial robot to grab the workpiece and move it to the target.
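The control flow of the module described above can be sketched as a single grab cycle. All function names below are hypothetical stand-ins for the patent's subsystems, not an implementation of the actual controller:

```python
def grab_cycle(capture, detect, to_base_coords, send_grab):
    """One grab cycle of the upper computer controller (illustrative sketch).

    Every argument is a hypothetical callable standing in for a subsystem:
    capture        -> returns images from the two monocular cameras
    detect         -> the deep-learning recognition model (finds the workpiece)
    to_base_coords -> stereo reconstruction combined with calibration results
    send_grab      -> the command link to the six-axis industrial robot
    """
    left, right = capture()                 # acquire a stereo image pair
    box = detect(left)                      # locate the workpiece in the image
    xyz = to_base_coords(box, left, right)  # pixel -> robot-base coordinates
    return send_grab(xyz)                   # issue the grab command
```

The stages can be wired to stubs for testing, e.g. `grab_cycle(lambda: ("L", "R"), lambda img: (10, 20, 30, 40), lambda box, l, r: (1.0, 2.0, 3.0), lambda xyz: ("grab", xyz))`.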
Further, the clamping jaw is mounted at the end of the six-axis industrial robot through a connecting flange and is connected with the air pump.
Further, the connecting flange is of a double-layer structure: the first layer connects to the rotating wrist at the end of the six-axis industrial robot, and the second layer connects to the clamping jaw.
Further, the monocular camera is a MindVision monocular industrial camera.
Further, the photographing positions of the two monocular cameras are 50cm-80cm above the workpiece.
The invention also provides a control method of the robot gripping system, which comprises the following steps:
step one, monocular camera calibration: a plane template with circular features is selected as the calibration reference, and each monocular camera is calibrated to obtain its internal parameters, including the camera's distortion coefficients, focal length, and the width and height of a single pixel;
step two, binocular camera calibration: binocular calibration is performed on the two monocular cameras mounted in parallel to obtain the cameras' external parameters, including the transformation matrix between the two monocular cameras and the distance between their optical centers;
step three, hand-eye calibration: hand-eye calibration is performed between the two monocular cameras mounted at the end of the six-axis industrial robot and the robot itself, to obtain the transformation matrix relating coordinates in the cameras' images to the base coordinate system of the six-axis industrial robot;
step four, workpiece image database establishment and network model training: images of the workpiece are shot in different states and at different heights, then labeled and trained to obtain a training model; through the training model the workpiece can be found in a workpiece image and its labeling information displayed;
step five, workpiece image acquisition: after the upper computer controller resets the six-axis industrial robot to its reset point, the robot carries the two monocular cameras to the photographing point above the working platform in the photographing area, and the left and right monocular cameras photograph the workpiece to acquire workpiece images;
step six, image processing and workpiece positioning: the upper computer controller receives the images shot by the two monocular cameras, finds the workpiece in the images through the training model obtained in step four, and frames the workpiece in the images; features of the framed workpiece are extracted from both camera images, yielding the workpiece's key points; the key points are three-dimensionally reconstructed using the hand-eye calibration and camera calibration results, giving the three-dimensional coordinates of the workpiece's key points in the six-axis industrial robot base coordinate system;
step seven, grabbing the workpiece: and the upper computer controller sends a command to the six-axis industrial robot, and the six-axis industrial robot controls the gripper to grip the workpiece according to the received three-dimensional coordinate of the workpiece.
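A minimal sketch of the depth recovery in step six, assuming an ideally rectified, parallel stereo pair (all variable names and numbers are illustrative assumptions, not values from the patent): the disparity between the left and right pixel columns gives depth by similar triangles, and the intrinsics from step one recover the lateral coordinates.

```python
def triangulate(u_l, v_l, u_r, f, cx, cy, baseline):
    """Recover camera-frame 3-D coordinates of a matched key point
    from a rectified, parallel stereo pair.

    u_l, v_l : pixel coordinates of the point in the left image
    u_r      : pixel column of the same point in the right image
    f        : focal length in pixels (from monocular calibration, step one)
    cx, cy   : principal point of the left camera
    baseline : optical-centre distance of the two cameras (step two), in mm
    """
    disparity = u_l - u_r            # horizontal shift between the two views
    if disparity <= 0:
        raise ValueError("point must be in front of both cameras")
    z = f * baseline / disparity     # depth from similar triangles
    x = (u_l - cx) * z / f           # lateral offset scaled by depth
    y = (v_l - cy) * z / f
    return x, y, z
```

The resulting (x, y, z) is expressed in the left camera frame; the hand-eye result from step three then maps it into the robot base frame for the grab command.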
Further, the monocular camera calibration in step one adopts the Zhang Zhengyou camera calibration method, specifically:
firstly, the robot is moved so that the axis of the monocular camera is perpendicular to the working platform and centered above it; a plane template with circular features is selected as the calibration reference, and pictures of the calibration template are taken with the camera at different angles;
corner detection is performed on the feature points in the images to obtain the pixel coordinates of the calibration board corner points, and the object coordinates of those corner points are calculated from the known size of the calibration pattern and the origin of the world coordinate system;
the transformation matrix is solved from the relation between the physical coordinate values and the pixel coordinate values, and the camera's internal parameters and distortion coefficients are then obtained.
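The quantities recovered in step one fit the standard pinhole-plus-radial-distortion model. Below is a sketch of the forward projection that Zhang-style calibration inverts; the numeric values in the usage example are invented for illustration:

```python
def project(point_xyz, f, cx, cy, k1=0.0, k2=0.0):
    """Pinhole projection with radial distortion: the forward model whose
    parameters (f, cx, cy, k1, k2) monocular calibration recovers.

    point_xyz : 3-D point in the camera frame
    f         : focal length in pixels
    cx, cy    : principal point
    k1, k2    : radial distortion coefficients
    """
    X, Y, Z = point_xyz
    x, y = X / Z, Y / Z                   # normalised image coordinates
    r2 = x * x + y * y
    d = 1.0 + k1 * r2 + k2 * r2 * r2      # radial distortion factor
    return f * x * d + cx, f * y * d + cy
```

For example, with f = 1000 px and principal point (640, 480), a point 500 mm in front of the camera and 125 mm to the side projects to pixel (890, 480); calibration fits these parameters so that projected board points match their detected pixel coordinates.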
Further, the binocular camera calibration process of step two: the two monocular cameras are mounted in parallel on the clamp at the end of the six-axis industrial robot, and the mechanical arm is placed in its working photographing pose; then, combining the internal parameters of the two cameras obtained by monocular calibration, the external parameters of the two cameras, i.e., their relative transformation matrix, are calculated.
Further, the hand-eye calibration process of step three: the two monocular cameras are mounted in parallel on the clamp at the end of the six-axis industrial robot, and the mechanical arm is placed in its working photographing pose; keeping the calibration plate still, the mechanical arm is moved so that the cameras photograph the stationary calibration plate from different poses, and the x, y, z, rx, ry, rz readings on the six-axis industrial robot teach pendant are recorded for each shot; from this, the transformation matrix of the camera coordinate system relative to the six-axis industrial robot base coordinate system is obtained.
Further, the hand-eye calibration calculation involves four transformation matrices: the transformation matrix from the six-axis industrial robot base coordinate system to the calibration plate coordinate system, T_cal←base; the transformation matrix from the calibration plate coordinate system to the camera coordinate system, T_cam←cal; the transformation matrix from the six-axis industrial robot end coordinate system to the camera coordinate system, T_cam←end; and the transformation matrix from the six-axis industrial robot end coordinate system to the six-axis industrial robot base coordinate system, T_base←end.
For each image taken by the camera, the chain of transformations gives the calculation equation
T_cam←end = T_cam←cal · T_cal←base · T_base←end.
T_cam←cal is solved directly from the captured calibration plate image together with the internal parameters of the camera; T_base←end is read directly from the teach pendant of the six-axis industrial robot; T_cal←base is unknown, but because the calibration plate does not move it is the same in every image; T_cam←end is a fixed value.
The calculation equation above can be written for each image, and the values of T_cal←base and T_cam←end can then be solved from the equations of multiple images.
Finally, by the equation
T_cam←base = T_cam←end · (T_base←end)^(-1),
the transformation matrix from the six-axis industrial robot base coordinate system to the camera coordinate system, T_cam←base, is solved.
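The consistency of this multi-image solve can be checked numerically. The sketch below uses synthetic ground truth (all rotations and translations invented for illustration) with plain 4x4 homogeneous matrices; `A`, `B`, `X`, `C` stand for the four matrices involved in hand-eye calibration: base to calibration plate (fixed, since the plate never moves), plate to camera (one per image), robot end to camera (the fixed eye-in-hand unknown), and robot end to base (read from the teach pendant for each pose).

```python
import numpy as np

def hom(R, t):
    """Assemble a 4x4 homogeneous transform from a 3x3 rotation and a translation."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def rot_z(a):
    """Rotation about the z axis by angle a (radians)."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

# Synthetic ground truth for the two constant unknowns:
A = hom(rot_z(0.3), [100.0, 0.0, 0.0])    # base -> calibration plate (plate is fixed)
X = hom(rot_z(-0.2), [0.0, 50.0, 80.0])   # robot end -> camera (eye-in-hand, fixed)

Cs, Bs = [], []   # per-pose pendant readings and per-image plate->camera results
for ang, t in [(0.5, [300.0, 200.0, 400.0]), (1.1, [250.0, 260.0, 380.0])]:
    C = hom(rot_z(ang), t)                          # end -> base, from the pendant
    B = X @ np.linalg.inv(C) @ np.linalg.inv(A)     # plate -> camera this image yields
    Cs.append(C)
    Bs.append(B)

# The per-image chain X = B_i * A * C_i returns the same X from every image,
# which is what makes solving over multiple images possible:
X_from_img0 = Bs[0] @ A @ Cs[0]
X_from_img1 = Bs[1] @ A @ Cs[1]

# Final step: base -> camera for pose i, via T_i = X * C_i^-1:
T0 = X_from_img0 @ np.linalg.inv(Cs[0])
```

With real data, A and X are not known in advance; they are what a least-squares solve over many such per-image equations recovers.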
The invention has the following beneficial effects:
the invention provides a robot grabbing system and a control method thereof.A robot is combined with an object recognition and positioning module pre-installed in an upper computer controller, the object recognition and positioning module adopts a deep learning algorithm to extract and train various postures of a workpiece, a machine vision system is used for carrying out vision analysis operation to quickly find the position of the workpiece, then the characteristics of a binocular camera are used for carrying out three-dimensional reconstruction to obtain three-dimensional coordinates and pose data of the workpiece, and a six-axis industrial robot is guided to grab. The speed and the degree of accuracy that the work piece was seeked have been guaranteed on the one hand, and on the other hand six industrial robot that move replaces artifically, reach unmanned purpose, have improved work efficiency that work piece discernment and snatched, have satisfied the requirement of future industrial production line intellectuality, high efficiency and reliability.
Drawings
Fig. 1 is a schematic structural diagram of a robot gripping system according to an embodiment of the present invention;
fig. 2 is a flowchart of a control method of a robot gripping system according to an embodiment of the present invention.
Detailed Description
In order to make the technical solutions of the present invention better understood by those skilled in the art, the following detailed description of the present invention is made with reference to the accompanying drawings.
Examples
As shown in fig. 1, a robot gripping system comprises a six-axis industrial robot 1, a clamping jaw 2, an air pump 3, two identical monocular cameras 4, a working platform and an upper computer controller 5;
the clamping jaw 2 is arranged at the tail end of the six-axis industrial robot 1; the air pump 3 is connected to the six-axis industrial robot 1, the air supply function is realized through IO control of the six-axis industrial robot 1, and the air pump 3 is used for controlling opening and closing of the clamping jaw 2; two monocular cameras 4 which are identical in one mode are arranged on a flange plate of the six-axis industrial robot 1 in parallel through a clamp; the working platform is used for placing a workpiece to be grabbed; the two monocular cameras 4 are in communication connection with the upper computer controller 5 through a gigabit Ethernet respectively, and the control box of the six-axis industrial robot 1 is in communication connection with the upper computer controller 5 through an Ethernet.
An object identification and positioning module is pre-installed in the upper computer controller 5. The module uses a deep learning algorithm to extract and train on various postures of the workpiece, establishing a training model; it receives the images shot by the two monocular cameras, recognizes and positions the workpiece in the images through the training model, converts the pixel coordinates of the workpiece into three-dimensional coordinates in the robot base coordinate system by combining the internal and external parameter calibration results and hand-eye calibration results of the two monocular cameras, and finally sends an instruction to the six-axis industrial robot 1 to grab the workpiece and move it to the target.
Further, the clamping jaw 2 is mounted at the end of the six-axis industrial robot 1 through a connecting flange and is connected with the air pump 3.
Further, the connecting flange is of a double-layer structure: the upper (first) layer connects to the rotating wrist at the end of the six-axis industrial robot 1, and the lower (second) layer connects to the clamping jaw 2.
Further, the monocular camera 4 is a MindVision monocular industrial camera.
Further, the photographing position of the two monocular cameras 4: the cameras should be neither too far from nor too close to the workpiece, preferably 50 cm to 80 cm above it.
Based on a machine vision system, the invention combines an industrial grabbing robot with cameras: the position of the workpiece is found accurately and quickly through a deep learning algorithm, the three-dimensional position and depth information of the workpiece is obtained through visual analysis and calculation, and the six-axis industrial robot is then guided to grab. This improves grabbing accuracy, replaces manual labor with the six-axis industrial robot to achieve operation without human participation, greatly improves the efficiency of grabbing and sorting, and meets the requirements of intelligence, safety, and reliability for future industrial production lines.
As shown in fig. 2, a control method of a robot gripping system includes the following steps:
step one, monocular camera calibration: a plane template with circular features is selected as the calibration reference, and each monocular camera is calibrated to obtain its internal parameters, including the camera's distortion coefficients, focal length, and the width and height of a single pixel;
step two, binocular camera calibration: binocular calibration is performed on the two monocular cameras mounted in parallel to obtain the cameras' external parameters, including the transformation matrix between the two monocular cameras and the distance between their optical centers;
step three, hand-eye calibration: hand-eye calibration is performed between the two monocular cameras mounted at the end of the six-axis industrial robot and the robot itself, to obtain the transformation matrix relating coordinates in the cameras' images to the base coordinate system of the six-axis industrial robot;
step four, workpiece image database establishment and network model training: a large number of pictures of the workpiece are taken with a handheld industrial camera, capturing the workpiece in as many different states and at as many different heights as possible; the workpiece images are labeled and trained to obtain a training model, through which the workpiece can be found in a new picture and its labeling information displayed.
Step five, workpiece image acquisition: after the upper computer controller resets the six-axis industrial robot to its reset point, the robot is started and carries the two monocular cameras to the photographing point above the working platform in the photographing area; once in place, the robot arm sends feedback, and the upper computer controller commands the left and right monocular cameras to photograph the workpiece and acquire workpiece images;
Step six, image processing and workpiece positioning: the upper computer controller receives the images shot by the two monocular cameras; through the training model obtained in step four, the workpiece is found quickly and accurately in both the left and right camera images and framed; features of the framed workpiece are then extracted from both images, yielding the workpiece's key points; the key points are three-dimensionally reconstructed using the hand-eye calibration and camera calibration results to obtain their three-dimensional coordinates in the six-axis industrial robot base coordinate system, and these coordinates are sent to the six-axis industrial robot;
Step seven, workpiece grabbing: the upper computer controller sends a command to the six-axis industrial robot; according to the received three-dimensional coordinates of the workpiece, the robot moves the clamping jaw directly above the workpiece, the air pump opens the jaw, the jaw descends a suitable distance and closes to grip the workpiece, and the gripped workpiece is placed at the target location.
Step one is monocular camera calibration: the two monocular cameras are first mounted on the clamp at the end of the six-axis industrial robot, the robot arm is placed in its working photographing pose, and monocular calibration is then performed on each camera in that pose. Throughout the calibration, the camera is kept still and the pose of the calibration plate is changed. Because high precision is required on the workpiece, and the camera exhibits distortion from its inherent characteristics as well as errors introduced during installation, the camera's distortion coefficients and accurate internal parameters must be obtained through calibration. Since the internal parameters and distortion coefficients are inherent properties of the camera, calibration needs to be performed only once.
Monocular camera calibration adopts the Zhang Zhengyou camera calibration method, specifically: firstly, the robot is moved so that the axis of the monocular camera is perpendicular to the working platform and centered above it; a plane template with circular features is selected as the calibration reference, and pictures of the calibration template are taken with the camera at different angles; corner detection is performed on the feature points in the images to obtain the pixel coordinates of the calibration board corner points, and the object coordinates of those corner points are calculated from the known size of the calibration pattern and the origin of the world coordinate system; the transformation matrix is solved from the relation between the physical coordinate values and the pixel coordinate values, and the camera's internal parameters and distortion coefficients are then obtained.
Step two is binocular camera calibration: the two monocular cameras are mounted in parallel on the clamp at the end of the six-axis industrial robot, and the mechanical arm is placed in its working photographing pose. Then, combining the internal parameters of the two cameras obtained by monocular calibration, the external parameters of the two cameras, i.e., their relative transformation matrix, are calculated. Binocular camera calibration also adopts the Zhang Zhengyou calibration algorithm.
Step three is hand-eye calibration: the two monocular cameras are mounted in parallel on the clamp at the end of the six-axis industrial robot, and the mechanical arm is placed in its working photographing pose; the preparation is the same as for binocular camera calibration. Hand-eye calibration differs from binocular calibration, however: the calibration plate is kept still, the mechanical arm is moved so that the cameras photograph the stationary plate from different poses, and the x, y, z, rx, ry, rz readings on the six-axis industrial robot teach pendant are recorded for each shot. The essence of hand-eye calibration is to obtain the transformation matrix of the camera coordinate system relative to the six-axis industrial robot base coordinate system.
The whole hand-eye calibration calculation involves four transformation matrices: the transformation matrix from the six-axis industrial robot base coordinate system to the calibration plate coordinate system, T_cal←base; the transformation matrix from the calibration plate coordinate system to the camera coordinate system, T_cam←cal; the transformation matrix from the six-axis industrial robot end coordinate system to the camera coordinate system, T_cam←end; and the transformation matrix from the six-axis industrial robot end coordinate system to the six-axis industrial robot base coordinate system, T_base←end.
For each image taken by the camera, the chain of transformations gives the calculation equation
T_cam←end = T_cam←cal · T_cal←base · T_base←end.
T_cam←cal is solved directly from the captured calibration plate image together with the internal parameters of the camera; T_base←end is read directly from the teach pendant of the six-axis industrial robot; T_cal←base is unknown but, because the calibration plate has not moved, is the same for every image; T_cam←end is also unknown but is a fixed value, because the camera is mounted eye-in-hand.
Writing the calculation equation for each image and solving over the equations of multiple images yields the values of T_cal←base and T_cam←end. The equation
T_cam←base = T_cam←end · (T_base←end)^(-1)
then gives the transformation matrix from the six-axis industrial robot base coordinate system to the camera coordinate system, T_cam←base. With this matrix, the position of a point in each image taken by the camera can be converted into three-dimensional coordinates relative to the six-axis industrial robot base coordinate system, driving the robot to perform the grabbing operation.
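As a final sketch of this last conversion (the transform and point values below are invented purely for illustration): given the solved base-to-camera matrix, a key point reconstructed in the camera frame is carried into the robot base frame by the inverse transform, and that base-frame coordinate is what the grab command uses.

```python
import numpy as np

# Assumed hand-eye result: here a pure translation, as if the camera origin
# sat 800 mm above the base origin with no rotation (illustrative only).
T_cam_from_base = np.eye(4)
T_cam_from_base[:3, 3] = [0.0, 0.0, -800.0]

# A reconstructed key point in the camera frame (homogeneous, mm):
p_cam = np.array([36.0, 12.0, 600.0, 1.0])

# Same point expressed in the robot base frame:
p_base = np.linalg.inv(T_cam_from_base) @ p_cam
```

In the real system the matrix comes from the hand-eye solve above and generally contains a rotation as well; the inversion-and-multiply step is unchanged.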
The above-mentioned embodiments are intended to illustrate the objects, technical solutions and advantages of the present invention, and it should be understood that the above-mentioned embodiments are only exemplary embodiments of the present invention, and are not intended to limit the present invention, and any modifications, equivalents, improvements and the like made within the spirit and principle of the present invention should be included in the protection scope of the present invention.

Claims (10)

1. A robot grabbing system, characterized by comprising a six-axis industrial robot, a clamping jaw, an air pump, two identical monocular cameras, a working platform, and an upper computer controller;
the clamping jaw is mounted at the end of the six-axis industrial robot; the air pump is connected to the six-axis industrial robot and controls the opening and closing of the clamping jaw; the two identical monocular cameras are mounted in parallel on a flange of the six-axis industrial robot by a clamp; the working platform is used for placing the workpiece to be grabbed; the two monocular cameras are each in communication with the upper computer controller over gigabit Ethernet, and the control box of the six-axis industrial robot communicates with the upper computer controller over Ethernet;
an object identification and positioning module is pre-installed in the upper computer controller; the object identification and positioning module uses a deep learning algorithm to extract and train on various postures of the workpiece and establishes a training model; the object identification and positioning module receives the images shot by the two monocular cameras, identifies the workpiece in the images through the training model, positions the workpiece, converts the pixel coordinates of the workpiece into three-dimensional coordinates in the robot base coordinate system by combining the intrinsic and extrinsic calibration results and the hand-eye calibration results of the two monocular cameras, and finally sends an instruction to the six-axis industrial robot to grab the workpiece and move it to a target position.
2. The robot grabbing system according to claim 1, wherein said clamping jaw is mounted at the tail end of the six-axis industrial robot through a connecting flange and is connected to the air pump.
3. The robot grabbing system according to claim 2, wherein said connecting flange is of a double-layer construction, a first layer being attached to the rotating end of the six-axis industrial robot and a second layer being attached to the clamping jaw.
4. A robotic grasping system according to claim 1, wherein said monocular camera is a midwegian monocular industrial camera.
5. The robot grabbing system according to claim 1, wherein the two monocular cameras take pictures from 50 cm to 80 cm above the workpiece.
6. A method of controlling the robot grabbing system of claim 1, characterized by comprising the following steps:
step one, monocular camera calibration: selecting a plane template with circular features as the calibration reference object, and calibrating each monocular camera to obtain the internal reference information of the camera, the internal reference information comprising the distortion coefficients of the camera, the focal length of the camera, and the width and height of a single pixel;
step two, binocular camera calibration: performing binocular calibration on the two monocular cameras installed in parallel to obtain the external parameters of the cameras, the external parameters comprising the conversion matrix between the two monocular cameras and the optical center distance of the two monocular cameras;
step three, hand-eye calibration: performing hand-eye calibration between each of the two monocular cameras installed at the tail end of the six-axis industrial robot and the six-axis industrial robot, to obtain the conversion matrix from coordinates in the camera images to the six-axis industrial robot base coordinate system;
step four, establishing a workpiece image database and training the network model: shooting images of the workpiece in different states and at different heights, labelling the workpiece images and training on them to obtain a training model, such that the workpiece can be found in a workpiece image through the training model and its labelling information displayed;
step five, acquiring workpiece images: after the upper computer controller controls the six-axis industrial robot to return to its reset point, the six-axis industrial robot drives the two monocular cameras to move to the photographing point above the working platform in the photographing area, and the left and right monocular cameras photograph the workpiece to acquire workpiece images;
step six, image processing and determining the position of the workpiece: the upper computer controller receives the images shot by the two monocular cameras, finds the workpiece in the images through the training model obtained in step four, and frames the workpiece in the images; the features of the workpiece framed in each image are extracted, and the key points of the workpiece are obtained; the key points in the images are three-dimensionally reconstructed through the hand-eye calibration and camera calibration results to obtain the three-dimensional coordinates of the workpiece key points in the six-axis industrial robot base coordinate system;
step seven, grabbing the workpiece: the upper computer controller sends a command to the six-axis industrial robot, and the six-axis industrial robot controls the clamping jaw to grab the workpiece according to the received three-dimensional coordinates of the workpiece.
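Steps five to seven can be sketched end to end with numpy. All numeric values below (focal length, baseline, pixel matches, camera pose) are illustrative assumptions, not figures from the patent; the depth comes from the standard stereo relation Z = f·b/d, and the hand-eye result then maps the camera-frame point into the base frame:

```python
import numpy as np

# Illustrative camera parameters (assumptions, not values from the patent):
fx = fy = 1000.0          # focal length in pixels
cx, cy = 640.0, 480.0     # principal point
baseline = 0.06           # optical-center distance of the two cameras, metres

# Matched pixel of one workpiece keypoint in the left/right images
u_left, v_left = 700.0, 500.0
u_right = 600.0
disparity = u_left - u_right            # 100 px

# Depth from the stereo relation Z = f * b / d
Z = fx * baseline / disparity           # 0.6 m
# Back-project the left pixel through the pinhole model
X = (u_left - cx) * Z / fx
Y = (v_left - cy) * Z / fy
p_cam = np.array([X, Y, Z, 1.0])

# Hand-eye result: camera pose in the robot base frame (made-up here,
# pure translation for clarity)
T_base_cam = np.eye(4)
T_base_cam[:3, 3] = [0.3, 0.0, 0.8]

p_base = T_base_cam @ p_cam             # grasp target in the base frame
print(p_base[:3])
```

The three-dimensional coordinates printed at the end are what step seven would send to the robot controller as the grab target.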
7. The method for controlling the robot grabbing system according to claim 6, wherein the monocular camera calibration of step one adopts Zhang Zhengyou's camera calibration method and comprises the following steps:
firstly, moving the robot so that the axis of the monocular camera is perpendicular to the working platform and centered over it; selecting a plane template with circular features as the calibration reference object, and then shooting pictures of the calibration template at different angles with the camera;
performing corner detection on the feature points in the images to obtain the pixel coordinate values of the calibration board corners, and calculating the physical coordinate values of the calibration board corners from the known size of the pattern and the origin of the world coordinate system;
solving the conversion matrix from the relation between the physical coordinate values and the pixel coordinate values, and then solving for the internal parameters and distortion coefficients of the camera.
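The "conversion matrix" solved in the last step is, per view, a planar homography: in Zhang's method each picture of the flat target gives a homography H = K[r1 r2 t] between board coordinates and pixels, and the intrinsics K are then recovered from several such homographies. A minimal numpy sketch of the homography estimation by direct linear transform, using synthetic correspondences (the intrinsics, grid pitch and board depth below are assumed values for the demo):

```python
import numpy as np

def dlt_homography(obj_xy, img_uv):
    """Estimate the 3x3 homography mapping planar board points (x, y)
    to pixels (u, v): stack two linear constraints per correspondence
    and take the SVD null vector."""
    rows = []
    for (x, y), (u, v) in zip(obj_xy, img_uv):
        rows.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        rows.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    A = np.asarray(rows)
    _, _, Vt = np.linalg.svd(A)
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]

# Synthetic view: a fronto-parallel board on a 30 mm grid, seen by a
# pinhole camera with fx = fy = 800, cx = cy = 320, at 2 m depth.
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 320.0],
              [0.0, 0.0, 1.0]])
obj_xy = [(x * 0.03, y * 0.03) for x in range(4) for y in range(4)]
img_uv = []
for x, y in obj_xy:
    p = K @ np.array([x, y, 2.0])       # board plane placed at Z = 2
    img_uv.append((p[0] / p[2], p[1] / p[2]))

H = dlt_homography(obj_xy, img_uv)
# Check: H maps a board corner onto its measured pixel
q = H @ np.array([obj_xy[5][0], obj_xy[5][1], 1.0])
print(q[:2] / q[2], img_uv[5])
```

With noise-free synthetic points the recovered H reproduces the pixels exactly; on real corner detections the same system is solved in a least-squares sense, followed by the intrinsic and distortion estimation the claim describes.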
8. The method for controlling the robot grabbing system according to claim 6, wherein the binocular camera calibration of step two comprises: mounting the two monocular cameras in parallel on the clamp at the tail end of the six-axis industrial robot, with the mechanical arm placed in its working photographing pose; then, combining the internal reference coefficients of the two cameras obtained by monocular calibration, calculating the external reference coefficients of the two cameras, namely the relative conversion matrix between the two cameras.
9. The method for controlling the robot grabbing system according to claim 6, wherein the hand-eye calibration of step three comprises: mounting the two monocular cameras in parallel on the clamp at the tail end of the six-axis industrial robot, with the mechanical arm placed in its working photographing pose; keeping the calibration plate still, moving the mechanical arm so that the cameras shoot pictures of the stationary calibration plate from different poses, and recording the x, y, z, rx, ry and rz data from the six-axis industrial robot teach pendant at each shot; and obtaining the conversion matrix of the camera coordinate system relative to the six-axis industrial robot base coordinate system.
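The recorded x, y, z, rx, ry, rz readings must be turned into a 4x4 end-effector pose before they enter the calibration equations. A minimal numpy sketch; note that the Euler convention below (fixed-axis rotations composed as Rz·Ry·Rx) is an assumption for illustration, since controllers differ and the actual convention must be taken from the robot's manual:

```python
import numpy as np

def pose_to_T(x, y, z, rx, ry, rz):
    """Build the 4x4 end-effector pose from teach-pendant readings.
    rx, ry, rz are assumed here to be fixed-axis rotations composed
    as Rz @ Ry @ Rx; real controllers vary, so check the robot manual."""
    cx_, sx_ = np.cos(rx), np.sin(rx)
    cy_, sy_ = np.cos(ry), np.sin(ry)
    cz_, sz_ = np.cos(rz), np.sin(rz)
    Rx = np.array([[1, 0, 0], [0, cx_, -sx_], [0, sx_, cx_]])
    Ry = np.array([[cy_, 0, sy_], [0, 1, 0], [-sy_, 0, cy_]])
    Rz = np.array([[cz_, -sz_, 0], [sz_, cz_, 0], [0, 0, 1]])
    T = np.eye(4)
    T[:3, :3] = Rz @ Ry @ Rx
    T[:3, 3] = [x, y, z]
    return T

# Example reading (made-up values): 90 degrees about z only
T = pose_to_T(0.5, 0.1, 0.3, 0.0, 0.0, np.pi / 2)
print(np.round(T, 3))
```

One such matrix is built per shot; together with the per-image plate extrinsics they form the equation system of claim 10.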
10. The method for controlling the robot grabbing system according to claim 9, wherein the hand-eye calibration calculation involves a total of four transformation matrices: the transformation matrix \(T^{base}_{plate}\) from the six-axis industrial robot base coordinate system to the calibration plate coordinate system, the transformation matrix \(T^{cam}_{plate}\) from the calibration plate coordinate system to the camera coordinate system, the transformation matrix \(T^{end}_{cam}\) from the six-axis industrial robot end coordinate system to the camera coordinate system, and the transformation matrix \(T^{base}_{end}\) from the six-axis industrial robot end coordinate system to the six-axis industrial robot base coordinate system;
for each image taken by the camera, the calculation equation applies:
\(T^{base}_{plate} = T^{base}_{end} \cdot T^{end}_{cam} \cdot T^{cam}_{plate}\);
\(T^{cam}_{plate}\) is solved directly from the shot calibration plate image together with the internal parameters of the camera;
\(T^{base}_{end}\) is read directly from the teach pendant of the six-axis industrial robot;
\(T^{base}_{plate}\) is unknown, but since the calibration plate has not moved, \(T^{base}_{plate}\) is the same in every image;
\(T^{end}_{cam}\) is likewise a fixed value, since the cameras are rigidly mounted on the robot end;
the above calculation equation can be written for each image, and the equations from a plurality of images can then be solved for the values of \(T^{end}_{cam}\) and \(T^{base}_{plate}\);
through the equation \(T^{base}_{cam} = T^{base}_{end} \cdot T^{end}_{cam}\), the transformation matrix \(T^{base}_{cam}\) from the six-axis industrial robot base coordinate system to the camera coordinate system is solved.
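The constraint that claim 9 and claim 10 exploit can be checked numerically: with a fixed plate and a fixed end-to-camera transform, the chained product is identical for every robot pose. A minimal numpy sketch with made-up transforms (not values from the patent); a real solver, e.g. Tsai's method or OpenCV's calibrateHandEye, recovers the two fixed unknowns from exactly such equations:

```python
import numpy as np

def rot_z(a):
    """Rotation about z by angle a (radians)."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

def make_T(R, t):
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Fixed unknowns of the calibration (chosen arbitrarily for the demo):
T_base_plate = make_T(rot_z(0.3), [1.0, 0.2, 0.0])   # plate pose in base frame
T_end_cam = make_T(rot_z(-0.1), [0.0, 0.05, 0.1])    # camera pose in end frame

# Two robot poses (what the teach pendant would report):
poses = [make_T(rot_z(0.5), [0.4, 0.0, 0.5]),
         make_T(rot_z(1.1), [0.3, 0.2, 0.6])]

# What the camera would measure in each image: plate pose in camera frame,
# obtained by inverting the chain T_base_plate = T_base_end @ T_end_cam @ T_cam_plate
measured = [np.linalg.inv(T_be @ T_end_cam) @ T_base_plate for T_be in poses]

# The claim's consistency condition: the product is identical for every image
products = [T_be @ T_end_cam @ T_cp for T_be, T_cp in zip(poses, measured)]
print(np.allclose(products[0], products[1]))   # both equal T_base_plate
```

In practice the measured plate poses carry noise, so the system from many images is solved in a least-squares sense rather than satisfied exactly.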
CN202111358870.2A 2021-11-17 2021-11-17 Robot grabbing system and control method thereof Pending CN114055501A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111358870.2A CN114055501A (en) 2021-11-17 2021-11-17 Robot grabbing system and control method thereof


Publications (1)

Publication Number Publication Date
CN114055501A true CN114055501A (en) 2022-02-18

Family

ID=80272990

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111358870.2A Pending CN114055501A (en) 2021-11-17 2021-11-17 Robot grabbing system and control method thereof

Country Status (1)

Country Link
CN (1) CN114055501A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114932516A (en) * 2022-04-28 2022-08-23 珠海格力电器股份有限公司 Positioning and assembling system for motor shaft of air conditioner external unit and control method of positioning and assembling system
US11536621B2 (en) * 2020-03-31 2022-12-27 Toyota Research Institute, Inc. Methods and systems for calibrating deformable sensors using camera

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108109174A (en) * 2017-12-13 2018-06-01 上海电气集团股份有限公司 A kind of robot monocular bootstrap technique sorted at random for part at random and system
CN110509300A (en) * 2019-09-30 2019-11-29 河南埃尔森智能科技有限公司 Stirrup processing feeding control system and control method based on 3D vision guidance
CN111496770A (en) * 2020-04-09 2020-08-07 上海电机学院 Intelligent carrying mechanical arm system based on 3D vision and deep learning and use method
CN111923053A (en) * 2020-04-21 2020-11-13 广州里工实业有限公司 Industrial robot object grabbing teaching system and method based on depth vision
CN112045676A (en) * 2020-07-31 2020-12-08 广州中国科学院先进技术研究所 Method for grabbing transparent object by robot based on deep learning
CN113524194A (en) * 2021-04-28 2021-10-22 重庆理工大学 Target grabbing method of robot vision grabbing system based on multi-mode feature deep learning
WO2021217976A1 (en) * 2020-04-28 2021-11-04 平安科技(深圳)有限公司 Method and apparatus for controlling mechanical arm on basis of monocular visual positioning



Similar Documents

Publication Publication Date Title
CN109483554B (en) Robot dynamic grabbing method and system based on global and local visual semantics
CN107618030B (en) Robot dynamic tracking grabbing method and system based on vision
CN108399639B (en) Rapid automatic grabbing and placing method based on deep learning
CN110509300B (en) Steel hoop processing and feeding control system and control method based on three-dimensional visual guidance
CN113524194B (en) Target grabbing method of robot vision grabbing system based on multi-mode feature deep learning
CN111347411B (en) Two-arm cooperative robot three-dimensional visual recognition grabbing method based on deep learning
CN114055501A (en) Robot grabbing system and control method thereof
CN110580725A (en) Box sorting method and system based on RGB-D camera
CN110751691B (en) Automatic pipe fitting grabbing method based on binocular vision
CN111923053A (en) Industrial robot object grabbing teaching system and method based on depth vision
CN108748149B (en) Non-calibration mechanical arm grabbing method based on deep learning in complex environment
CN107009358A (en) A kind of unordered grabbing device of robot based on one camera and method
CN113146172B (en) Multi-vision-based detection and assembly system and method
CN110980276B (en) Method for implementing automatic casting blanking by three-dimensional vision in cooperation with robot
CN111823223A (en) Robot arm grabbing control system and method based on intelligent stereoscopic vision
CN110909644A (en) Method and system for adjusting grabbing posture of mechanical arm end effector based on reinforcement learning
CN112775959A (en) Method and system for determining grabbing pose of manipulator and storage medium
CN111267094A (en) Workpiece positioning and grabbing method based on binocular vision
CN114074331A (en) Disordered grabbing method based on vision and robot
Lin et al. Vision based object grasping of industrial manipulator
CN113664826A (en) Robot grabbing method and system in unknown environment
CN116749198A (en) Binocular stereoscopic vision-based mechanical arm grabbing method
CN108393676B (en) Model setting method for automatic makeup assembly
CN208020198U (en) A kind of intelligent grabbing robot based on RGBD
CN113500593B (en) Method for grabbing designated part of shaft workpiece for feeding

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination