CN111136656A - Method for automatically identifying and grabbing three-dimensional irregular object of robot - Google Patents

Method for automatically identifying and grabbing three-dimensional irregular object of robot

Info

Publication number
CN111136656A
CN111136656A (application CN201911346789.5A)
Authority
CN
China
Prior art keywords
robot
camera
point
grabbing
coordinate system
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201911346789.5A
Other languages
Chinese (zh)
Other versions
CN111136656B (en)
Inventor
杨锋 (Yang Feng)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhichang Technology Group Co.,Ltd.
Original Assignee
Shanghai Gene Automation Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Gene Automation Technology Co ltd filed Critical Shanghai Gene Automation Technology Co ltd
Priority to CN201911346789.5A priority Critical patent/CN111136656B/en
Publication of CN111136656A publication Critical patent/CN111136656A/en
Application granted granted Critical
Publication of CN111136656B publication Critical patent/CN111136656B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J 9/00 Programme-controlled manipulators
    • B25J 9/16 Programme controls
    • B25J 9/1602 Programme controls characterised by the control system, structure, architecture
    • B25J 9/161 Hardware, e.g. neural networks, fuzzy logic, interfaces, processor
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J 19/00 Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
    • B25J 19/02 Sensing devices
    • B25J 19/021 Optical sensing devices
    • B25J 19/023 Optical sensing devices including video camera means
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J 9/00 Programme-controlled manipulators
    • B25J 9/16 Programme controls
    • B25J 9/1656 Programme controls characterised by programming, planning systems for manipulators
    • B25J 9/1661 Programme controls characterised by programming, planning systems for manipulators characterised by task planning, object-oriented languages
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J 9/00 Programme-controlled manipulators
    • B25J 9/16 Programme controls
    • B25J 9/1694 Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
    • B25J 9/1697 Vision controlled systems
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 Pattern recognition
    • G06F 18/20 Analysing
    • G06F 18/22 Matching criteria, e.g. proximity measures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/70 Determining position or orientation of objects or cameras
    • G06T 7/73 Determining position or orientation of objects or cameras using feature-based methods

Abstract

The invention provides a method for a robot to automatically identify and grab a three-dimensional irregular object, comprising the following specific operations: early-stage data collection, data testing, data analysis, moving the robot to the grabbing pose, and grabbing the lock. The robot provided by the invention adds automatic identification and grabbing of three-dimensional irregular objects. Surface information of the three-dimensional irregular object is obtained through a line laser 3D camera mounted on the flange at the end of the robot: when a workpiece reaches the scanning area of the robot, the robot carries the camera in a linear motion to scan the product surface and acquire the surface information of the product. Products of different specifications are scanned one by one to obtain a point cloud library of all the products, and for each product specification the grabbing points are taught through the robot and recorded; the currently scanned workpiece information is then converted into the current robot grabbing position point through formula (2-5), so that the workpiece is grabbed.

Description

Method for automatically identifying and grabbing three-dimensional irregular object of robot
Technical Field
The invention relates to the field of industrial automation, in particular to a method for automatically identifying and grabbing a three-dimensional irregular object by a robot.
Background
Under the background of the national intelligent manufacturing strategy, and particularly with the rapid development of the logistics field in recent years, the demand for automatic sorting, identification and grabbing of objects keeps growing. The mainstream automatic identification and grabbing robot systems currently on the market mainly extract two-dimensional planar features from the object surface and perform template matching on those features, so they can only automatically identify and grab regular objects, such as express boxes; products of this type cannot identify and grab irregular three-dimensional objects. Based on the requirement raised by customers for a robot that automatically identifies and grabs irregular objects (such as locks for containers), the present method for automatic identification and grabbing of three-dimensional irregular objects by a robot is provided.
Disclosure of Invention
To address this technical problem, the invention provides a method for a robot to automatically identify and grab a three-dimensional irregular object, which comprises the following specific operations:
firstly, early-stage data collection: the client software acquires images of each lock, extracts the key features in the lock, and at the same time establishes a point cloud library of each product, which is stored in the client software;
secondly, data testing: the camera is installed on the flange at the end of the robot and confirmed to be mounted at the specified position, and a workpiece is placed on the workbench; when the workpiece reaches the scanning area of the robot, the robot carries the camera in a linear motion to scan the product surface and acquire the surface information of the product;
thirdly, data analysis: when the robot starts to identify and grab the lock, it scans the lock from the appointed position with the line laser 3-dimensional camera held at its end to obtain an image of the lock, which is stored in the client software; the client software matches the image against its stored point cloud library to determine the product model; if the product model exists, the client software matches the key features in the image against the point cloud library to determine the pose of the lock grabbing point, and the client sends the pose to the robot through Socket communication (a minimal sketch of this hand-off is given below);
and fourthly, the robot moves to the grabbing point position, and the grabbing of the lock is realized.
The camera is a line laser 3-dimensional camera.
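By way of illustration, the following is a minimal sketch of the Socket hand-off in the data analysis step. The patent specifies Socket communication but not a message format, host, port or units, so all of those are assumptions here:

```python
# Minimal sketch of the client-to-robot pose hand-off over TCP.
# The message layout (comma-separated x, y, z, rx, ry, rz plus newline),
# host, port and units are assumptions; the patent only states that the
# pose is sent via Socket communication.
import socket

def send_grab_pose(pose, host="192.168.0.10", port=5000):
    """pose: (x, y, z, rx, ry, rz), e.g. millimetres and degrees (assumed)."""
    msg = ",".join(f"{v:.3f}" for v in pose) + "\n"
    with socket.create_connection((host, port), timeout=5.0) as conn:
        conn.sendall(msg.encode("ascii"))
        return conn.recv(64).decode("ascii").strip()  # robot acknowledgement, if any

# Example call once the grab pose has been computed:
# ack = send_grab_pose((512.3, -104.8, 263.1, 179.9, 0.4, -92.7))
```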
The method for determining the specified position of the camera comprises the following steps:
Robot camera calibration is mainly used to determine the relation between the camera coordinate system and the position of the flange at the robot end, i.e. to obtain the hand-eye transform ${}^{F}T_{C}$. Throughout, ${}^{A}T_{B}$ denotes the homogeneous transform of frame B expressed in frame A, where W is the robot world (base) coordinate system, F the end flange (tool0), P the calibration plate workpiece frame, C the camera and G the grab point.
Step1, printing a visual calibration plate, wherein the plate pattern is generally supplied by the camera manufacturer and printed according to the use requirements;
Step2, placing the calibration plate at a flat and open position and ensuring that the robot can reach it in position and posture;
Step3, mounting a tool with a needle point at the end of the robot and teaching it, so as to establish the relation ${}^{W}T_{tool}$ between the tool coordinate system and the robot world coordinate system; the tool is mainly used for calibrating the calibration plate;
Step4, teaching the workpiece coordinate system of the calibration plate using the needle tip mounted at the robot end, wherein the center point of the coordinate axes is the origin of the workpiece coordinate system and the directions of the coordinate system are as shown in FIG. 2; note that the current workpiece coordinate system is ${}^{W}T_{P}$;
Step5, moving the camera to the position directly above the calibration plate so that the laser line projected by the camera is aligned with the X axis of the calibration plate, the distance from the camera to the plate is consistent along the direction of the laser line, and the distance is also consistent along the length direction of the camera, namely the direction perpendicular to the laser line; in other words, the mounting plane of the camera is parallel to the plane of the calibration plate, at which point the camera coordinate system is aligned with the coordinate system of the calibration plate;
Step6, recording the position of the robot flange (tool0) aligned in Step5, expressed in the workpiece coordinate system ${}^{W}T_{P}$, as the point pAlignMiddle, and from it calculating the pre-start position pAlignPreStart (pAlignPreStart.x = pAlignMiddle.x, pAlignPreStart.y = pAlignMiddle.y - 250) and the end position pAlignEnd (pAlignEnd.x = pAlignMiddle.x, pAlignEnd.y = pAlignMiddle.y + 250) of the next camera movement;
Step7, moving the camera laser line of the robot to the pAlignPreStart point (taking ${}^{W}T_{P}$ as the reference frame, likewise below), then starting the robot to move in a straight line from the pAlignPreStart point to the pAlignEnd point at a speed of 100 mm/s; note that the robot accelerates from rest before reaching the constant speed of 100 mm/s, so the robot is made to send the signal that triggers the line laser camera to take the image only after it has travelled 50 mm in a straight line (by which point the acceleration is complete); the trigger point of the signal is the start position of the Y coordinate of the acquired image, and the position of the flange (tool0) at this point is recorded as pAlignStart, with pAlignStart.y = pAlignMiddle.y - 200; note that the Y-direction image length of the line laser camera is set to 350 mm, as shown in FIG. 3;
Step8, after the image is taken, checking through the software provided with the line laser camera whether the Z values at several arbitrary positions of the scanned image are consistent, so as to confirm whether the imaging plane of the camera is parallel to the plane of the calibration plate (if the deviation is too large, e.g. close to 1 mm or more, the planes must be adjusted parallel and the operation resumed from Step5); then recording the position of the coordinate origin of the calibration plate in the image coordinate system, with coordinate values denoted X, Y and Z; because the camera was kept aligned with the coordinate directions of the calibration plate while the image was taken, the Euler angles of the camera's rotation about the X, Y and Z axes are all 0, thereby obtaining ${}^{C}T_{P}$;
Step9, expressing the point locations and coordinate systems as homogeneous transforms:
${}^{W}T_{P}$ represents the relation between the center point of the calibration plate and the robot world coordinate system;
${}^{P}T_{F} = ({}^{W}T_{P})^{-1}\,{}^{W}T_{F}$ represents the relation between the flange at the robot end and the center point of the calibration plate;
${}^{C}T_{P}$ represents the relation between the calibration plate and the vision camera;
${}^{F}T_{C} = ({}^{P}T_{F})^{-1}\,({}^{C}T_{P})^{-1}$
The ${}^{F}T_{C}$ above is the tool coordinate system of the camera, i.e. the conversion relation between the camera coordinate system and the coordinate system of the flange center point.
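The Step9 chain can be checked numerically. The following is a minimal sketch assuming poses are expressed as 4x4 homogeneous matrices built from XYZ translations (mm) and ZYX Euler angles (degrees); the Euler convention and all numeric values are assumptions for illustration, not data from the patent:

```python
# Sketch of the Step9 hand-eye chain:
#   T_plate_flange  = inv(T_world_plate) @ T_world_flange
#   T_flange_camera = inv(T_plate_flange) @ inv(T_camera_plate)
# The Euler convention (ZYX, degrees) and the pose values are illustrative assumptions.
import numpy as np
from scipy.spatial.transform import Rotation

def pose_to_T(x, y, z, rx, ry, rz):
    """4x4 homogeneous transform from a translation (mm) and Euler angles (deg)."""
    T = np.eye(4)
    T[:3, :3] = Rotation.from_euler("ZYX", [rz, ry, rx], degrees=True).as_matrix()
    T[:3, 3] = [x, y, z]
    return T

T_world_plate  = pose_to_T(800.0, 150.0, 20.0, 0.0, 0.0, 30.0)    # plate frame taught in Step4
T_world_flange = pose_to_T(780.0, 140.0, 350.0, 180.0, 0.0, 30.0) # flange (tool0) at pAlignMiddle
T_camera_plate = pose_to_T(-12.4, 175.0, 210.5, 0.0, 0.0, 0.0)    # plate origin in the image frame (Step8); Euler angles all 0

T_plate_flange  = np.linalg.inv(T_world_plate) @ T_world_flange
T_flange_camera = np.linalg.inv(T_plate_flange) @ np.linalg.inv(T_camera_plate)
print(np.round(T_flange_camera, 3))  # the camera "tool" frame relative to the end flange
```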
The method for determining the pose of the robot grabbing point rests on obtaining the transformation matrix between the image grab point and the template grab point, so that the robot grab point taught under the template is converted into the robot grab point under the currently scanned image. The robot scans the target workpiece to obtain the target features of the required image; comparing these features with the features in the template yields a transformation matrix H, and, given the taught grab point ${}^{W}T_{G}^{tpl}$ of the template position, the robot grab point ${}^{W}T_{G}^{cur}$ under the current image is established. The calculation is as follows:
Step1: the pose of the camera origin in the base coordinate system is obtained by the following formula:
${}^{W}T_{C} = {}^{W}T_{F}\,{}^{F}T_{C}$    (2-1)
where ${}^{W}T_{F}$ is the flange pose at the point where the camera starts shooting, i.e. the camera origin, and ${}^{F}T_{C}$ is the calibrated transform between the camera origin and the tool-end flange;
Step2: solving the grabbing pose of the robot at position 2; combining formula (2-1) with formulas (2-2) and (2-3) below gives formula (2-5), namely the grabbing point of the robot at position 2:
${}^{W}T_{G}^{tpl} = {}^{W}T_{C}^{tpl}\,{}^{C}T_{G}^{tpl}$    (2-2)
${}^{C}T_{G}^{cur} = H\,{}^{C}T_{G}^{tpl}$    (2-3)
${}^{W}T_{G}^{cur} = {}^{W}T_{C}^{cur}\,{}^{C}T_{G}^{cur}$    (2-4)
${}^{W}T_{G}^{cur} = {}^{W}T_{C}^{cur}\,H\,({}^{W}T_{C}^{tpl})^{-1}\,{}^{W}T_{G}^{tpl}$    (2-5)
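Under the same conventions, formulas (2-1) through (2-5) compose as below. This is a sketch, with T_flange_camera taken from the calibration sketch above and H supplied by the feature matching; all inputs are assumed to be 4x4 homogeneous matrices:

```python
# Sketch of formulas (2-1)-(2-5): transfer the taught template grab point to the
# grab point under the current scan. All inputs are 4x4 homogeneous matrices;
# H is the template-to-current transform produced by feature matching.
import numpy as np

def current_grab_pose(T_world_flange_tpl, T_world_flange_cur,
                      T_flange_camera, T_world_grab_tpl, H):
    T_world_cam_tpl = T_world_flange_tpl @ T_flange_camera              # (2-1), template scan
    T_world_cam_cur = T_world_flange_cur @ T_flange_camera              # (2-1), current scan
    T_cam_grab_tpl = np.linalg.inv(T_world_cam_tpl) @ T_world_grab_tpl  # (2-2) rearranged
    T_cam_grab_cur = H @ T_cam_grab_tpl                                 # (2-3)
    return T_world_cam_cur @ T_cam_grab_cur                             # (2-4)/(2-5)
```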
the invention has the beneficial effects that: the robot is different from the automatic identification and grabbing technology of a robot on the existing market for a two-dimensional plane object, the automatic identification and grabbing of a three-dimensional irregular object are added, the surface information of the three-dimensional irregular object is obtained through a line laser 3D camera, the camera is installed on a flange plate at the tail end of the robot, when a workpiece reaches a scanning area of the robot, the robot carries the camera to perform linear motion to scan the surface of a product so as to obtain the surface information of the product, the products of different specifications are scanned one by one to obtain a point cloud library of the whole product, for each specification type of product, grabbing points are taught through the robot and recorded, and then current scanned workpiece information is converted into current robot grabbing position points through formulas 2-5, so that the workpiece is grabbed.
Drawings
Fig. 1 is a diagram of an automatic recognition and grabbing framework of a three-dimensional irregular object of the robot.
FIG. 2 is a calibration plate workpiece coordinate system setting according to the present invention.
FIG. 3 is a schematic view of camera calibration according to the present invention.
FIG. 4 is a schematic diagram of the present invention for establishing a robot grasp point.
FIG. 5 is a schematic flow chart of the present invention.
Fig. 6 illustrates several lock images and features of the present invention.
Detailed Description
The invention will be further explained with reference to the figures:
example 1
A method for automatically identifying and grabbing a three-dimensional irregular object of a robot comprises the following specific operation methods:
1. Early-stage data collection: the client software acquires images of each lock, extracts the key features in the lock, and at the same time establishes a point cloud library of each product, which is stored in the client software;
2. Data testing: the camera is installed on the flange at the end of the robot and confirmed to be mounted at the specified position, and a workpiece is placed on the workbench; when the workpiece reaches the scanning area of the robot, the robot carries the camera in a linear motion to scan the product surface and acquire the surface information of the product;
3. Data analysis: when the robot starts to identify and grab the lock, it scans the lock from the appointed position with the line laser 3-dimensional camera held at its end to obtain an image of the lock, which is stored in the client software; the client software matches the image against its stored point cloud library to determine the product model; if the product model exists, the client software matches the key features in the image against the point cloud library to determine the pose of the lock grabbing point, and the client sends the pose to the robot through Socket communication;
4. the robot moves to the grabbing point position, and the lock is grabbed.
Example 2
Robot camera calibration is mainly used to determine the relation between the camera coordinate system and the position of the flange at the robot end, i.e. to obtain the hand-eye transform ${}^{F}T_{C}$ (notation as above).
Step1, printing a visual calibration plate, wherein the plate pattern is generally supplied by the camera manufacturer and printed according to the use requirements;
Step2, placing the calibration plate at a flat and open position and ensuring that the robot can reach it in position and posture;
Step3, mounting a tool with a needle point at the end of the robot and teaching it, so as to establish the relation ${}^{W}T_{tool}$ between the tool coordinate system and the robot world coordinate system; the tool is mainly used for calibrating the calibration plate;
Step4, teaching the workpiece coordinate system of the calibration plate using the needle tip mounted at the robot end, wherein the center point of the coordinate axes is the origin of the workpiece coordinate system and the directions of the coordinate system are as shown in FIG. 2; note that the current workpiece coordinate system is ${}^{W}T_{P}$;
Step5, moving the camera to the position directly above the calibration plate so that the laser line projected by the camera is aligned with the X axis of the calibration plate, the distance from the camera to the plate is consistent along the direction of the laser line, and the distance is also consistent along the length direction of the camera, namely the direction perpendicular to the laser line; in other words, the mounting plane of the camera is parallel to the plane of the calibration plate, at which point the camera coordinate system is aligned with the coordinate system of the calibration plate;
Step6, recording the position of the robot flange (tool0) aligned in Step5, expressed in the workpiece coordinate system ${}^{W}T_{P}$, as the point pAlignMiddle, and from it calculating the pre-start position pAlignPreStart (pAlignPreStart.x = pAlignMiddle.x, pAlignPreStart.y = pAlignMiddle.y - 250) and the end position pAlignEnd (pAlignEnd.x = pAlignMiddle.x, pAlignEnd.y = pAlignMiddle.y + 250) of the next camera movement;
Step7, moving the camera laser line of the robot to the pAlignPreStart point (taking ${}^{W}T_{P}$ as the reference frame, likewise below), then starting the robot to move in a straight line from the pAlignPreStart point to the pAlignEnd point at a speed of 100 mm/s; note that the robot accelerates from rest before reaching the constant speed of 100 mm/s, so the robot is made to send the signal that triggers the line laser camera to take the image only after it has travelled 50 mm in a straight line (by which point the acceleration is complete); the trigger point of the signal is the start position of the Y coordinate of the acquired image, and the position of the flange (tool0) at this point is recorded as pAlignStart, with pAlignStart.y = pAlignMiddle.y - 200; note that the Y-direction image length of the line laser camera is set to 350 mm, as shown in FIG. 3;
Step8, after the image is taken, checking through the software provided with the line laser camera whether the Z values at several arbitrary positions of the scanned image are consistent, so as to confirm whether the imaging plane of the camera is parallel to the plane of the calibration plate (if the deviation is too large, e.g. close to 1 mm or more, the planes must be adjusted parallel and the operation resumed from Step5); then recording the position of the coordinate origin of the calibration plate in the image coordinate system, with coordinate values denoted X, Y and Z; because the camera was kept aligned with the coordinate directions of the calibration plate while the image was taken, the Euler angles of the camera's rotation about the X, Y and Z axes are all 0, thereby obtaining ${}^{C}T_{P}$;
Step9, expressing the point locations and coordinate systems as homogeneous transforms:
${}^{W}T_{P}$ represents the relation between the center point of the calibration plate and the robot world coordinate system;
${}^{P}T_{F} = ({}^{W}T_{P})^{-1}\,{}^{W}T_{F}$ represents the relation between the flange at the robot end and the center point of the calibration plate;
${}^{C}T_{P}$ represents the relation between the calibration plate and the vision camera;
${}^{F}T_{C} = ({}^{P}T_{F})^{-1}\,({}^{C}T_{P})^{-1}$
The ${}^{F}T_{C}$ above is the tool coordinate system of the camera, i.e. the conversion relation between the camera coordinate system and the coordinate system of the flange center point.
Example 3
The method for determining the positions of the robot grabbing points mainly consists of obtaining the transformation matrix between the image grab point and the template grab point, so that the robot grab point taught under the template is converted into the robot grab point under the currently scanned image. The robot scans the target workpiece to obtain the target features of the required image; comparing these features with the features in the template yields a transformation matrix H, and, given the taught grab point ${}^{W}T_{G}^{tpl}$ of the template position, the robot grab point ${}^{W}T_{G}^{cur}$ under the current image is established. The calculation is as follows:
Step1: the pose of the camera origin in the base coordinate system is obtained by the following formula:
${}^{W}T_{C} = {}^{W}T_{F}\,{}^{F}T_{C}$    (2-1)
where ${}^{W}T_{F}$ is the flange pose at the point where the camera starts shooting, i.e. the camera origin, and ${}^{F}T_{C}$ is the calibrated transform between the camera origin and the tool-end flange;
Step2: solving the grabbing pose of the robot at position 2; combining formula (2-1) with formulas (2-2) and (2-3) below gives formula (2-5), namely the grabbing point of the robot at position 2:
${}^{W}T_{G}^{tpl} = {}^{W}T_{C}^{tpl}\,{}^{C}T_{G}^{tpl}$    (2-2)
${}^{C}T_{G}^{cur} = H\,{}^{C}T_{G}^{tpl}$    (2-3)
${}^{W}T_{G}^{cur} = {}^{W}T_{C}^{cur}\,{}^{C}T_{G}^{cur}$    (2-4)
${}^{W}T_{G}^{cur} = {}^{W}T_{C}^{cur}\,H\,({}^{W}T_{C}^{tpl})^{-1}\,{}^{W}T_{G}^{tpl}$    (2-5)
example 4
When the invention is applied to locks for containers, the specific operation steps are described in detail as follows:
Step1 preparation 1:
Preparation 1 is mainly used for calibrating the relation between the camera coordinate system and the position of the flange at the end of the robot; the use and operation steps of this calibration are described in detail in Algorithm 1.
Step2 preparation 2:
Preparation 2 is mainly used for obtaining the surface information of each product, so as to form an irregular-product surface information template library and a robot grab point table. The specific operation process is as follows:
After receiving the product arrival information, the robot moves to the designated position with the camera and scans the designated work area in a linear motion to acquire the product surface information; the characteristic line and the midpoint of the characteristic line on the product surface are extracted according to the surface features of the product to form a product template, which is used to determine the grabbing position of the robot. The robot is then moved to the midpoint of the characteristic line on the product surface, with the posture of the robot gripper kept consistent with the direction of the characteristic line; this determines the grab point of the robot for the current product, which is recorded. These steps are repeated for products of different specifications and types, so as to establish the irregular-product surface information template library and the robot grab point table. The robot grab point is calculated as in Algorithm 2; a sketch of one possible feature-line extraction is given below.
Step3:
After the operations of Step1 and Step2 are completed, when a new product moves to the designated work area and an in-place signal is sent to the robot, the robot receives the product arrival information, moves to the designated position with the camera, and scans the designated work area in a linear motion to acquire the product surface information. Template matching is performed against the product surface information template library to determine whether the current product exists in the library; if it does not, the operator is notified to manually add a new product template, together with the robot grab point information, to the product template library. If the current product is already stored in the product template library, it is matched with the products in the library to establish the product type; after the type is confirmed, the characteristic line on the surface of that product type is extracted and matched with the template, thereby establishing the conversion matrix of the current product grab point.
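The template matching in Step3 amounts to rigid registration of the current scan against each template cloud. The sketch below uses Open3D ICP purely as an assumed stand-in (the patent does not name its matching algorithm); the fitness score decides whether the product exists in the library, and the resulting rigid transform plays the role of the conversion matrix H:

```python
# Sketch: match the current scan against the template library and return the
# best template with its rigid transform (playing the role of H). Open3D ICP
# is an assumed stand-in; the distance and fitness thresholds are illustrative.
import open3d as o3d

def match_against_library(scan, library, max_dist=2.0, min_fitness=0.8):
    """scan: o3d.geometry.PointCloud; library: dict of name -> template cloud."""
    best = None
    for name, template in library.items():
        result = o3d.pipelines.registration.registration_icp(
            scan, template, max_dist,
            estimation_method=o3d.pipelines.registration.TransformationEstimationPointToPoint())
        if result.fitness >= min_fitness and (best is None or result.fitness > best[2]):
            best = (name, result.transformation, result.fitness)
    return best  # None means the product is not in the library yet
```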
Step4, establishing the robot grab point pose under the current image according to Algorithm 2.
The foregoing is only a preferred embodiment of the present invention; it should be noted that those skilled in the art can make several modifications without departing from the principle of the present invention, and these modifications should also be construed as falling within the protection scope of the present invention.

Claims (4)

1. A method for automatically identifying and grabbing a three-dimensional irregular object of a robot comprises the following specific operation methods:
firstly, early-stage data collection: the client software acquires images of each lock, extracts the key features in the lock, and at the same time establishes a point cloud library of each product, which is stored in the client software;
secondly, data testing: the camera is installed on the flange at the end of the robot and confirmed to be mounted at the specified position, and a workpiece is placed on the workbench; when the workpiece reaches the scanning area of the robot, the robot carries the camera in a linear motion to scan the product surface and acquire the surface information of the product;
thirdly, data analysis: when the robot starts to identify and grab the lock, it scans the lock from the appointed position with the line laser 3-dimensional camera held at its end to obtain an image of the lock, which is stored in the client software; the client software matches the image against its stored point cloud library to determine the product model; if the product model exists, the client software matches the key features in the image against the point cloud library to determine the pose of the lock grabbing point, and the client sends the pose to the robot through Socket communication;
and fourthly, the robot moves to the grabbing point position, and the grabbing of the lock is realized.
2. The method for automatic recognition and grasping of three-dimensional irregular objects of a robot according to claim 1, wherein said camera is a line laser 3-dimensional camera.
3. A method for automatically identifying and grabbing a three-dimensional irregular object of a robot, characterized in that the camera specified-position determination method comprises the following steps:
robot camera calibration is mainly used to determine the relation between the camera coordinate system and the position of the flange at the robot end, i.e. to obtain the hand-eye transform ${}^{F}T_{C}$ (notation as above);
step1, printing a visual calibration plate, wherein the plate pattern is generally supplied by the camera manufacturer and printed according to the use requirements;
step2, placing the calibration plate at a flat and open position and ensuring that the robot can reach it in position and posture;
step3, mounting a tool with a needle point at the end of the robot and teaching it, so as to establish the relation ${}^{W}T_{tool}$ between the tool coordinate system and the robot world coordinate system; the tool is mainly used for calibrating the calibration plate;
step4, teaching the workpiece coordinate system of the calibration plate using the needle tip mounted at the robot end, wherein the center point of the coordinate axes is the origin of the workpiece coordinate system and the directions of the coordinate system are as shown in FIG. 2; note that the current workpiece coordinate system is ${}^{W}T_{P}$;
step5, moving the camera to the position directly above the calibration plate so that the laser line projected by the camera is aligned with the X axis of the calibration plate, the distance from the camera to the plate is consistent along the direction of the laser line, and the distance is also consistent along the length direction of the camera, namely the direction perpendicular to the laser line; in other words, the mounting plane of the camera is parallel to the plane of the calibration plate, at which point the camera coordinate system is aligned with the coordinate system of the calibration plate;
step6, recording the position of the robot flange (tool0) aligned in step5, expressed in the workpiece coordinate system ${}^{W}T_{P}$, as the point pAlignMiddle, and from it calculating the pre-start position pAlignPreStart (pAlignPreStart.x = pAlignMiddle.x, pAlignPreStart.y = pAlignMiddle.y - 250) and the end position pAlignEnd (pAlignEnd.x = pAlignMiddle.x, pAlignEnd.y = pAlignMiddle.y + 250) of the next camera movement;
step7, moving the camera laser line of the robot to the pAlignPreStart point (taking ${}^{W}T_{P}$ as the reference frame, likewise below), then starting the robot to move in a straight line from the pAlignPreStart point to the pAlignEnd point at a speed of 100 mm/s; note that the robot accelerates from rest before reaching the constant speed of 100 mm/s, so the robot is made to send the signal that triggers the line laser camera to take the image only after it has travelled 50 mm in a straight line (by which point the acceleration is complete); the trigger point of the signal is the start position of the Y coordinate of the acquired image, and the position of the flange (tool0) at this point is recorded as pAlignStart, with pAlignStart.y = pAlignMiddle.y - 200; note that the Y-direction image length of the line laser camera is set to 350 mm, as shown in FIG. 3;
step8, after the image is taken, checking through the software provided with the line laser camera whether the Z values at several arbitrary positions of the scanned image are consistent, so as to confirm whether the imaging plane of the camera is parallel to the plane of the calibration plate (if the deviation is too large, e.g. close to 1 mm or more, the planes must be adjusted parallel and the operation resumed from step5); then recording the position of the coordinate origin of the calibration plate in the image coordinate system, with coordinate values denoted X, Y and Z; because the camera was kept aligned with the coordinate directions of the calibration plate while the image was taken, the Euler angles of the camera's rotation about the X, Y and Z axes are all 0, thereby obtaining ${}^{C}T_{P}$;
step9, expressing the point locations and coordinate systems as homogeneous transforms:
${}^{W}T_{P}$ represents the relation between the center point of the calibration plate and the robot world coordinate system;
${}^{P}T_{F} = ({}^{W}T_{P})^{-1}\,{}^{W}T_{F}$ represents the relation between the flange at the robot end and the center point of the calibration plate;
${}^{C}T_{P}$ represents the relation between the calibration plate and the vision camera;
${}^{F}T_{C} = ({}^{P}T_{F})^{-1}\,({}^{C}T_{P})^{-1}$
the ${}^{F}T_{C}$ above is the tool coordinate system of the camera, i.e. the conversion relation between the camera coordinate system and the coordinate system of the flange center point.
4. A method for automatically recognizing and grabbing a three-dimensional irregular object by a robot, characterized in that the transformation matrix between the image grab point and the template grab point is obtained, so that the robot grab point taught under the template is converted into the robot grab point under the currently scanned image: the robot scans the target workpiece to obtain the target features of the required image; comparing these features with the features in the template yields a transformation matrix H, and, given the taught grab point ${}^{W}T_{G}^{tpl}$ of the template position, the robot grab point ${}^{W}T_{G}^{cur}$ under the current image is established; the calculation is as follows:
step1: the pose of the camera origin in the base coordinate system is obtained by the following formula:
${}^{W}T_{C} = {}^{W}T_{F}\,{}^{F}T_{C}$    (2-1)
where ${}^{W}T_{F}$ is the flange pose at the point where the camera starts shooting, i.e. the camera origin, and ${}^{F}T_{C}$ is the calibrated transform between the camera origin and the tool-end flange;
step2: solving the grabbing pose of the robot at position 2; combining formula (2-1) with formulas (2-2) and (2-3) below gives formula (2-5), namely the grabbing point of the robot at position 2:
${}^{W}T_{G}^{tpl} = {}^{W}T_{C}^{tpl}\,{}^{C}T_{G}^{tpl}$    (2-2)
${}^{C}T_{G}^{cur} = H\,{}^{C}T_{G}^{tpl}$    (2-3)
${}^{W}T_{G}^{cur} = {}^{W}T_{C}^{cur}\,{}^{C}T_{G}^{cur}$    (2-4)
${}^{W}T_{G}^{cur} = {}^{W}T_{C}^{cur}\,H\,({}^{W}T_{C}^{tpl})^{-1}\,{}^{W}T_{G}^{tpl}$    (2-5)
CN201911346789.5A 2019-12-24 2019-12-24 Method for automatically identifying and grabbing three-dimensional irregular object of robot Active CN111136656B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911346789.5A CN111136656B (en) 2019-12-24 2019-12-24 Method for automatically identifying and grabbing three-dimensional irregular object of robot

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911346789.5A CN111136656B (en) 2019-12-24 2019-12-24 Method for automatically identifying and grabbing three-dimensional irregular object of robot

Publications (2)

Publication Number Publication Date
CN111136656A true CN111136656A (en) 2020-05-12
CN111136656B CN111136656B (en) 2020-12-08

Family

ID=70519705

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911346789.5A Active CN111136656B (en) 2019-12-24 2019-12-24 Method for automatically identifying and grabbing three-dimensional irregular object of robot

Country Status (1)

Country Link
CN (1) CN111136656B (en)

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111595266A (en) * 2020-06-02 2020-08-28 西安航天发动机有限公司 Spatial complex trend catheter visual identification method
CN112549021A (en) * 2020-11-16 2021-03-26 北京配天技术有限公司 Robot control method, robot and storage device
CN113114766A (en) * 2021-04-13 2021-07-13 江苏大学 Potted plant information detection method and device based on ZED camera
CN113510697A (en) * 2021-04-23 2021-10-19 知守科技(杭州)有限公司 Manipulator positioning method, device, system, electronic device and storage medium
CN113770059A (en) * 2021-09-16 2021-12-10 中冶东方工程技术有限公司 Intelligent sorting system and method for steel structure parts
CN114074331A (en) * 2022-01-19 2022-02-22 成都考拉悠然科技有限公司 Disordered grabbing method based on vision and robot
CN114113163A (en) * 2021-12-01 2022-03-01 北京航星机器制造有限公司 Automatic digital ray detection device and method based on intelligent robot
CN115070779A (en) * 2022-08-22 2022-09-20 菲特(天津)检测技术有限公司 Robot grabbing control method and system and electronic equipment
CN115446392A (en) * 2022-10-13 2022-12-09 芜湖行健智能机器人有限公司 Intelligent chamfering system and method for disordered plate
WO2023040095A1 (en) * 2021-09-16 2023-03-23 梅卡曼德(北京)机器人科技有限公司 Camera calibration method and apparatus, electronic device, and storage medium
CN116330306A (en) * 2023-05-31 2023-06-27 之江实验室 Object grabbing method and device, storage medium and electronic equipment

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1293752A (en) * 1999-03-19 2001-05-02 松下电工株式会社 Three-D object recognition method and pin picking system using the method
EP2455198A1 (en) * 2010-11-17 2012-05-23 Samsung Electronics Co., Ltd. Control of robot hand to contact an object
CN105014667A (en) * 2015-08-06 2015-11-04 浙江大学 Camera and robot relative pose calibration method based on pixel space optimization
US9403278B1 (en) * 2015-03-19 2016-08-02 Waterloo Controls Inc. Systems and methods for detecting and picking up a waste receptacle
CN106041937A (en) * 2016-08-16 2016-10-26 河南埃尔森智能科技有限公司 Control method of manipulator grabbing control system based on binocular stereoscopic vision
CN106778790A (en) * 2017-02-15 2017-05-31 苏州博众精工科技有限公司 A kind of target identification based on three-dimensional point cloud and localization method and system
CN108827154A (en) * 2018-07-09 2018-11-16 深圳辰视智能科技有限公司 A kind of robot is without teaching grasping means, device and computer readable storage medium
CN109493384A (en) * 2018-09-20 2019-03-19 顺丰科技有限公司 Camera position and orientation estimation method, system, equipment and storage medium

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1293752A (en) * 1999-03-19 2001-05-02 松下电工株式会社 Three-D object recognition method and pin picking system using the method
EP2455198A1 (en) * 2010-11-17 2012-05-23 Samsung Electronics Co., Ltd. Control of robot hand to contact an object
US9403278B1 (en) * 2015-03-19 2016-08-02 Waterloo Controls Inc. Systems and methods for detecting and picking up a waste receptacle
CN105014667A (en) * 2015-08-06 2015-11-04 浙江大学 Camera and robot relative pose calibration method based on pixel space optimization
CN106041937A (en) * 2016-08-16 2016-10-26 河南埃尔森智能科技有限公司 Control method of manipulator grabbing control system based on binocular stereoscopic vision
CN106778790A (en) * 2017-02-15 2017-05-31 苏州博众精工科技有限公司 A kind of target identification based on three-dimensional point cloud and localization method and system
CN108827154A (en) * 2018-07-09 2018-11-16 深圳辰视智能科技有限公司 A kind of robot is without teaching grasping means, device and computer readable storage medium
CN109493384A (en) * 2018-09-20 2019-03-19 顺丰科技有限公司 Camera position and orientation estimation method, system, equipment and storage medium

Cited By (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111595266A (en) * 2020-06-02 2020-08-28 西安航天发动机有限公司 Spatial complex trend catheter visual identification method
CN112549021A (en) * 2020-11-16 2021-03-26 北京配天技术有限公司 Robot control method, robot and storage device
CN113114766A (en) * 2021-04-13 2021-07-13 江苏大学 Potted plant information detection method and device based on ZED camera
CN113114766B (en) * 2021-04-13 2024-03-08 江苏大学 Potted plant information detection method based on ZED camera
CN113510697A (en) * 2021-04-23 2021-10-19 知守科技(杭州)有限公司 Manipulator positioning method, device, system, electronic device and storage medium
CN113770059A (en) * 2021-09-16 2021-12-10 中冶东方工程技术有限公司 Intelligent sorting system and method for steel structure parts
WO2023040095A1 (en) * 2021-09-16 2023-03-23 梅卡曼德(北京)机器人科技有限公司 Camera calibration method and apparatus, electronic device, and storage medium
CN114113163B (en) * 2021-12-01 2023-12-08 北京航星机器制造有限公司 Automatic digital ray detection device and method based on intelligent robot
CN114113163A (en) * 2021-12-01 2022-03-01 北京航星机器制造有限公司 Automatic digital ray detection device and method based on intelligent robot
CN114074331A (en) * 2022-01-19 2022-02-22 成都考拉悠然科技有限公司 Disordered grabbing method based on vision and robot
CN115070779A (en) * 2022-08-22 2022-09-20 菲特(天津)检测技术有限公司 Robot grabbing control method and system and electronic equipment
CN115070779B (en) * 2022-08-22 2023-03-24 菲特(天津)检测技术有限公司 Robot grabbing control method and system and electronic equipment
CN115446392A (en) * 2022-10-13 2022-12-09 芜湖行健智能机器人有限公司 Intelligent chamfering system and method for disordered plate
CN115446392B (en) * 2022-10-13 2023-08-04 芜湖行健智能机器人有限公司 Intelligent chamfering system and method for unordered plates
CN116330306B (en) * 2023-05-31 2023-08-15 之江实验室 Object grabbing method and device, storage medium and electronic equipment
CN116330306A (en) * 2023-05-31 2023-06-27 之江实验室 Object grabbing method and device, storage medium and electronic equipment

Also Published As

Publication number Publication date
CN111136656B (en) 2020-12-08

Similar Documents

Publication Publication Date Title
CN111136656B (en) Method for automatically identifying and grabbing three-dimensional irregular object of robot
EP3550470B1 (en) Object recognition processing apparatus and method, and object picking apparatus and method
CN108182689B (en) Three-dimensional identification and positioning method for plate-shaped workpiece applied to robot carrying and polishing field
JP5788460B2 (en) Apparatus and method for picking up loosely stacked articles by robot
CN110370286A (en) Dead axle motion rigid body spatial position recognition methods based on industrial robot and monocular camera
US7283661B2 (en) Image processing apparatus
WO2017015898A1 (en) Control system for robotic unstacking equipment and method for controlling robotic unstacking
KR102056664B1 (en) Method for work using the sensor and system for performing thereof
US8379224B1 (en) Prismatic alignment artifact
CN104959989A (en) Elevator door plank feeding positioning method guided through vision
CN110146017B (en) Industrial robot repeated positioning precision measuring method
CN106269548A (en) A kind of object automatic sorting method and device thereof
CN110980276B (en) Method for implementing automatic casting blanking by three-dimensional vision in cooperation with robot
CN114029243B (en) Soft object grabbing and identifying method for sorting robot
CN112361958B (en) Line laser and mechanical arm calibration method
Hsu et al. Development of a faster classification system for metal parts using machine vision under different lighting environments
CN112561886A (en) Automatic workpiece sorting method and system based on machine vision
CN112518748A (en) Automatic grabbing method and system of vision mechanical arm for moving object
CN114913346A (en) Intelligent sorting system and method based on product color and shape recognition
CN114918723B (en) Workpiece positioning control system and method based on surface detection
Hashimoto et al. Current status and future trends on robot vision technology
US20240003675A1 (en) Measurement system, measurement device, measurement method, and measurement program
CN115063670A (en) Automatic sorting method, device and system
JP7174074B2 (en) Image processing equipment, work robots, substrate inspection equipment and specimen inspection equipment
JP6644846B1 (en) Work position and orientation recognition device and picking system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20210330

Address after: 315400 Chengdong new district, Ningbo Economic Development Zone, Zhejiang Province

Patentee after: Zhichang Technology Group Co.,Ltd.

Address before: Room 320, building 1, 358 Huayan village, Nanqiao Town, Fengxian District, Shanghai

Patentee before: SHANGHAI GENE AUTOMATION TECHNOLOGY Co.,Ltd.