CN113378626A - Visual grabbing method for elastic strips - Google Patents
- Publication number
- CN113378626A (application number CN202110434494.4A)
- Authority
- CN
- China
- Prior art keywords
- grabbing
- elastic strip
- strip
- mechanical arm
- elastic
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1679—Programme controls characterised by the tasks executed
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1694—Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
- B25J9/1697—Vision controlled systems
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/22—Matching criteria, e.g. proximity measures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/30—Determination of transform parameters for the alignment of images, i.e. image registration
- G06T7/33—Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/80—Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
Abstract
The invention discloses a visual elastic strip grabbing method, which comprises the following steps: 1) fixing the positions of a fixed scanner and a mechanical arm, wherein the fixed scanner is used for scanning the elastic strips on the conveying belt, and a grabbing mechanism is arranged on the mechanical arm; 2) the grabbing mechanism on the mechanical arm grabs a calibration plate and moves it to two arbitrary poses, while the fixed scanner acquires a point cloud image at each pose; 3) a data processing system calculates the conversion matrix relation between the mechanical arm and the fixed scanner from the correspondence between the two groups of point cloud images; 4) after the fixed scanner scans and obtains the position of an elastic strip, the mechanical arm positions the grabbing mechanism by matching according to the calculated conversion matrix relation and then carries out the grabbing of the elastic strip. The visual elastic strip grabbing method can automatically grab the elastic strips on an assembly line.
Description
Technical Field
The invention relates to elastic strip detection systems, and in particular to a visual elastic strip grabbing method.
Background
During elastic strip production, various characteristic parameters of the strip must be inspected, and the strip may enter the next process only after the corresponding parameters pass inspection. When these parameters are measured, the elastic strip must be placed stably; otherwise the accuracy of the measured data is affected.
To place the strip stably before scanning, the elastic strip must first be grabbed accurately from the production line conveying belt. However, owing to the particular structure of the elastic strip, no suitable visual grabbing method for elastic strips currently exists.
Disclosure of Invention
The invention aims to provide a visual elastic strip grabbing method which can automatically grab elastic strips on a production line.
In order to achieve the purpose, the invention provides a visual elastic strip grabbing method, which comprises the following steps:
1) fixing the positions of a fixed scanner and a mechanical arm, wherein the fixed scanner is used for scanning the elastic strips on the conveying belt, and a grabbing mechanism is arranged on the mechanical arm;
2) the grabbing mechanism on the mechanical arm grabs a calibration plate and moves it to two arbitrary poses, while the fixed scanner acquires a point cloud image at each pose;
3) the data processing system calculates a conversion matrix relation between the mechanical arm and the fixed scanner according to the corresponding relation between the two groups of point cloud images;
4) after the fixed scanner scans and obtains the position of the elastic strip, the mechanical arm matches and positions the grabbing mechanism according to the calculated conversion matrix relation, and then the grabbing work of the elastic strip is carried out.
Preferably, before the elastic strip is grabbed, the grabbing position is edited and preprocessed using the digital model (CAD) data of the elastic strip: local coordinate information of the elastic strip is defined, and the grabbing point position and attitude parameters of the elastic strip are preset in the grabbing system.
Preferably, the fixed scanner identifies the elastic strip by a method combining two-dimensional identification and three-dimensional identification: multiple groups of data of elastic strips on the conveying belt are first collected, a characteristic model of the elastic strip is extracted from the collected data, the elastic strip region in the image is obtained by a two-dimensional data identification method, and the region is then segmented in the three-dimensional data to obtain the three-dimensional data of the target elastic strip.
Preferably, in the matching and positioning process of step 4), a 3DSC (3D Shape Context) method is first adopted for coarse matching, and an ICP (Iterative Closest Point) algorithm is then adopted for fine matching.
Preferably, the conveying control device outputs a precise conveying belt movement signal in real time during the scanning and grabbing process, so that the elastic strip that was located is the same strip that is grabbed.
Preferably, the fixed scanner scans the elastic strips of the scene to obtain data, the data obtained by scanning are analyzed and processed through the data processing system, one of the elastic strips is designated to calculate the position and the posture of the elastic strip, the position and the posture parameters required by the mechanical arm to grab are calculated according to the calibrated conversion matrix relationship, and the position and the posture parameters are transmitted to the mechanical arm control system to carry out grabbing actions.
Preferably, the stationary scanner is a depth camera or a line laser scanner.
Preferably, the fixed scanner is mounted on a gantry obliquely above the trough so that it directly faces the trough on the conveying belt for imaging.
According to the technical scheme, the visual elastic strip grabbing method overcomes the problems in the background technology, can automatically grab the elastic strips on the production line, improves the detection efficiency of the elastic strips, and further improves the production efficiency.
Additional features and advantages of the invention will be set forth in the detailed description which follows.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and together with the description serve to explain the principles of the invention and not to limit the invention. In the drawings:
fig. 1 is a schematic diagram of a preferred embodiment of the elastic strip grabbing system.
Description of the reference numerals
1 mechanical arm 2 grabbing mechanism
3 calibration board 4 fixed scanner
Detailed Description
The following detailed description of embodiments of the invention refers to the accompanying drawings. It should be understood that the detailed description and specific examples, while indicating the present invention, are given by way of illustration and explanation only, not limitation.
In the present invention, unless otherwise specified, directional terms such as "upper, lower, left, right, front, rear, inner and outer" merely indicate orientations in the normal use state, or orientations as colloquially understood by those skilled in the art, and should not be construed as limiting.
Referring to the elastic strip grabbing system shown in fig. 1, the elastic strip visual grabbing method comprises the following steps:
1) fixing the positions of a fixed scanner 4 and a mechanical arm 1, wherein the fixed scanner 4 is used for scanning elastic strips on a conveying belt, and a grabbing mechanism 2 is arranged on the mechanical arm 1;
2) the grabbing mechanism on the mechanical arm 1 grabs the calibration plate 3 and moves it to two arbitrary poses, while the fixed scanner 4 acquires a point cloud image at each pose;
3) the data processing system calculates the conversion matrix relation between the mechanical arm 1 and the fixed scanner 4 according to the corresponding relation between the two groups of point cloud images;
4) after the fixed scanner 4 scans and obtains the position of the elastic strip, the mechanical arm 1 matches and positions the grabbing mechanism 2 according to the calculated conversion matrix relation, and then the grabbing work of the elastic strip is carried out.
Through the implementation of the above steps, the elastic strip visual grabbing method can automatically grab the elastic strips on the production line, which improves the elastic strip detection efficiency and, in turn, the production efficiency. In step 3), for each motion of the calibration plate between the two poses, the transforms obtained from the two sets of coordinates satisfy

A X = X B

where A is the motion observed in the coordinate system of the fixed scanner 4, B is the corresponding motion of the mechanical arm 1, and X is the unknown transform between the mechanical arm 1 and the camera. The coordinate transformation problem therefore reduces to the classical hand-eye calibration equation AX = XB, which can be solved by algebraic methods to obtain the conversion matrix between the mechanical arm 1 and the camera.
In this embodiment, in order to obtain a more accurate grasping effect, it is preferable that, before the elastic strip is grabbed, the grabbing position is edited and preprocessed using the digital model (CAD) data of the elastic strip: local coordinate information of the elastic strip is defined, and the grabbing point position and attitude parameters of the elastic strip are preset in the grabbing system.
In this embodiment, in order to obtain a more precise grasping effect, it is preferable that the fixed scanner 4 identifies the elastic strip by a method combining two-dimensional identification and three-dimensional identification: multiple groups of data of elastic strips on the conveying belt are first collected, a characteristic model of the elastic strip is extracted from the collected data, the elastic strip region in the image is obtained by a two-dimensional data identification method, and the region is then segmented in the three-dimensional data to obtain the three-dimensional data of the target elastic strip.
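The two-stage identification described above can be sketched as follows: the 2D step yields a binary mask over the image, and the mask then cuts the target's points out of an organized (per-pixel) point cloud. A minimal Python/NumPy illustration with synthetic data (the array contents are made up for the example):

```python
import numpy as np

def segment_target(cloud, mask):
    """Cut the 3D points of a detected 2D region out of an organized
    point cloud (H x W x 3, one 3D point per pixel)."""
    assert cloud.shape[:2] == mask.shape
    pts = cloud[mask]                        # (N, 3) points inside the 2D region
    pts = pts[np.isfinite(pts).all(axis=1)]  # drop invalid (NaN) depth pixels
    return pts

# Synthetic example: a 4x4 organized cloud with a 2x2 detected region
cloud = np.arange(48, dtype=float).reshape(4, 4, 3)
cloud[0, 0] = np.nan                         # simulate a depth dropout
mask = np.zeros((4, 4), dtype=bool)
mask[0:2, 0:2] = True                        # "elastic strip" region from the 2D step
target = segment_target(cloud, mask)         # 3 valid 3D points remain
```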
In this embodiment, in order to obtain a more accurate grabbing effect, it is preferable that in the matching and positioning of step 4), coarse matching is performed with the 3DSC (3D Shape Context) method, followed by fine matching with the ICP (Iterative Closest Point) algorithm. 3DSC is a feature description method based on the shape contour: it describes shape features with a histogram in a log-polar coordinate system, and thus reflects the distribution of the sampled points on the contour well. Fine matching with ICP then yields the correspondence between the object in the target scene and its CAD (computer-aided design) model. ICP is essentially an optimal registration method based on least squares: the algorithm repeatedly selects corresponding point pairs and computes the optimal rigid body transformation until the convergence accuracy required for correct registration is met. Since ICP requires a reasonably good initial position, the coarse registration step is performed first to obtain a good initial correspondence.
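A bare-bones point-to-point ICP of the kind described (nearest-neighbour correspondences, least-squares rigid update via SVD) might look like the sketch below; a production implementation would use a k-d tree and a convergence test rather than brute force and a fixed iteration budget, and the demo data are synthetic:

```python
import numpy as np

def best_rigid(src, dst):
    """Least-squares rigid transform (Kabsch/SVD) mapping src -> dst."""
    cs, cd = src.mean(0), dst.mean(0)
    H = (src - cs).T @ (dst - cd)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:               # guard against a reflection
        Vt[-1] *= -1
        R = Vt.T @ U.T
    return R, cd - R @ cs

def icp(src, dst, iters=30):
    """Point-to-point ICP: brute-force nearest-neighbour correspondences,
    then a rigid update, repeated for a fixed iteration budget."""
    R_total, t_total = np.eye(3), np.zeros(3)
    cur = src.copy()
    for _ in range(iters):
        d = np.linalg.norm(cur[:, None] - dst[None], axis=2)
        matched = dst[d.argmin(axis=1)]    # closest dst point for each cur point
        R, t = best_rigid(cur, matched)
        cur = cur @ R.T + t
        R_total, t_total = R @ R_total, R @ t_total + t
    return R_total, t_total

# Demo: a cube of points displaced by a small known rigid motion
src = np.array([[x, y, z] for x in (-1, 1) for y in (-1, 1) for z in (-1, 1)], float)
ang = 0.05
Rz = np.array([[np.cos(ang), -np.sin(ang), 0],
               [np.sin(ang),  np.cos(ang), 0],
               [0, 0, 1]])
t_true = np.array([0.02, -0.01, 0.015])
dst = src @ Rz.T + t_true
R_est, t_est = icp(src, dst)
```

Because plain ICP only converges from a nearby start, a coarse alignment (here, 3DSC feature matching in the patent's pipeline) is needed first; the small displacement in the demo plays that role.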
In the CAD model, an ideal grabbing position and grabbing attitude P_CADModel is selected in advance according to empirical knowledge. Through the calibrated transformation matrices it is converted into the coordinates P_Base in the coordinate system of the body of the mechanical arm 1, i.e. the pose at which the mechanical arm 1 finally grips. After a successful grab, the above steps are repeated continuously to realize the automatic elastic strip grabbing process.
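Mapping the model-frame grasp pose into the robot base frame is a composition of homogeneous transforms. A minimal sketch follows; the names `T_base_cam` and `T_cam_model` are illustrative stand-ins for the calibrated matrices, and the pure-translation values are made up for readability:

```python
import numpy as np

def grasp_in_base(T_base_cam, T_cam_model, P_model):
    """Express a CAD-model grasp pose in the robot base frame by composing
    homogeneous transforms: P_base = T_base_cam @ T_cam_model @ P_model."""
    return T_base_cam @ T_cam_model @ P_model

T_base_cam = np.eye(4)
T_base_cam[:3, 3] = [1.0, 0.0, 0.5]    # camera pose in the base frame (from calibration)
T_cam_model = np.eye(4)
T_cam_model[:3, 3] = [0.2, 0.1, 0.0]   # matched model pose in the camera frame (from ICP)
P_model = np.eye(4)                    # preset grasp pose in the model frame
P_base = grasp_in_base(T_base_cam, T_cam_model, P_model)
```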
In this embodiment, in order to improve grabbing accuracy, it is preferable that the conveying control device outputs a precise conveying belt movement signal in real time during the scanning and grabbing process, so that the elastic strip that was located is the same strip that is grabbed. When the conveying control device outputs a belt-moving signal, grabbing is suspended; when it outputs a belt-stopped signal, grabbing starts.
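The belt/grab interlock amounts to a simple gate on the conveyor signal. A trivial sketch (the signal values are hypothetical; the patent does not specify a signal interface):

```python
def grab_enabled(belt_signal: str) -> bool:
    """Allow grabbing only while the conveyor reports it has stopped, so the
    strip that was located is the same strip that gets grabbed."""
    return belt_signal == "stopped"

assert grab_enabled("stopped")
assert not grab_enabled("moving")
```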
In the embodiment, the fixed scanner 4 scans the elastic strips in the scene to obtain data, analyzes and processes the data obtained by scanning through the data processing system, specifies one of the elastic strips to calculate the position and the posture of the elastic strip, then calculates the position and posture parameters required by the mechanical arm 1 to grab according to the calibrated conversion matrix relationship, and transmits the position and posture parameters to the mechanical arm control system to implement the grabbing action.
In this embodiment, the fixed scanner 4 is preferably a depth camera or a line laser scanner.
In this embodiment, the fixed scanner 4 is preferably mounted on a gantry obliquely above the trough so that it directly faces the trough on the conveying belt for imaging.
The preferred embodiments of the present invention have been described in detail with reference to the accompanying drawings; however, the present invention is not limited to the specific details of the above embodiments. Various simple modifications can be made to the technical solution of the present invention within its technical idea, and such simple modifications fall within the protection scope of the present invention.
It should be noted that the various technical features described in the above embodiments can be combined in any suitable manner without contradiction, and the invention is not described in any way for the possible combinations in order to avoid unnecessary repetition.
In addition, any combination of the various embodiments of the present invention is also possible, and the same should be considered as the disclosure of the present invention as long as it does not depart from the spirit of the present invention.
Claims (8)
1. A visual elastic strip grabbing method is characterized by comprising the following steps:
1) fixing the positions of a fixed scanner (4) and a mechanical arm (1), wherein the fixed scanner (4) is used for scanning elastic strips on a conveying belt, and a grabbing mechanism (2) is arranged on the mechanical arm (1);
2) the grabbing mechanism on the mechanical arm (1) grabs the calibration plate (3) and moves it to two arbitrary poses, while the fixed scanner (4) acquires a point cloud image at each pose;
3) the data processing system calculates a conversion matrix relation between the mechanical arm (1) and the fixed scanner (4) according to the corresponding relation between the two groups of point cloud images;
4) after the fixed scanner (4) scans the position of obtaining the elastic strip, the mechanical arm (1) matches and positions the grabbing mechanism (2) according to the calculated conversion matrix relation, and then the grabbing work of the elastic strip is carried out.
2. The visual elastic strip grabbing method according to claim 1, wherein before the elastic strip is grabbed, the grabbing position is edited and preprocessed using the digital model (CAD) data of the elastic strip so as to define local coordinate information of the elastic strip, and the grabbing point position and attitude parameters of the elastic strip are preset in the grabbing system.
3. The visual elastic strip grabbing method according to claim 2, wherein the fixed scanner (4) identifies the elastic strips by a method combining two-dimensional identification and three-dimensional identification;
the method comprises the steps of firstly collecting multiple groups of data of elastic strips in a conveying belt, extracting a characteristic model of the elastic strips through the collected multiple groups of data, obtaining an elastic strip area in an image by using a two-dimensional data identification method, and then carrying out data segmentation on the area through three-dimensional data to obtain three-dimensional data of an elastic strip target.
4. The visual elastic strip grabbing method according to claim 1, wherein in the matching and positioning process of step 4), a 3DSC (3D Shape Context) method is first adopted for coarse matching, and an ICP (Iterative Closest Point) algorithm is then adopted for fine matching.
5. The visual elastic strip grabbing method according to claim 1, wherein the conveying control device outputs a precise conveying belt movement signal in real time during the scanning and grabbing process, so that the elastic strip that was located is the same strip that is grabbed.
6. The visual elastic strip grabbing method according to claim 1, characterized in that a fixed scanner (4) scans elastic strips in a scene to obtain data, the data obtained by scanning is analyzed and processed through a data processing system, one elastic strip is specified to calculate the position and the posture of the elastic strip, the position and the posture parameters required by grabbing of the mechanical arm (1) are calculated according to a calibrated conversion matrix relationship, and the calculated position and posture parameters are transmitted to a mechanical arm control system to implement grabbing.
7. The visual elastic strip grabbing method according to claim 1, wherein the fixed scanner (4) is a depth camera or a line laser scanner.
8. The visual elastic strip grabbing method according to claim 1, wherein the fixed scanner (4) is mounted on a gantry obliquely above the trough so that it directly faces the trough on the conveying belt for imaging.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110434494.4A CN113378626A (en) | 2021-04-22 | 2021-04-22 | Visual grabbing method for elastic strips |
Publications (1)
Publication Number | Publication Date |
---|---|
CN113378626A true CN113378626A (en) | 2021-09-10 |
Family
ID=77569899
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110434494.4A Pending CN113378626A (en) | 2021-04-22 | 2021-04-22 | Visual grabbing method for elastic strips |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113378626A (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114179095A (en) * | 2022-02-15 | 2022-03-15 | 江苏智仁景行新材料研究院有限公司 | Manipulator precision control system based on three-dimensional visual perception |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106041937A (en) * | 2016-08-16 | 2016-10-26 | 河南埃尔森智能科技有限公司 | Control method of manipulator grabbing control system based on binocular stereoscopic vision |
CN109927036A (en) * | 2019-04-08 | 2019-06-25 | 青岛小优智能科技有限公司 | A kind of method and system of 3D vision guidance manipulator crawl |
CN110509300A (en) * | 2019-09-30 | 2019-11-29 | 河南埃尔森智能科技有限公司 | Stirrup processing feeding control system and control method based on 3D vision guidance |
CN110930456A (en) * | 2019-12-11 | 2020-03-27 | 北京工业大学 | Three-dimensional identification and positioning method of sheet metal part based on PCL point cloud library |
US20200276713A1 (en) * | 2019-02-28 | 2020-09-03 | Intelligrated Headquarters, Llc | Vision calibration system for robotic carton unloading |
CN112476434A (en) * | 2020-11-24 | 2021-03-12 | 新拓三维技术(深圳)有限公司 | Visual 3D pick-and-place method and system based on cooperative robot |
- 2021-04-22: CN application CN202110434494.4A filed, published as CN113378626A (status: Pending)
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||