CN109373898A - A complex part pose estimation system and method based on three-dimensional measurement point cloud - Google Patents
- Publication number: CN109373898A (application number CN201811428771.5A)
- Authority
- CN
- China
- Legal status: Granted (the legal status is an assumption and is not a legal conclusion)
Classifications
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
- G01B11/002—Measuring arrangements characterised by the use of optical techniques for measuring two or more coordinates
Abstract
The invention belongs to the field of automatic measurement and specifically discloses a complex part pose estimation system and method based on three-dimensional measurement point cloud. Pose estimation of a complex part is achieved by the following steps: calibrating the conversion matrix between the end flange of a six-degree-of-freedom industrial robot and the measurement coordinate system of a grating type area array scanner; driving the scanner with the robot to scan the measured workpiece and thereby obtain its three-dimensional point cloud data; converting the point cloud data into the robot base coordinate system to obtain the converted point cloud; and matching the converted point cloud with the three-dimensional design model of the measured workpiece to obtain the pose of the workpiece relative to the robot base coordinate system, thereby completing the pose estimation. The invention has a wide measurement range, can achieve precise measurement of multiple regions and the complete shape of the measured workpiece, and can accurately obtain the pose of the measured workpiece.
Description
Technical Field
The invention belongs to the field of automatic measurement, and particularly relates to a complex part pose estimation system and method based on three-dimensional measurement point cloud.
Background
Robot three-dimensional automatic measurement requires planning a complete collision-free path, and such planning requires knowing the pose of the workpiece with respect to the robot base coordinate system. Traditional workpiece pose estimation methods include manual alignment, which depends on manual operation, is highly random, and gives inaccurate estimates; visual-positioning methods for acquiring the workpiece pose, in turn, are hard to apply to workpieces without obvious features and are computationally complex.
The traditional measuring modes are complex, and accurate workpiece poses are difficult to acquire, so subsequent path planning is biased. Especially when the workpiece is very large, the workpiece coordinate system is difficult to acquire, large deviations occur in subsequent robot measuring point planning, and unexpected collisions between the robot measuring system and the external environment may be caused.
Disclosure of Invention
Aiming at the defects or improvement requirements of the prior art, the invention provides a complex part pose estimation system and method based on three-dimensional measurement point cloud. A six-degree-of-freedom industrial robot drives a grating type area array scanner to perform local feature scanning of the measured workpiece and acquire point cloud data of key parts; the point cloud data is converted into the robot base coordinate system through the hand-eye calibrated matrix to obtain new point cloud data; the design model of the measured workpiece is then matched with the point cloud to acquire the conversion matrix from the design model to the point cloud data, and this conversion matrix is the pose of the measured workpiece relative to the robot base coordinate system.
In order to achieve the above object, according to one aspect of the present invention, a complex part pose estimation system based on three-dimensional measurement point cloud is provided, which includes a six-degree-of-freedom industrial robot, a grating type area array scanner, a measured workpiece, a marker point support frame, and a data processing upper computer, wherein:
the tail end of the six-degree-of-freedom industrial robot clamps the grating type area array scanner and is used for driving the grating type area array scanner to move so as to scan and measure a workpiece to be measured placed on the mark point support frame, blue light emitted by the grating type area array scanner covers the surface of the workpiece to be measured and mark points on the mark point support frame during measurement, and at least 3 public mark points which are not on the same straight line exist in the two-time measurement; the six-degree-of-freedom industrial robot is connected with the data processing upper computer so as to transmit the motion parameters of the six-degree-of-freedom industrial robot to the data processing upper computer in real time;
the grating type area array scanner is arranged at the tail end of the six-degree-of-freedom industrial robot and used for scanning and measuring a workpiece to be measured to obtain three-dimensional point cloud data of the workpiece to be measured, and the grating type area array scanner is connected with the data processing upper computer to transmit the measured data to the processing upper computer in real time;
the data processing upper computer is used for receiving the motion parameters of the six-degree-of-freedom industrial robot and the measurement data of the grating type area array scanner, and calculating to obtain the pose of the measured workpiece under the robot base coordinate system, so that the estimation of the pose of the complex part is completed.
As a further preferred, before measurement, the position relationship between the end flange of the six-degree-of-freedom industrial robot and the measurement coordinate system of the grating type area array scanner is obtained through hand-eye calibration.
More preferably, the distance between adjacent mark points on the mark point support frame is 3 cm to 4 cm.
Preferably, the six-degree-of-freedom industrial robot is connected with the data processing upper computer through a six-degree-of-freedom industrial robot controller.
Preferably, the grating type area array scanner is connected with a six-degree-of-freedom industrial robot controller to achieve synchronization of signal triggering and data acquisition.
According to another aspect of the invention, a complex part pose estimation method based on three-dimensional measurement point cloud is provided, which comprises the following steps:
S1, calibrating the conversion matrix E_T_S between the six-degree-of-freedom industrial robot end flange and the grating type area array scanner measurement coordinate system;
S2, the six-degree-of-freedom industrial robot drives the grating type area array scanner to move so as to scan the workpiece to be measured, thereby obtaining the three-dimensional point cloud data p_i (i = 1, 2, ..., s) of the workpiece;
S3, converting the three-dimensional point cloud data of the workpiece to be measured into the robot base coordinate system to obtain the converted three-dimensional point cloud data B_p_i = B_T_E × E_T_S × p_i,
wherein B_T_E is the pose of the robot at the first measurement;
and S4, matching the converted three-dimensional point cloud data with the three-dimensional design model of the workpiece to be measured to obtain the pose of the workpiece to be measured relative to the robot base coordinate system, thereby finishing the estimation of the pose of the complex part.
Further preferably, in step S4, the pose of the workpiece to be measured with respect to the robot base coordinate system is obtained by the following method:
firstly, calculating the transformation matrix T_k (k = 1, 2, ..., n) from the point cloud to the three-dimensional design model coordinate system after each matching iteration, wherein the number of matching iterations is n;
then, according to the conversion matrices T_k, calculating the pose of the measured workpiece in the robot base coordinate system as (T_n × ... × T_2 × T_1)^(-1), whereby the result is obtained.
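The final computation above, composing the per-iteration matching matrices T_k and inverting the product to express the workpiece pose in the robot base frame, can be sketched as follows (a minimal illustration with 4×4 homogeneous matrices; function and variable names are illustrative, not from the patent):

```python
import numpy as np

def workpiece_pose(match_transforms):
    """Given the per-iteration matching matrices T_1..T_n (each a 4x4
    homogeneous matrix mapping the current point cloud toward the design
    model frame), compose them and invert the product; the inverse expresses
    the workpiece (model) pose in the robot base coordinate system."""
    T_total = np.eye(4)
    for T_k in match_transforms:        # running product T_n ... T_2 T_1
        T_total = T_k @ T_total
    return np.linalg.inv(T_total)
```

With pure translations as matching matrices, the composed pose simply negates the accumulated translation, which is an easy sanity check on the convention.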
Generally, compared with the prior art, the above technical solution conceived by the present invention mainly has the following technical advantages:
1. The invention overcomes the traditional pose estimation methods' reliance on manual measurement: a six-degree-of-freedom industrial robot drives the scanner, enabling remote operation, point cloud acquisition, and workpiece pose calculation. It is safe and reliable, poses no threat to operators, has a wide measurement range, and can accurately measure multiple regions and the complete shape of the measured workpiece.
2. The grating type area array scanner has a mark-point-based automatic splicing function, which avoids the point cloud splicing errors caused by the absolute positioning error of the robot and improves the precision of the measured point cloud. Splicing is realized by pasting mark points around the measured region of the workpiece and ensuring that at least three common mark points are observed from two adjacent viewing angles during measurement, so that complete measured point cloud data can be spliced.
3. The invention solves the problem that manual measurement can hardly obtain accurate position parameters, in particular the rotation matrix R, and acquires the pose of the measured workpiece more accurately by using an accurate hand-eye matrix and a point cloud matching algorithm.
4. The invention has a simple structure, solves well for the pose of a randomly placed workpiece relative to the robot base coordinate system, is applicable to pose estimation of workpieces of the same type, and has strong universality.
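The mark-point splicing described in advantage 2 relies on at least three non-collinear mark points shared by two successive scans. Estimating the rigid transform that stitches scan B into scan A's frame from such correspondences is a standard least-squares problem; a sketch using the SVD (Kabsch) solution follows, with illustrative names not taken from the patent:

```python
import numpy as np

def marker_stitch_transform(markers_a, markers_b):
    """Estimate the rigid transform (R, t) mapping marker coordinates seen in
    scan B onto the same physical markers seen in scan A, so that scan B's
    point cloud can be stitched into scan A's frame.  Requires >= 3
    non-collinear common markers, matching the requirement in the text."""
    A = np.asarray(markers_a, float)        # common markers in scan A's frame
    B = np.asarray(markers_b, float)        # same markers in scan B's frame
    mu_a, mu_b = A.mean(axis=0), B.mean(axis=0)
    H = (B - mu_b).T @ (A - mu_a)           # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.linalg.det(Vt.T @ U.T)])  # guard vs. reflection
    R = Vt.T @ D @ U.T
    t = mu_a - R @ mu_b
    return R, t
```

Applying `(R @ pts.T).T + t` to the whole of scan B then places it in scan A's coordinate system, which is how the total point cloud ends up expressed in the first measurement's frame.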
Drawings
Fig. 1 is a schematic structural diagram of a complex part pose estimation system based on a three-dimensional measurement point cloud according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of a three-dimensional feature of an exemplary workpiece;
FIG. 3 is a schematic diagram of the pose relationship between the scanner and the robot end flange and of the acquisition of the three-dimensional point cloud, where O_B-XYZ is the robot base coordinate system {B}, O_E-XYZ is the robot end flange coordinate system {E}, and O_S-XYZ is the scanner measurement coordinate system {S};
FIG. 4 is a schematic diagram of the point cloud matching process, where o-xyz is the coordinate system of the three-dimensional measurement point cloud, o_0-x_0y_0z_0 is the design model coordinate system, and T is the transformation matrix from the coordinate system of the three-dimensional measurement point cloud to the model coordinate system.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention. In addition, the technical features involved in the embodiments of the present invention described below may be combined with each other as long as they do not conflict with each other.
As shown in fig. 1, a complex part pose estimation system based on three-dimensional measurement point cloud according to an embodiment of the present invention includes a six-degree-of-freedom industrial robot 100, a workpiece 200 to be measured, a grating type area array scanner 300, a marker point support 400, and a data processing upper computer 500.
The end of the six-degree-of-freedom industrial robot 100 clamps the grating type area array scanner 300, the six-degree-of-freedom industrial robot 100 moves in a working space according to a certain track to drive the grating type area array scanner 300 to move, the grating type area array scanner 300 scans and measures a measured workpiece 200 placed on a mark point support frame 400, during measurement, blue light emitted by the grating type area array scanner 300 needs to be ensured to cover the surface of the measured workpiece and a part of mark points on the mark point support frame, and at least 3 public mark points which are not on the same straight line need to be ensured to exist during two times of measurement, so as to be used for splicing point clouds; the six-degree-of-freedom industrial robot 100 is connected to the data processing upper computer 500, so that the motion parameters (including the position, attitude, motion state, and other parameters) of the six-degree-of-freedom industrial robot 100 are transmitted to the data processing upper computer 500 in real time.
The grating type area array scanner 300 is installed at the end of the six-degree-of-freedom industrial robot 100 and, driven by the robot, scans and measures the region of interest of the workpiece to obtain its surface point cloud; the scanner 300 is also connected with the data processing upper computer 500 to transmit the measured three-dimensional point cloud data to the upper computer 500 in real time. The measuring principle of the grating type area array scanner 300 is binocular stereo vision three-dimensional measurement based on triangulation: the scanner projects reference gratings with different phase shifts onto the measured object, two cameras capture, from two viewing angles, the deformed grating images modulated by the object surface, the phase value of each pixel is calculated from the deformed gratings, and the three-dimensional point cloud coordinates of the object are computed from the phase values; the grating type area array scanner thereby obtains the three-dimensional point cloud data of the measured workpiece. Specifically, the scanner communicates with the upper computer data processing software through the switch 700 to complete the collection and transmission of point cloud data; the scanner may be, for example, the ATOS Compact Scan of Germany's GOM company or the PowerScan scanner.
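As a generic illustration of the per-pixel phase calculation described above (a textbook sketch, not the vendor's actual algorithm), four-step phase shifting with shifts 0, π/2, π, 3π/2 and the intensity model I_k = A + B·cos(φ + δ_k) recovers the wrapped phase as φ = atan2(I_4 − I_2, I_1 − I_3):

```python
import numpy as np

def wrapped_phase(i1, i2, i3, i4):
    """Per-pixel wrapped phase from four grating images with phase shifts
    0, pi/2, pi, 3*pi/2, assuming I_k = A + B*cos(phi + delta_k).
    With these shifts, I4 - I2 = 2B*sin(phi) and I1 - I3 = 2B*cos(phi)."""
    return np.arctan2(np.asarray(i4) - np.asarray(i2),
                      np.asarray(i1) - np.asarray(i3))
```

The inputs can be full camera images (2-D arrays); the background intensity A and modulation B cancel out of the ratio, which is why phase shifting is robust to ambient light.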
The data processing upper computer 500 is used for receiving the motion parameters of the six-degree-of-freedom industrial robot 100 and the measurement data of the grating type area array scanner 300, and calculating to obtain the pose of the measured workpiece under the robot base coordinate system, so as to complete the estimation of the pose of the complex part. The upper computer data processing software in the data processing upper computer 500 has the functions of point cloud data acquisition, point cloud splicing, point cloud simplification, point cloud position conversion, point cloud-design model matching and the like, and can realize coordinate conversion (conversion to a six-degree-of-freedom industrial robot base coordinate system) and point cloud matching functions based on measured point cloud, so that the relative pose of the measured workpiece relative to the six-degree-of-freedom industrial robot base coordinate system is output, namely the pose of the measured workpiece under the robot base coordinate system is calculated according to the received motion parameters of the six-degree-of-freedom industrial robot 100 and the measurement data of the grating type area array scanner 300, and the pose is obtained.
The point cloud data acquisition obtains the coordinates p_i (i = 1, 2, ..., r, ..., s) of points on the measured workpiece in the scanner measurement coordinate system using the stereo vision three-dimensional measurement technology. The point cloud splicing uses the common mark points between successive measurements to splice the 2nd, 3rd, ..., sth point clouds into the 1st point cloud; the total point cloud obtained in this way is expressed in the scanner measurement coordinate system of the first measurement of the six-degree-of-freedom industrial robot. Point cloud simplification uniformly samples the large-scale point cloud initially measured by the robot optical measurement system to reduce the data scale and improve data processing and calculation efficiency. The point cloud position conversion uses the relative pose E_T_S between the grating type area array scanner and the robot end flange, the pose B_T_E of the end flange relative to the robot base coordinate system at the first measurement, and the measured point cloud p_i (i = 1, 2, ..., r, ..., s), and converts the point cloud into the six-degree-of-freedom industrial robot base coordinate system by the formula B_p_i = B_T_E × E_T_S × p_i. The point cloud-design model matching aligns the point cloud converted into the robot base coordinate system and the design model into the same coordinate system to obtain a conversion matrix, and this conversion matrix is the pose of the measured workpiece in the robot base coordinate system.
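The point cloud position conversion described above chains two homogeneous transforms; a minimal sketch follows (B_T_E is the flange pose in the base frame at the first measurement, E_T_S the hand-eye matrix; function and variable names are illustrative):

```python
import numpy as np

def to_base_frame(points, T_base_flange, T_flange_scanner):
    """Convert scanner-frame points p_i into the robot base frame via
    B_p_i = (B_T_E)(E_T_S) p_i, using 4x4 homogeneous matrices."""
    P = np.hstack([np.asarray(points, float),
                   np.ones((len(points), 1))])      # s x 4 homogeneous points
    T = T_base_flange @ T_flange_scanner            # chained 4x4 transform
    return (T @ P.T).T[:, :3]                       # back to s x 3
```

Chaining the matrices once and then applying the product to all points avoids transforming each point twice, which matters for large measured clouds.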
The positional relationship between the end flange of the six-degree-of-freedom industrial robot 100 and the measurement coordinate system of the grating type area array scanner 300 before measurement is obtained by hand-eye calibration, which is the prior art and is not described herein again. In the process of acquiring the point cloud of the workpiece to be measured 200, the blue light of the grating type area array scanner 300 covers the workpiece and part of the mark points, so that the subsequent point cloud can be spliced.
Specifically, the six-degree-of-freedom industrial robot 100 is connected with the data processing upper computer 500 through the six-degree-of-freedom industrial robot controller 600, the grating type area array scanner 300 is connected with the six-degree-of-freedom industrial robot controller 600, and signal triggering of upper computer software and synchronization of part point cloud data acquisition are achieved through control of the six-degree-of-freedom industrial robot controller 600.
The six-degree-of-freedom industrial robot 100 and the grating type area array scanner 300 are connected to the same data processing upper computer 500, pose parameters including angles of six axes and end positions of the robot during first measurement are recorded in the measurement process, and measurement data are synchronized to the data processing upper computer 500.
The height of the mark point support frame must not exceed the lowest height of the measured workpiece 200 when assembled, so that it neither affects the point cloud of the measured workpiece nor interferes with the subsequent point cloud matching process; meanwhile, the distance between mark points should be 3 cm to 4 cm, so that the point clouds of two successive measurements can be spliced while avoiding too many mark points within one measuring range.
During actual operation, first ensure that the six-degree-of-freedom industrial robot, the area array scanner, and the upper computer processing software are all running. The pose of the robot during measurement is controlled through the robot teach pendant; each measurement point must satisfy the measurement requirements, and any two successive measurements must share at least 3 mark points that are not on the same straight line, so that point cloud splicing can be completed. After the measured point cloud data has been acquired, the point cloud is simplified with the upper computer data processing software to improve the point cloud quality and reduce the data volume; the hand-eye calibration parameters and the first robot measurement pose are input, and the measured point cloud is converted into the robot base coordinate system. Point cloud-design model matching is then performed to match the point cloud into the design model coordinate system while recording the conversion matrix T_k of each matching iteration. After matching is completed, (T_n × ... × T_2 × T_1)^(-1) is calculated to obtain the pose of the measured part relative to the six-degree-of-freedom industrial robot base coordinate system.
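The point cloud simplification step mentioned above can be realized, for example, by voxel-grid downsampling, which keeps one centroid per voxel; this is one common choice rather than the patent's specific method, and the voxel size is an assumed parameter:

```python
import numpy as np

def voxel_downsample(points, voxel=5.0):
    """Uniformly thin a point cloud by replacing all points that fall into
    the same cubic voxel (edge length `voxel`, e.g. in mm) with their
    centroid.  Reduces the data scale before matching."""
    pts = np.asarray(points, float)
    keys = np.floor(pts / voxel).astype(np.int64)   # integer voxel index per point
    _, inv = np.unique(keys, axis=0, return_inverse=True)
    counts = np.bincount(inv).astype(float)
    out = np.zeros((int(inv.max()) + 1, 3))
    for d in range(3):                              # per-voxel centroid
        out[:, d] = np.bincount(inv, weights=pts[:, d]) / counts
    return out
```

Centroid averaging also suppresses some measurement noise, at the cost of slightly smoothing fine surface features.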
The invention discloses a complex part pose estimation system based on three-dimensional measurement point cloud, which is used for estimating the pose of a complex part. The basic idea is to obtain the conversion matrix between the grating type area array scanner and the six-degree-of-freedom industrial robot through hand-eye calibration, convert the workpiece measurement point cloud into the base coordinate system of the robot, and match the point cloud with the workpiece three-dimensional design model through three-dimensional point cloud matching to obtain the conversion matrix. The method specifically comprises the following steps:
S1, calibrating the conversion matrix E_T_S = [R t; 0 1] between the six-degree-of-freedom industrial robot end flange and the grating type area array scanner measurement coordinate system, wherein R is the rotation matrix of the scanner measurement coordinate system relative to the robot end flange and t is the translation vector of the scanner measurement coordinate system relative to the robot end flange; the calibration specifically adopts hand-eye calibration, which is prior art and is not repeated here;
S2, the six-degree-of-freedom industrial robot drives the grating type area array scanner to move so as to scan the workpiece to be measured, thereby obtaining the three-dimensional point cloud data p_i (i = 1, 2, ..., r, ..., s) of the workpiece; that is, the three-dimensional point cloud data is obtained by the grating type area array scanner, which is prior art and is not described here again;
S3, converting the three-dimensional point cloud data of the workpiece to be measured into the robot base coordinate system to obtain the converted three-dimensional point cloud data B_p_i = B_T_E × E_T_S × p_i (i = 1, 2, ..., r, ..., s),
wherein B_T_E is the pose of the robot at the first measurement, i.e. the conversion pose from the robot end flange to the robot base coordinate system at the first measurement, which can be calculated from the robot motion parameters as the product of the successive link transforms, each link transform being the conversion matrix from robot link j-1 to link j;
S4, matching the converted three-dimensional point cloud data with the three-dimensional design model of the workpiece to obtain the pose of the workpiece relative to the robot base coordinate system, thereby completing the estimation of the pose of the complex part. Specifically, after each matching iteration, the conversion matrix T_k from the point cloud to the three-dimensional design model coordinate system is calculated through the ICP matching algorithm; with the number of matching iterations set to n, the pose of the measured workpiece in the robot base coordinate system is then calculated from the conversion matrices T_k as (T_n × ... × T_2 × T_1)^(-1), whereby the result is obtained.
Assume the point cloud of the three-dimensional design model of the workpiece is Q, containing points q_i (i = 1, 2, ..., l), and let P denote the converted three-dimensional point cloud data, containing points B_p_i (i = 1, 2, ..., r, ..., s). T_k is solved as follows:
S41, for every point B_p_i in P, search Q for its corresponding closest point q_i, and calculate the centroids μ_P, μ_Q and the coordinate differences p_i' = B_p_i - μ_P, q_i' = q_i - μ_Q;
S42, calculate the 3 × 3 covariance matrix H from the point sets P and Q as H = Σ_i p_i' (q_i')^T, wherein H_ij denotes the element in the i-th row and j-th column of H;
S43, construct the 4 × 4 symmetric matrix W from H:
W = [ tr(H), Δ^T ; Δ, H + H^T - tr(H)·I_3 ], where Δ = [H_23 - H_32, H_31 - H_13, H_12 - H_21]^T;
S44, calculate the eigenvalues of W, take the unit eigenvector q = [q_0, q_1, q_2, q_3]^T corresponding to the maximum eigenvalue, and from it solve the rotation matrix R and the translation vector t:
R = [ q_0^2+q_1^2-q_2^2-q_3^2, 2(q_1q_2-q_0q_3), 2(q_1q_3+q_0q_2) ; 2(q_1q_2+q_0q_3), q_0^2-q_1^2+q_2^2-q_3^2, 2(q_2q_3-q_0q_1) ; 2(q_1q_3-q_0q_2), 2(q_2q_3+q_0q_1), q_0^2-q_1^2-q_2^2+q_3^2 ]
t = μ_Q - R × μ_P
which together give the transformation matrix T_k = [R t; 0 1].
After obtaining T_k, use T_k to solve the matched positions p_i' = T_k × p_i (i = 1, 2, ..., r, ..., s); at the next matching iteration update p_i = p_i', then repeat steps S41-S44 until n matching iterations have been performed to improve the matching accuracy. Then, according to the transformation matrices T_k, the pose of the measured workpiece in the robot base coordinate system is calculated as (T_n × ... × T_2 × T_1)^(-1), where T_1 is the transformation matrix obtained in the first matching, T_2 the one obtained in the second matching, and so on, with T_n obtained in the n-th matching.
The method can estimate the pose of a sealing-surface flange sample of a certain model: after a series of coordinate transformations of the measured three-dimensional point cloud data, the pose of the workpiece in the robot base coordinate system is obtained, after which subsequent robot path planning and simulation can be carried out.
It will be understood by those skilled in the art that the foregoing is only a preferred embodiment of the present invention, and is not intended to limit the invention, and that any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention should be included in the scope of the present invention.
Claims (7)
1. A complex part pose estimation system based on three-dimensional measurement point cloud, characterized by comprising a six-degree-of-freedom industrial robot (100), a grating type area array scanner (300), a measured workpiece (200), a mark point support frame (400) and a data processing upper computer (500), wherein:
the tail end of the six-degree-of-freedom industrial robot (100) clamps the grating type area array scanner (300) and is used for driving the grating type area array scanner (300) to move so as to scan and measure a workpiece (200) to be measured placed on the mark point support frame (400); during measurement, blue light emitted by the grating type area array scanner (300) covers the surface of the workpiece to be measured and mark points on the mark point support frame, and at least 3 common mark points which are not on the same straight line exist between any two successive measurements; the six-degree-of-freedom industrial robot (100) is connected with the data processing upper computer (500) so as to transmit the motion parameters of the six-degree-of-freedom industrial robot (100) to the data processing upper computer (500) in real time;
the grating type area array scanner (300) is arranged at the tail end of the six-degree-of-freedom industrial robot (100) and used for scanning and measuring a workpiece to be measured (200) to obtain three-dimensional point cloud data of the workpiece to be measured, and the grating type area array scanner (300) is connected with the data processing upper computer (500) to transmit the measurement data to the processing upper computer (500) in real time;
the data processing upper computer (500) is used for receiving the motion parameters of the six-degree-of-freedom industrial robot (100) and the measurement data of the grating type area array scanner (300), and calculating to obtain the pose of the measured workpiece under the robot base coordinate system, so that the estimation of the pose of the complex part is completed.
2. The complex part pose estimation system based on three-dimensional measurement point cloud as claimed in claim 1, characterized in that the position relation between the end flange of the six-degree-of-freedom industrial robot (100) and the measurement coordinate system of the grating type area array scanner (300) is obtained by hand-eye calibration before measurement.
3. The complex part pose estimation system based on three-dimensional measurement point cloud of claim 1, wherein the distance between adjacent mark points on the mark point support frame (400) is 3 cm to 4 cm.
4. The complex part pose estimation system based on three-dimensional measurement point cloud of claim 1, characterized in that the six-degree-of-freedom industrial robot (100) is connected with the data processing upper computer (500) through a six-degree-of-freedom industrial robot controller (600).
5. The complex part pose estimation system based on three-dimensional measurement point cloud of claim 4, characterized in that the raster type area array scanner (300) is connected with a six-degree-of-freedom industrial robot controller (600) to realize the synchronization of data processing upper computer signal triggering and raster type area array scanner data acquisition.
6. A complex part pose estimation method based on three-dimensional measurement point cloud is characterized by being carried out by the system according to any one of claims 1-5, and comprising the following steps:
S1, calibrating the conversion matrix T_c = [R, t; 0, 1] between the tail end flange plate of the six-degree-of-freedom industrial robot and the measurement coordinate system of the grating type area array scanner, wherein R is the rotation matrix of the measurement coordinate system of the grating type area array scanner relative to the tail end flange plate of the six-degree-of-freedom industrial robot, and t is the translation vector of the measurement coordinate system of the grating type area array scanner relative to the tail end flange plate of the six-degree-of-freedom industrial robot;
S2, the six-degree-of-freedom industrial robot drives the grating type area array scanner to move so as to scan the workpiece to be measured, thereby obtaining the three-dimensional point cloud data p_i (i = 1, 2, ..., s) of the workpiece to be measured;
S3, converting the three-dimensional point cloud data of the workpiece to be measured into the robot base coordinate system to obtain the converted three-dimensional point cloud data p_i' = T_i·T_c·p_i, wherein T_i is the pose of the robot at the i-th measurement;
and S4, matching the converted three-dimensional point cloud data with the three-dimensional design model of the workpiece to be measured to obtain the pose of the workpiece to be measured relative to the robot base coordinate system, thereby finishing the estimation of the pose of the complex part.
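Steps S1–S3 amount to assembling the hand-eye conversion matrix and chaining it with the robot pose at each scan; below is a minimal, non-authoritative sketch in NumPy (the helper names `homogeneous` and `to_base_frame` are illustrative assumptions, not part of the patent):

```python
import numpy as np

def homogeneous(R, t):
    """Assemble the 4x4 conversion matrix [R t; 0 1] of step S1 from a
    3x3 rotation matrix R and a 3-vector translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def to_base_frame(points, T_robot, T_c):
    """Step S3: map scanner-frame points into the robot base frame,
    p_i' = T_i . T_c . p_i.
    points  -- (N, 3) cloud in the scanner measurement frame (step S2)
    T_robot -- 4x4 pose T_i of the robot flange in the base frame
    T_c     -- 4x4 hand-eye conversion matrix from step S1"""
    homog = np.hstack([points, np.ones((len(points), 1))])  # (N, 4)
    return (T_robot @ T_c @ homog.T).T[:, :3]

# Example: scanner offset 1 unit along x from the flange, robot at the base origin.
T_c = homogeneous(np.eye(3), np.array([1.0, 0.0, 0.0]))
cloud = np.array([[0.0, 0.0, 0.0], [0.0, 1.0, 0.0]])
print(to_base_frame(cloud, np.eye(4), T_c))  # each point shifted by +1 in x
```

The converted cloud is then matched against the three-dimensional design model in step S4 (for instance by ICP-style registration).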
7. The method for estimating the pose of a complex part based on three-dimensional measurement point cloud according to claim 6, wherein the pose of the measured workpiece relative to the robot base coordinate system is obtained in step S4 by adopting the following method:
firstly, after each matching is finished, calculating the transformation matrix T_k (k = 1, 2, ..., n) from the point cloud to the coordinate system of the three-dimensional design model, wherein n is the number of matchings;
then, according to the conversion matrix T_k corresponding to each matching, calculating the pose of the measured workpiece under the robot base coordinate system as T = (T_n·T_{n-1}·...·T_1)^(-1), thus obtaining the result.
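Under the reading that each matching iteration k yields a conversion matrix T_k that moves the current cloud toward the design-model frame, the workpiece pose is the inverse of the accumulated product T_n···T_1. A hedged sketch (the function name `workpiece_pose` is an assumption, not from the patent):

```python
import numpy as np

def workpiece_pose(match_transforms):
    """Accumulate the per-match conversion matrices T_1..T_n (each 4x4,
    applied in matching order) into T = T_n @ ... @ T_1, then invert:
    the result is the pose of the measured workpiece expressed in the
    robot base coordinate system."""
    T = np.eye(4)
    for T_k in match_transforms:
        T = T_k @ T  # each match further aligns the cloud to the model
    return np.linalg.inv(T)

# Example with two pure translations: the cloud ends up shifted by (1, 2, 0),
# so the workpiece pose carries the opposite offset (-1, -2, 0).
T1 = np.eye(4); T1[0, 3] = 1.0
T2 = np.eye(4); T2[1, 3] = 2.0
print(workpiece_pose([T1, T2])[:3, 3])
```

In practice each T_k would come from a registration step such as ICP between the converted measurement cloud and the design model.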
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201811428771.5A CN109373898B (en) | 2018-11-27 | 2018-11-27 | Complex part pose estimation system and method based on three-dimensional measurement point cloud |
Publications (2)
Publication Number | Publication Date |
---|---|
CN109373898A true CN109373898A (en) | 2019-02-22 |
CN109373898B CN109373898B (en) | 2020-07-10 |
Family
ID=65377364
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201811428771.5A Active CN109373898B (en) | 2018-11-27 | 2018-11-27 | Complex part pose estimation system and method based on three-dimensional measurement point cloud |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN109373898B (en) |
Cited By (25)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110310331A (en) * | 2019-06-18 | 2019-10-08 | 哈尔滨工程大学 | A kind of position and orientation estimation method based on linear feature in conjunction with point cloud feature |
CN110434679A (en) * | 2019-07-25 | 2019-11-12 | 王东 | A kind of Intelligent Machining method for the workpiece with random size error |
CN110434671A (en) * | 2019-07-25 | 2019-11-12 | 王东 | A kind of cast member surface machining track calibration method based on pattern measurement |
CN110553584A (en) * | 2019-08-30 | 2019-12-10 | 长春理工大学 | Measuring tool, automatic measuring system and measuring method for small-sized complex parts |
CN110634185A (en) * | 2019-07-31 | 2019-12-31 | 众宏(上海)自动化股份有限公司 | Visual algorithm for quickly forming point cloud for gear repair |
CN110634161A (en) * | 2019-08-30 | 2019-12-31 | 哈尔滨工业大学(深圳) | Method and device for quickly and accurately estimating pose of workpiece based on point cloud data |
CN110640585A (en) * | 2019-10-25 | 2020-01-03 | 华中科技大学 | Three-dimensional non-contact measuring device and method for blade grinding and polishing |
CN111043963A (en) * | 2019-12-31 | 2020-04-21 | 芜湖哈特机器人产业技术研究院有限公司 | Three-dimensional scanning system measuring method of carriage container based on two-dimensional laser radar |
CN111551111A (en) * | 2020-05-13 | 2020-08-18 | 华中科技大学 | Part feature robot rapid visual positioning method based on standard ball array |
CN112161619A (en) * | 2020-09-16 | 2021-01-01 | 杭州思锐迪科技有限公司 | Pose detection method, three-dimensional scanning path planning method and detection system |
CN112307562A (en) * | 2020-10-30 | 2021-02-02 | 泉州装备制造研究所 | Method for assembling complex parts on large-scale airplane by combining thermal deformation and gravity deformation |
CN112577447A (en) * | 2020-12-07 | 2021-03-30 | 新拓三维技术(深圳)有限公司 | Three-dimensional full-automatic scanning system and method |
CN112828878A (en) * | 2019-11-22 | 2021-05-25 | 中国科学院沈阳自动化研究所 | Three-dimensional measurement and tracking method for large-scale equipment in butt joint process |
CN112828552A (en) * | 2021-01-29 | 2021-05-25 | 华中科技大学 | Intelligent butt joint method and system for flange parts |
CN113386136A (en) * | 2021-06-30 | 2021-09-14 | 华中科技大学 | Robot posture correction method and system based on standard spherical array target estimation |
CN113894785A (en) * | 2021-10-27 | 2022-01-07 | 华中科技大学无锡研究院 | Control method, device and system for in-situ measurement and processing of blades of water turbine |
CN114036666A (en) * | 2021-11-04 | 2022-02-11 | 山西汾西重工有限责任公司 | Method for predicting wall thickness deviation of casting part |
CN114279326A (en) * | 2021-12-22 | 2022-04-05 | 易思维(天津)科技有限公司 | Global positioning method of three-dimensional scanning equipment |
CN114417616A (en) * | 2022-01-20 | 2022-04-29 | 青岛理工大学 | Digital twin modeling method and system for assembly robot teleoperation environment |
CN114459377A (en) * | 2022-02-10 | 2022-05-10 | 中国航发沈阳发动机研究所 | Device and method for measuring blade profile of aircraft engine |
CN114485488A (en) * | 2021-07-13 | 2022-05-13 | 北京航天计量测试技术研究所 | Automatic measuring system and measuring method for exhaust area of turbine guider |
CN114742883A (en) * | 2022-03-30 | 2022-07-12 | 华中科技大学 | Automatic assembly method and system based on plane type workpiece positioning algorithm |
CN115049730A (en) * | 2022-05-31 | 2022-09-13 | 北京有竹居网络技术有限公司 | Part assembling method, part assembling device, electronic device and storage medium |
US11644296B1 (en) | 2021-12-17 | 2023-05-09 | Industrial Technology Research Institute | 3D measuring equipment and 3D measuring method |
CN114417616B (en) * | 2022-01-20 | 2024-11-05 | 青岛理工大学 | Digital twin modeling method and system for teleoperation environment of assembly robot |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101097131A (en) * | 2006-06-30 | 2008-01-02 | 廊坊智通机器人系统有限公司 | Method for marking workpieces coordinate system |
US20090234502A1 (en) * | 2008-03-12 | 2009-09-17 | Denso Wave Incorporated | Apparatus for determining pickup pose of robot arm with camera |
CN101566461A (en) * | 2009-05-18 | 2009-10-28 | 西安交通大学 | Method for quickly measuring blade of large-sized water turbine |
JP4649554B1 (en) * | 2010-02-26 | 2011-03-09 | 株式会社三次元メディア | Robot controller |
CN106370106A (en) * | 2016-09-30 | 2017-02-01 | 上海航天精密机械研究所 | Industrial robot and linear guide rail-combined linear laser scanning measurement system and method |
CN106959080A (en) * | 2017-04-10 | 2017-07-18 | 上海交通大学 | A kind of large complicated carved components three-dimensional pattern optical measuring system and method |
CN107270833A (en) * | 2017-08-09 | 2017-10-20 | 武汉智诺维科技有限公司 | A kind of complex curved surface parts three-dimension measuring system and method |
Non-Patent Citations (2)
Title |
---|
YANG Guang et al.: "Automated three-dimensional precision measurement technology and its application in the forging field", Forging & Stamping Technology * |
YANG Shourui: "Automated measurement methods and technologies for complex curved surfaces of large components", China Doctoral Dissertations Full-text Database, Information Science and Technology * |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN109373898B (en) | Complex part pose estimation system and method based on three-dimensional measurement point cloud | |
CN111775146B (en) | Visual alignment method under industrial mechanical arm multi-station operation | |
CN109927036A (en) | A kind of method and system of 3D vision guidance manipulator crawl | |
US10585167B2 (en) | Relative object localization process for local positioning system | |
CN110202573B (en) | Full-automatic hand-eye calibration and working plane calibration method and device | |
CN103895023B (en) | A kind of tracking measurement method of the mechanical arm tail end tracing measurement system based on coding azimuth device | |
JP5371927B2 (en) | Coordinate system calibration method and robot system | |
CN112325796A (en) | Large-scale workpiece profile measuring method based on auxiliary positioning multi-view point cloud splicing | |
JP2022516852A (en) | Robot visual guidance method and device by integrating overview vision and local vision | |
Mi et al. | A vision-based displacement measurement system for foundation pit | |
JP2004508954A (en) | Positioning device and system | |
CN106323286B (en) | A kind of robot coordinate system and the transform method of three-dimensional measurement coordinate system | |
CN110171009A (en) | A kind of robot handheld teaching apparatus based on stereoscopic vision | |
CN112958960B (en) | Robot hand-eye calibration device based on optical target | |
CN115284292A (en) | Mechanical arm hand-eye calibration method and device based on laser camera | |
CN114964213A (en) | Building engineering construction positioning system and method based on attitude perception and visual scanning | |
CN109773589B (en) | Method, device and equipment for online measurement and machining guidance of workpiece surface | |
Li et al. | Extrinsic calibration of non-overlapping multi-camera system with high precision using circular encoded point ruler | |
Chen et al. | Heterogeneous multi-sensor calibration based on graph optimization | |
CN113838147B (en) | Blade assembly visual guiding method and system based on depth camera | |
CN115847491A (en) | Space on-orbit maintenance robot target autonomous measurement method | |
CN109059761A (en) | A kind of hand-held target head calibration method based on EIV model | |
Van Toan et al. | A Single 2D LiDAR Extrinsic Calibration for Autonomous Mobile Robots | |
TW201923498A (en) | Control method of self-propelled equipment achieving the aim of improving the location precision of the self-propelled equipment by utilizing an optical tracing technology | |
Yoder et al. | Experiments comparing precision of stereo-vision approaches for control of an industrial manipulator |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
EE01 | Entry into force of recordation of patent licensing contract | ||
Application publication date: 20190222 Assignee: WUHAN POWER3D TECHNOLOGY Ltd. Assignor: HUAZHONG University OF SCIENCE AND TECHNOLOGY Contract record no.: X2022420000110 Denomination of invention: A pose estimation system and method for complex parts based on 3d measurement point cloud Granted publication date: 20200710 License type: Common License Record date: 20220930 |