CN109373898A - A system and method for pose estimation of complex parts based on 3D measurement point cloud - Google Patents
A system and method for pose estimation of complex parts based on 3D measurement point cloud
- Publication number
- CN109373898A (application number CN201811428771.5A)
- Authority
- CN
- China
- Prior art keywords
- point cloud
- workpiece
- pose
- measured
- degree
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
- G01B11/002—Measuring arrangements characterised by the use of optical techniques for measuring two or more coordinates
Abstract
The invention belongs to the field of automated measurement and specifically discloses a system and method for pose estimation of complex parts based on a three-dimensional measurement point cloud. Pose estimation of a complex part is realized through the following steps: calibrate the transformation matrix between the end flange of a six-degree-of-freedom industrial robot and the measurement coordinate system of a raster area array scanner; drive the raster area array scanner with the six-degree-of-freedom industrial robot to scan the workpiece under test and acquire its three-dimensional point cloud data; transform the three-dimensional point cloud data of the workpiece into the robot base coordinate system to obtain the transformed point cloud data; and match the transformed point cloud data against the three-dimensional design model of the workpiece to obtain the pose of the workpiece relative to the robot base coordinate system, thereby completing the pose estimation of the complex part. The invention has a wide measurement range, enables accurate measurement of multiple regions and of the complete topography of the workpiece under test, and accurately obtains the pose of the workpiece.
Description
Technical Field
The invention belongs to the field of automated measurement and, more particularly, relates to a system and method for pose estimation of complex parts based on a three-dimensional measurement point cloud.
Background Art
Robotic three-dimensional automatic measurement requires planning a complete, collision-free path, and path planning requires the pose of the workpiece relative to the robot base coordinate system. Traditional workpiece pose estimation methods include manual alignment, which depends on manual operation, is arbitrary and gives inaccurate estimates. Another approach uses visual positioning to obtain the workpiece pose by matching images against features of the workpiece; it is unsuitable for workpieces without distinctive features and is computationally complex.
These traditional measurement approaches are not only complicated but also make it difficult to obtain an accurate workpiece pose, causing deviations in subsequent path planning. Especially when the workpiece is very large, it is very difficult to establish the workpiece coordinate system, so large deviations appear in the subsequent planning of the robot measurement points, which may lead to unexpected collisions between the robot measurement system and the surrounding environment.
Summary of the Invention
In view of the above defects or improvement needs of the prior art, the present invention provides a system and method for pose estimation of complex parts based on a three-dimensional measurement point cloud. A six-degree-of-freedom industrial robot drives a raster area array scanner to perform localized, feature-oriented scanning of the workpiece under test and acquire point cloud data of its key regions. The point cloud data are transformed into the robot base coordinate system through the hand-eye calibration matrix to obtain a new point cloud, and the design model of the workpiece is then matched against the point cloud to obtain the transformation matrix from the design model to the point cloud data; this transformation matrix is the pose of the workpiece relative to the robot base coordinate system. The invention can estimate the pose of large, complex parts that cannot be touched, adapts to a variety of complex scenes, and has the advantages of convenient operation and strong applicability.
To achieve the above object, according to one aspect of the present invention, a system for pose estimation of complex parts based on a three-dimensional measurement point cloud is provided, comprising a six-degree-of-freedom industrial robot, a raster area array scanner, a workpiece under test, a marker-point support frame and a data processing host computer, wherein:
The end of the six-degree-of-freedom industrial robot clamps the raster area array scanner and drives it to move so as to scan and measure the workpiece placed on the marker-point support frame. During measurement, the blue light projected by the raster area array scanner covers the surface of the workpiece and the marker points on the support frame, and any two successive measurements share at least three common marker points that are not collinear. The six-degree-of-freedom industrial robot is connected to the data processing host computer, to which its motion parameters are transmitted in real time;
The raster area array scanner is mounted at the end of the six-degree-of-freedom industrial robot and scans the workpiece to acquire its three-dimensional point cloud data; the scanner is connected to the data processing host computer, to which the measurement data are transmitted in real time;
The data processing host computer receives the motion parameters of the six-degree-of-freedom industrial robot and the measurement data of the raster area array scanner and computes the pose of the workpiece in the robot base coordinate system, thereby completing the pose estimation of the complex part.
Further preferably, the positional relationship between the end flange of the six-degree-of-freedom industrial robot and the measurement coordinate system of the raster area array scanner is obtained by hand-eye calibration before measurement.
Further preferably, the spacing between adjacent marker points on the marker-point support frame is 3 cm to 4 cm.
Further preferably, the six-degree-of-freedom industrial robot is connected to the data processing host computer through a robot controller.
Further preferably, the raster area array scanner is connected to the robot controller to synchronize signal triggering and data acquisition.
According to another aspect of the present invention, a method for pose estimation of complex parts based on a three-dimensional measurement point cloud is provided, comprising the following steps:
S1: calibrate the transformation matrix T_ES between the end flange of the six-degree-of-freedom industrial robot and the measurement coordinate system of the raster area array scanner;
S2: drive the raster area array scanner with the six-degree-of-freedom industrial robot to scan the workpiece under test and acquire its three-dimensional point cloud data p_i (i = 1, 2, ..., s);
S3: transform the three-dimensional point cloud data of the workpiece into the robot base coordinate system to obtain the transformed point cloud data:
p_i^B = T_BE · T_ES · p_i,
where T_BE is the pose of the robot at the first measurement;
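A minimal sketch of this conversion under the notation above, using 4×4 homogeneous transforms; the function and variable names are illustrative assumptions, not part of the patent:

```python
import numpy as np

def to_base_frame(points_scanner, T_BE, T_ES):
    """points_scanner: (s, 3) array measured in the scanner frame.
    T_BE: 4x4 flange-in-base pose at the first measurement.
    T_ES: 4x4 hand-eye matrix (scanner frame expressed in the flange frame).
    Returns the (s, 3) point cloud expressed in the robot base frame."""
    pts_h = np.hstack([points_scanner, np.ones((len(points_scanner), 1))])
    return (T_BE @ T_ES @ pts_h.T).T[:, :3]
```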
S4: match the transformed three-dimensional point cloud data against the three-dimensional design model of the workpiece to obtain the pose of the workpiece relative to the robot base coordinate system, thereby completing the pose estimation of the complex part.
Further preferably, the following method is used in step S4 to obtain the pose of the workpiece relative to the robot base coordinate system:
First, compute the transformation matrix T_k from the point cloud to the coordinate system of the three-dimensional design model after each matching pass, the number of matching passes being n;
Then compute the pose of the workpiece in the robot base coordinate system from the transformation matrices T_k of the individual passes as (T_n · T_{n-1} ··· T_1)^(-1), which is the desired result.
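A small sketch of this composition (same notation; the reduce-based accumulation is an illustrative choice, not the patent's code):

```python
import numpy as np
from functools import reduce

def workpiece_pose(T_ks):
    """T_ks: list [T_1, ..., T_n] of 4x4 per-pass transforms (point cloud -> model frame).
    Returns the workpiece pose in the robot base frame, i.e. (T_n ... T_1)^-1."""
    T_total = reduce(lambda acc, T_k: T_k @ acc, T_ks, np.eye(4))  # T_n ... T_1
    return np.linalg.inv(T_total)
```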
In general, compared with the prior art, the above technical solutions conceived by the present invention mainly have the following technical advantages:
1. The invention overcomes the defect of traditional pose estimation methods that an operator must enter the measurement area. A six-degree-of-freedom industrial robot drives the scanner so that operation is remote: point cloud data are acquired and the workpiece pose is computed safely and reliably without endangering operators. The measurement range is wide, and multiple regions and the complete topography of the workpiece can be measured accurately.
2. The raster area array scanner used in the invention supports automatic stitching based on marker points, which avoids the point cloud stitching errors that the absolute positioning error of the robot would otherwise introduce and improves the accuracy of the measured point cloud. Marker-based stitching is realized by attaching marker points around the measured region of the workpiece and ensuring that at least three common marker points are observed from any two adjacent viewing angles during measurement, so that a complete measurement point cloud can be stitched together.
3. The invention solves the problem that accurate position parameters, especially the rotation matrix R, are difficult to obtain by manual measurement. By exploiting an accurate hand-eye matrix and a point cloud matching algorithm, the pose of the workpiece can be obtained with good accuracy.
4. The invention has a simple structure, solves well the problem of determining the pose of an arbitrarily placed workpiece relative to the robot base coordinate system, applies to any pose estimation task of the same type, and has strong generality.
Description of the Drawings
Fig. 1 is a schematic structural diagram of the system for pose estimation of complex parts based on a three-dimensional measurement point cloud provided by an embodiment of the present invention;
Fig. 2 is a schematic diagram of typical three-dimensional features of a workpiece;
Fig. 3 is a schematic diagram of the pose relationship between the scanner and the robot end flange and of the acquisition of the three-dimensional point cloud, in which O_B-XYZ is the robot base coordinate system {B}, O_E-XYZ is the robot end-flange coordinate system {E}, and O_S-XYZ is the scanner measurement coordinate system {S};
Fig. 4 is a schematic diagram of the point cloud matching process, in which xyz is the coordinate system of the three-dimensional measurement point cloud, x0y0z0 is the model coordinate system, and T is the transformation matrix from the coordinate system of the measurement point cloud to the model coordinate system.
Detailed Description of the Embodiments
In order to make the objectives, technical solutions and advantages of the present invention clearer, the present invention is further described in detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described here only serve to explain the present invention and are not intended to limit it. In addition, the technical features involved in the various embodiments of the present invention described below can be combined with each other as long as they do not conflict.
As shown in Fig. 1, an embodiment of the present invention provides a system for pose estimation of complex parts based on a three-dimensional measurement point cloud, which comprises a six-degree-of-freedom industrial robot 100, a workpiece under test 200, a raster area array scanner 300, a marker-point support frame 400 and a data processing host computer 500.
The end of the six-degree-of-freedom industrial robot 100 clamps the raster area array scanner 300. The robot moves along a defined trajectory within its workspace to drive the scanner, and the scanner scans and measures the workpiece 200 placed on the marker-point support frame 400. During measurement, the blue light projected by the raster area array scanner 300 must cover the surface of the workpiece and part of the marker points on the support frame, and any two successive measurements must share at least three common marker points that are not collinear, so that the point clouds can be stitched. The robot 100 is connected to the data processing host computer 500, to which it transmits its motion parameters (including position, attitude and motion state) in real time.
The raster area array scanner 300 is mounted at the end of the six-degree-of-freedom industrial robot 100. Driven by the robot, it scans and measures the region of interest of the workpiece and acquires the point cloud of the workpiece surface; it is also connected to the data processing host computer 500, to which the measured three-dimensional point cloud data are transmitted in real time. The scanner works by binocular stereo-vision three-dimensional measurement based on the triangulation principle: reference gratings with different phase shifts are projected onto the measured object, two cameras capture from two viewpoints the grating images deformed by the object surface, the phase value of each pixel is computed from the deformed gratings, and the three-dimensional point cloud coordinates of the object are computed from the phase values. Acquiring the three-dimensional point cloud of the workpiece with a raster area array scanner is prior art and is not described further here. Specifically, the scanner communicates with the host-computer data processing software through a switch 700 to acquire and transmit the point cloud data; examples of suitable scanners are the ATOS Compact Scan from Gom (Germany) and the PowerScan from Weijing Technology (China).
The data processing host computer 500 receives the motion parameters of the six-degree-of-freedom industrial robot 100 and the measurement data of the raster area array scanner 300 and computes the pose of the workpiece in the robot base coordinate system, thereby completing the pose estimation of the complex part. The data processing software on the host computer 500 provides point cloud acquisition, point cloud stitching, point cloud downsampling, point cloud coordinate conversion and point cloud-to-design-model matching; it converts the measured point cloud into the base coordinate system of the six-degree-of-freedom industrial robot and performs point cloud matching, and thus outputs the pose of the workpiece relative to the robot base coordinate system. That is, from the received robot motion parameters and scanner measurement data it computes the pose of the workpiece in the robot base coordinate system, which is the desired result.
Point cloud acquisition uses stereo-vision three-dimensional measurement to obtain the coordinates p_i (i = 1, 2, ..., r, ..., s) of points on the workpiece in the measurement coordinate system of the raster area array scanner. Point cloud stitching uses the marker points to merge the point clouds taken before and after each move, splicing the second, third and subsequent point clouds into the first one, so that the combined point cloud is expressed in the measurement coordinate system of the scanner at the robot's first measurement. Point cloud downsampling samples the large-scale point cloud initially measured by the robotic optical measurement system uniformly, reducing the data volume and thus improving the computational efficiency of data processing. Point cloud coordinate conversion uses the relative pose T_ES between the raster area array scanner and the robot end flange, the pose T_BE of the end flange relative to the robot base coordinate system at the first measurement point, and the measured point cloud p_i (i = 1, 2, ..., r, ..., s) to transform the cloud into the base coordinate system of the six-degree-of-freedom industrial robot through the formula p_i^B = T_BE · T_ES · p_i. Point cloud-to-design-model matching aligns the point cloud transformed into the robot base coordinate system with the design model in a common coordinate system to obtain a transformation matrix, and this transformation matrix is the pose of the workpiece in the robot base coordinate system.
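The stitching step can be illustrated with an SVD-based rigid alignment (Kabsch method) of the common markers seen in two adjacent scans; the patent does not specify which estimator is used, so this sketch and its names are assumptions:

```python
import numpy as np

def rigid_from_markers(m_new, m_ref):
    """m_new, m_ref: (k, 3) matched marker coordinates (k >= 3, not collinear)
    in the new scan and the reference scan. Returns a 4x4 transform T such
    that m_ref ~ T applied to m_new; applying T to the new scan merges it
    into the reference scan's coordinate system."""
    mu_n, mu_r = m_new.mean(axis=0), m_ref.mean(axis=0)
    H = (m_new - mu_n).T @ (m_ref - mu_r)      # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))      # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = mu_r - R @ mu_n
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, t
    return T
```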
The positional relationship between the end flange of the six-degree-of-freedom industrial robot 100 and the measurement coordinate system of the raster area array scanner 300 is obtained before the measurement by hand-eye calibration, which is prior art and is not detailed here. While the point cloud of the workpiece 200 is being acquired, the blue light of the scanner 300 covers the workpiece and part of the marker points, so that subsequent point clouds can be stitched.
Specifically, the six-degree-of-freedom industrial robot 100 is connected to the data processing host computer 500 through a robot controller 600, and the raster area array scanner 300 is also connected to the robot controller 600; under the control of the controller 600, the signal triggering of the host-computer software and the acquisition of the part point cloud data are synchronized.
The six-degree-of-freedom industrial robot 100 and the raster area array scanner 300 of the invention are connected to the same data processing host computer 500. During measurement, the pose parameters of the robot at the first measurement, including the six joint angles and the end position, are recorded, and the measurement data are synchronized to the host computer 500.
During assembly, the height of the marker-point support frame must not exceed the lowest height of the workpiece 200, so that it does not affect the point cloud of the workpiece or interfere with the subsequent point cloud matching; at the same time, the spacing between marker points must be kept at 3 cm to 4 cm, which ensures both that the point clouds of two successive measurements can be stitched and that a single measurement does not contain too many marker points.
In actual operation, it must first be ensured that the six-degree-of-freedom industrial robot, the area array scanner and the host-computer processing software are running. The actual measurement poses of the robot are controlled through the teach pendant, and the measurement points must satisfy the measurement requirements; for any two successive measurements, the two images must contain at least three marker points that are not collinear, so that the point clouds can be stitched. After the measurement point cloud data have been collected, the host-computer data processing software downsamples the measured point cloud to improve its quality and reduce the data volume. The hand-eye calibration parameters and the robot pose at the first measurement are then entered, the measured point cloud is transformed into the robot base coordinate system, and point cloud-to-design-model matching is performed to align the point cloud with the design model coordinate system, recording the transformation matrix T_k of each matching pass. After matching is complete, the pose (T_n · T_{n-1} ··· T_1)^(-1) is computed, which is the desired pose of the measured part relative to the coordinate system of the six-degree-of-freedom industrial robot.
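A minimal sketch of the downsampling step; the patent only states that the cloud is thinned by uniform sampling, so the use of Open3D and the voxel size below are illustrative assumptions rather than the patent's implementation:

```python
import numpy as np
import open3d as o3d

def downsample(points, voxel_size_mm=2.0):
    """points: (n, 3) array of measured coordinates (here assumed in mm).
    Returns a spatially thinned (m, 3) array, m <= n."""
    pcd = o3d.geometry.PointCloud()
    pcd.points = o3d.utility.Vector3dVector(points)
    return np.asarray(pcd.voxel_down_sample(voxel_size=voxel_size_mm).points)
```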
When the system of the invention is used for pose estimation of complex parts, the basic idea is to obtain the transformation matrix between the raster area array scanner and the six-degree-of-freedom industrial robot by hand-eye calibration, transform the measured workpiece point cloud into the base coordinate system of the six-degree-of-freedom industrial robot, and then match the point cloud against the three-dimensional design model of the workpiece by three-dimensional point cloud matching to obtain the transformation matrix. The method specifically comprises the following steps:
S1: calibrate the transformation matrix T_ES = [R, t; 0, 1] between the end flange of the six-degree-of-freedom industrial robot and the measurement coordinate system of the raster area array scanner, where R is the rotation matrix of the scanner measurement coordinate system relative to the end flange of the robot and t is the translation vector of the scanner measurement coordinate system relative to the end flange of the robot. The calibration is performed by hand-eye calibration, which is prior art and is not detailed here;
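Since the patent treats hand-eye calibration as prior art and does not prescribe a particular solver, the sketch below only illustrates one possible way to obtain T_ES, assuming OpenCV (>= 4.1) is available and that paired flange poses and scanner observations of a fixed calibration target have been collected; all names are illustrative:

```python
import cv2
import numpy as np

def hand_eye_matrix(R_flange2base, t_flange2base, R_target2scanner, t_target2scanner):
    """Each argument is a list of N rotation matrices / translation vectors,
    one pair per robot station. Returns the 4x4 matrix T_ES (scanner frame
    expressed in the end-flange frame)."""
    R_se, t_se = cv2.calibrateHandEye(R_flange2base, t_flange2base,
                                      R_target2scanner, t_target2scanner,
                                      method=cv2.CALIB_HAND_EYE_TSAI)
    T_ES = np.eye(4)
    T_ES[:3, :3], T_ES[:3, 3] = R_se, t_se.ravel()
    return T_ES
```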
S2: drive the raster area array scanner with the six-degree-of-freedom industrial robot to scan the workpiece and acquire its three-dimensional point cloud data p_i (i = 1, 2, ..., r, ..., s), i.e. acquire the three-dimensional point cloud data of the workpiece with the raster area array scanner, which is prior art and is not detailed here;
S3: transform the three-dimensional point cloud data of the workpiece into the robot base coordinate system to obtain the transformed point cloud data:
p_i^B = T_BE · T_ES · p_i,
where T_BE is the pose of the robot at the first measurement, i.e. the transformation from the robot end flange to the robot base coordinate system at the first measurement; it is computed from the motion parameters of the robot as the product of the successive link transforms, T_BE = T_01 · T_12 ··· T_56, where T_(j-1)j is the transformation matrix from robot link j-1 to link j;
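The patent only states that T_BE follows from the robot's motion parameters as a product of link transforms; the sketch below evaluates such a product under the standard Denavit-Hartenberg convention, with a purely illustrative helper and parameter table (a real system would use the manufacturer's kinematic model):

```python
import numpy as np

def link_transform(theta, d, a, alpha):
    """Standard DH transform from link j-1 to link j."""
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([
        [ct, -st * ca,  st * sa, a * ct],
        [st,  ct * ca, -ct * sa, a * st],
        [0.0,      sa,       ca,      d],
        [0.0,     0.0,      0.0,    1.0],
    ])

def flange_pose(joint_angles, dh_table):
    """joint_angles: six joint values (rad); dh_table: list of (d, a, alpha)
    per joint. Returns T_BE as the product of the six link transforms."""
    T = np.eye(4)
    for theta, (d, a, alpha) in zip(joint_angles, dh_table):
        T = T @ link_transform(theta, d, a, alpha)
    return T
```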
S4: match the transformed three-dimensional point cloud data against the three-dimensional design model of the workpiece to obtain the pose of the workpiece relative to the robot base coordinate system, thereby completing the pose estimation of the complex part. Specifically, the ICP matching algorithm computes the transformation matrix T_k from the point cloud to the coordinate system of the three-dimensional design model after each matching pass. With the number of matching passes set to n, the pose of the workpiece in the robot base coordinate system is then computed from the transformation matrices T_k of the individual passes as (T_n · T_{n-1} ··· T_1)^(-1), which is the desired result.
Let Q denote the point cloud of the three-dimensional design model of the workpiece, consisting of the points q_i (i = 1, 2, ..., a, ..., l), and let P denote the transformed three-dimensional point cloud data, consisting of the points p_i^B (i = 1, 2, ..., r, ..., s). T_k is solved as follows:
S41: for every point p_i^B in P, search Q for its nearest point q_i, and compute the centroids μ_P = (1/s) Σ_i p_i^B and μ_Q = (1/s) Σ_i q_i and the centred coordinates p_i' = p_i^B − μ_P, q_i' = q_i − μ_Q;
S42: compute the 3×3 covariance matrix H from the point sets P and Q:
H = Σ_i p_i' (q_i')^T,
where H_ij denotes the element in row i and column j of H;
S43: construct the 4×4 symmetric matrix W from H:
W = [ tr(H), Δ^T ; Δ, H + H^T − tr(H)·I_3 ], with Δ = [H_23 − H_32, H_31 − H_13, H_12 − H_21]^T and I_3 the 3×3 identity matrix;
S44: compute the eigenvalues of W and extract the unit eigenvector q = [q_0, q_1, q_2, q_3]^T corresponding to the largest eigenvalue; this quaternion determines the rotation matrix R, and the translation is
t = μ_Q − R · μ_P,
from which the transformation matrix of this matching pass is obtained:
T_k = [ R, t ; 0, 1 ].
After the transformation matrix T_k has been obtained, it is used to compute the matched positions p_i' = T_k × p_i^B of the points p_i^B (i = 1, 2, ..., r, ..., s); for the next matching pass p_i^B is updated to p_i', and steps S41 to S44 are repeated. Matching is carried out n times in this way to improve the matching accuracy, and the pose of the workpiece in the robot base coordinate system is then computed from the transformation matrices of the individual passes as (T_n · T_{n-1} ··· T_1)^(-1), where T_1 is the transformation matrix obtained in the first matching pass, T_2 the one obtained in the second pass, and so on up to T_n for the n-th pass.
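Steps S41 to S44 and the n-fold iteration can be sketched in NumPy as below. The k-d-tree nearest-neighbour search, the fixed iteration count and all names are implementation choices of this illustration, not requirements of the patent:

```python
import numpy as np
from scipy.spatial import cKDTree

def quat_to_rot(q):
    """Unit quaternion (q0, q1, q2, q3) = (w, x, y, z) to 3x3 rotation matrix."""
    w, x, y, z = q
    return np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)],
    ])

def icp_total_transform(src, dst, n_passes=30):
    """Align src (measured cloud in the base frame, s x 3) to dst (design-model
    points, l x 3). Returns the accumulated 4x4 transform T_n ... T_1."""
    tree = cKDTree(dst)
    T_total = np.eye(4)
    cur = src.copy()
    for _ in range(n_passes):
        # S41: nearest-point correspondences, centroids, centred coordinates
        _, idx = tree.query(cur)
        q_match = dst[idx]
        mu_p, mu_q = cur.mean(axis=0), q_match.mean(axis=0)
        P, Q = cur - mu_p, q_match - mu_q
        # S42: 3x3 covariance matrix H = sum_i p_i' q_i'^T
        H = P.T @ Q
        # S43: 4x4 symmetric matrix W built from H
        delta = np.array([H[1, 2] - H[2, 1], H[2, 0] - H[0, 2], H[0, 1] - H[1, 0]])
        W = np.zeros((4, 4))
        W[0, 0] = np.trace(H)
        W[0, 1:] = delta
        W[1:, 0] = delta
        W[1:, 1:] = H + H.T - np.trace(H) * np.eye(3)
        # S44: rotation from the eigenvector of the largest eigenvalue,
        # translation t = mu_Q - R mu_P
        vals, vecs = np.linalg.eigh(W)
        R = quat_to_rot(vecs[:, np.argmax(vals)])
        t = mu_q - R @ mu_p
        T_k = np.eye(4)
        T_k[:3, :3], T_k[:3, 3] = R, t
        cur = (R @ cur.T).T + t        # p_i' = T_k x p_i, used in the next pass
        T_total = T_k @ T_total        # accumulate T_n ... T_1
    return T_total
```

The workpiece pose in the robot base coordinate system then follows as np.linalg.inv(icp_total_transform(P, Q)), matching the composition described above.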
The method of the invention can, for example, estimate the pose of a sealing-face flange sample of a certain model: after a series of coordinate transformations, the measured three-dimensional point cloud data neatly yield the pose of the workpiece in the robot base coordinate system, after which the subsequent path planning and simulation of the robot can be carried out.
Those skilled in the art will readily understand that the above are only preferred embodiments of the present invention and are not intended to limit it; any modifications, equivalent replacements, improvements and the like made within the spirit and principles of the present invention shall all be included within the scope of protection of the present invention.
Claims (7)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201811428771.5A CN109373898B (en) | 2018-11-27 | 2018-11-27 | A system and method for pose estimation of complex parts based on 3D measurement point cloud |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201811428771.5A CN109373898B (en) | 2018-11-27 | 2018-11-27 | A system and method for pose estimation of complex parts based on 3D measurement point cloud |
Publications (2)
Publication Number | Publication Date |
---|---|
CN109373898A true CN109373898A (en) | 2019-02-22 |
CN109373898B CN109373898B (en) | 2020-07-10 |
Family
ID=65377364
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201811428771.5A Active CN109373898B (en) | 2018-11-27 | 2018-11-27 | A system and method for pose estimation of complex parts based on 3D measurement point cloud |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN109373898B (en) |
Cited By (25)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110310331A (en) * | 2019-06-18 | 2019-10-08 | 哈尔滨工程大学 | A Pose Estimation Method Based on the Combination of Line Features and Point Cloud Features |
CN110434671A (en) * | 2019-07-25 | 2019-11-12 | 王东 | A kind of cast member surface machining track calibration method based on pattern measurement |
CN110434679A (en) * | 2019-07-25 | 2019-11-12 | 王东 | A kind of Intelligent Machining method for the workpiece with random size error |
CN110553584A (en) * | 2019-08-30 | 2019-12-10 | 长春理工大学 | Measuring tool, automatic measuring system and measuring method for small-sized complex parts |
CN110634161A (en) * | 2019-08-30 | 2019-12-31 | 哈尔滨工业大学(深圳) | A fast and high-precision estimation method and device for workpiece pose based on point cloud data |
CN110634185A (en) * | 2019-07-31 | 2019-12-31 | 众宏(上海)自动化股份有限公司 | Visual algorithm for quickly forming point cloud for gear repair |
CN110640585A (en) * | 2019-10-25 | 2020-01-03 | 华中科技大学 | A three-dimensional non-contact measuring device and method for blade grinding and polishing |
CN111043963A (en) * | 2019-12-31 | 2020-04-21 | 芜湖哈特机器人产业技术研究院有限公司 | Three-dimensional scanning system measuring method of carriage container based on two-dimensional laser radar |
CN111551111A (en) * | 2020-05-13 | 2020-08-18 | 华中科技大学 | A fast visual positioning method of part feature robot based on standard spherical array |
CN111929300A (en) * | 2020-08-11 | 2020-11-13 | 广西机械工业研究院有限责任公司 | Automatic detection line of three-dimensional image scanning robot |
CN112161619A (en) * | 2020-09-16 | 2021-01-01 | 杭州思锐迪科技有限公司 | Pose detection method, three-dimensional scanning path planning method and detection system |
CN112307562A (en) * | 2020-10-30 | 2021-02-02 | 泉州装备制造研究所 | Assembly method of complex parts on large aircraft with integrated thermal deformation and gravity deformation |
CN112577447A (en) * | 2020-12-07 | 2021-03-30 | 新拓三维技术(深圳)有限公司 | Three-dimensional full-automatic scanning system and method |
CN112828552A (en) * | 2021-01-29 | 2021-05-25 | 华中科技大学 | A method and system for intelligent docking of flange parts |
CN112828878A (en) * | 2019-11-22 | 2021-05-25 | 中国科学院沈阳自动化研究所 | A three-dimensional measurement and tracking method for large-scale equipment docking process |
CN113386136A (en) * | 2021-06-30 | 2021-09-14 | 华中科技大学 | Robot posture correction method and system based on standard spherical array target estimation |
CN113894785A (en) * | 2021-10-27 | 2022-01-07 | 华中科技大学无锡研究院 | Control method, device and system for in-situ measurement and machining of turbine blades |
CN114036666A (en) * | 2021-11-04 | 2022-02-11 | 山西汾西重工有限责任公司 | Method for predicting wall thickness deviation of casting part |
CN114279326A (en) * | 2021-12-22 | 2022-04-05 | 易思维(天津)科技有限公司 | Global positioning method of three-dimensional scanning equipment |
CN114417616A (en) * | 2022-01-20 | 2022-04-29 | 青岛理工大学 | Digital twin modeling method and system for assembly robot teleoperation environment |
CN114459377A (en) * | 2022-02-10 | 2022-05-10 | 中国航发沈阳发动机研究所 | Device and method for measuring blade profile of aircraft engine |
CN114485488A (en) * | 2021-07-13 | 2022-05-13 | 北京航天计量测试技术研究所 | Automatic measuring system and measuring method for exhaust area of turbine guider |
CN114742883A (en) * | 2022-03-30 | 2022-07-12 | 华中科技大学 | An automated assembly method and system based on a planar workpiece positioning algorithm |
CN115049730A (en) * | 2022-05-31 | 2022-09-13 | 北京有竹居网络技术有限公司 | Part assembling method, part assembling device, electronic device and storage medium |
US11644296B1 (en) | 2021-12-17 | 2023-05-09 | Industrial Technology Research Institute | 3D measuring equipment and 3D measuring method |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101097131A (en) * | 2006-06-30 | 2008-01-02 | 廊坊智通机器人系统有限公司 | Method for marking workpieces coordinate system |
US20090234502A1 (en) * | 2008-03-12 | 2009-09-17 | Denso Wave Incorporated | Apparatus for determining pickup pose of robot arm with camera |
CN101566461A (en) * | 2009-05-18 | 2009-10-28 | 西安交通大学 | Method for quickly measuring blade of large-sized water turbine |
JP4649554B1 (en) * | 2010-02-26 | 2011-03-09 | 株式会社三次元メディア | Robot controller |
CN106370106A (en) * | 2016-09-30 | 2017-02-01 | 上海航天精密机械研究所 | Industrial robot and linear guide rail-combined linear laser scanning measurement system and method |
CN106959080A (en) * | 2017-04-10 | 2017-07-18 | 上海交通大学 | A kind of large complicated carved components three-dimensional pattern optical measuring system and method |
CN107270833A (en) * | 2017-08-09 | 2017-10-20 | 武汉智诺维科技有限公司 | A kind of complex curved surface parts three-dimension measuring system and method |
- 2018-11-27: Application CN201811428771.5A filed in China; granted as CN109373898B (status: active)
Patent Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101097131A (en) * | 2006-06-30 | 2008-01-02 | 廊坊智通机器人系统有限公司 | Method for marking workpieces coordinate system |
US20090234502A1 (en) * | 2008-03-12 | 2009-09-17 | Denso Wave Incorporated | Apparatus for determining pickup pose of robot arm with camera |
CN101566461A (en) * | 2009-05-18 | 2009-10-28 | 西安交通大学 | Method for quickly measuring blade of large-sized water turbine |
JP4649554B1 (en) * | 2010-02-26 | 2011-03-09 | 株式会社三次元メディア | Robot controller |
CN106370106A (en) * | 2016-09-30 | 2017-02-01 | 上海航天精密机械研究所 | Industrial robot and linear guide rail-combined linear laser scanning measurement system and method |
CN106959080A (en) * | 2017-04-10 | 2017-07-18 | 上海交通大学 | A kind of large complicated carved components three-dimensional pattern optical measuring system and method |
CN107270833A (en) * | 2017-08-09 | 2017-10-20 | 武汉智诺维科技有限公司 | A kind of complex curved surface parts three-dimension measuring system and method |
Non-Patent Citations (2)
Title |
---|
- Yang Guang et al.: "Automated three-dimensional precision measurement technology and its application in the field of forging", Forging & Stamping Technology (《锻压技术》) *
- Yang Shourui: "Automated measurement methods and technology for complex curved surfaces of large components", China Doctoral Dissertations Full-text Database, Information Science and Technology volume (《中国博士学位论文全文数据库 信息科技辑》) *
Cited By (38)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110310331A (en) * | 2019-06-18 | 2019-10-08 | 哈尔滨工程大学 | A Pose Estimation Method Based on the Combination of Line Features and Point Cloud Features |
CN110434671A (en) * | 2019-07-25 | 2019-11-12 | 王东 | A kind of cast member surface machining track calibration method based on pattern measurement |
CN110434679A (en) * | 2019-07-25 | 2019-11-12 | 王东 | A kind of Intelligent Machining method for the workpiece with random size error |
CN110634185A (en) * | 2019-07-31 | 2019-12-31 | 众宏(上海)自动化股份有限公司 | Visual algorithm for quickly forming point cloud for gear repair |
CN110553584A (en) * | 2019-08-30 | 2019-12-10 | 长春理工大学 | Measuring tool, automatic measuring system and measuring method for small-sized complex parts |
CN110634161A (en) * | 2019-08-30 | 2019-12-31 | 哈尔滨工业大学(深圳) | A fast and high-precision estimation method and device for workpiece pose based on point cloud data |
CN110640585A (en) * | 2019-10-25 | 2020-01-03 | 华中科技大学 | A three-dimensional non-contact measuring device and method for blade grinding and polishing |
CN112828878B (en) * | 2019-11-22 | 2022-10-25 | 中国科学院沈阳自动化研究所 | A three-dimensional measurement and tracking method for large-scale equipment docking process |
CN112828878A (en) * | 2019-11-22 | 2021-05-25 | 中国科学院沈阳自动化研究所 | A three-dimensional measurement and tracking method for large-scale equipment docking process |
CN111043963A (en) * | 2019-12-31 | 2020-04-21 | 芜湖哈特机器人产业技术研究院有限公司 | Three-dimensional scanning system measuring method of carriage container based on two-dimensional laser radar |
CN111551111A (en) * | 2020-05-13 | 2020-08-18 | 华中科技大学 | A fast visual positioning method of part feature robot based on standard spherical array |
CN111929300A (en) * | 2020-08-11 | 2020-11-13 | 广西机械工业研究院有限责任公司 | Automatic detection line of three-dimensional image scanning robot |
CN112161619A (en) * | 2020-09-16 | 2021-01-01 | 杭州思锐迪科技有限公司 | Pose detection method, three-dimensional scanning path planning method and detection system |
CN112161619B (en) * | 2020-09-16 | 2022-11-15 | 思看科技(杭州)股份有限公司 | Pose detection method, three-dimensional scanning path planning method and detection system |
CN112307562B (en) * | 2020-10-30 | 2022-03-01 | 泉州装备制造研究所 | Method for assembling complex parts on large-scale airplane by combining thermal deformation and gravity deformation |
CN112307562A (en) * | 2020-10-30 | 2021-02-02 | 泉州装备制造研究所 | Assembly method of complex parts on large aircraft with integrated thermal deformation and gravity deformation |
CN112577447A (en) * | 2020-12-07 | 2021-03-30 | 新拓三维技术(深圳)有限公司 | Three-dimensional full-automatic scanning system and method |
CN112577447B (en) * | 2020-12-07 | 2022-03-22 | 新拓三维技术(深圳)有限公司 | Three-dimensional full-automatic scanning system and method |
CN112828552A (en) * | 2021-01-29 | 2021-05-25 | 华中科技大学 | A method and system for intelligent docking of flange parts |
CN112828552B (en) * | 2021-01-29 | 2022-05-20 | 华中科技大学 | Intelligent butt joint method and system for flange parts |
CN113386136A (en) * | 2021-06-30 | 2021-09-14 | 华中科技大学 | Robot posture correction method and system based on standard spherical array target estimation |
CN113386136B (en) * | 2021-06-30 | 2022-05-20 | 华中科技大学 | A Robot Pose Correction Method and System Based on Standard Spherical Array Target Estimation |
CN114485488B (en) * | 2021-07-13 | 2024-04-09 | 北京航天计量测试技术研究所 | Automatic measurement system and measurement method for exhaust area of turbine guider |
CN114485488A (en) * | 2021-07-13 | 2022-05-13 | 北京航天计量测试技术研究所 | Automatic measuring system and measuring method for exhaust area of turbine guider |
CN113894785A (en) * | 2021-10-27 | 2022-01-07 | 华中科技大学无锡研究院 | Control method, device and system for in-situ measurement and machining of turbine blades |
CN113894785B (en) * | 2021-10-27 | 2023-06-09 | 华中科技大学无锡研究院 | Control method, device and system for in-situ measurement and processing of water turbine blades |
CN114036666A (en) * | 2021-11-04 | 2022-02-11 | 山西汾西重工有限责任公司 | Method for predicting wall thickness deviation of casting part |
US11644296B1 (en) | 2021-12-17 | 2023-05-09 | Industrial Technology Research Institute | 3D measuring equipment and 3D measuring method |
TWI806294B (en) * | 2021-12-17 | 2023-06-21 | 財團法人工業技術研究院 | 3d measuring equipment and 3d measuring method |
CN114279326A (en) * | 2021-12-22 | 2022-04-05 | 易思维(天津)科技有限公司 | Global positioning method of three-dimensional scanning equipment |
CN114279326B (en) * | 2021-12-22 | 2024-05-28 | 易思维(天津)科技有限公司 | Global positioning method of three-dimensional scanning equipment |
CN114417616A (en) * | 2022-01-20 | 2022-04-29 | 青岛理工大学 | Digital twin modeling method and system for assembly robot teleoperation environment |
CN114417616B (en) * | 2022-01-20 | 2024-11-05 | 青岛理工大学 | A digital twin modeling method and system for assembly robot teleoperation environment |
CN114459377A (en) * | 2022-02-10 | 2022-05-10 | 中国航发沈阳发动机研究所 | Device and method for measuring blade profile of aircraft engine |
CN114742883A (en) * | 2022-03-30 | 2022-07-12 | 华中科技大学 | An automated assembly method and system based on a planar workpiece positioning algorithm |
CN114742883B (en) * | 2022-03-30 | 2024-09-24 | 华中科技大学 | An automated assembly method and system based on planar workpiece positioning algorithm |
CN115049730A (en) * | 2022-05-31 | 2022-09-13 | 北京有竹居网络技术有限公司 | Part assembling method, part assembling device, electronic device and storage medium |
CN115049730B (en) * | 2022-05-31 | 2024-04-26 | 北京有竹居网络技术有限公司 | Component mounting method, component mounting device, electronic apparatus, and storage medium |
Also Published As
Publication number | Publication date |
---|---|
CN109373898B (en) | 2020-07-10 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN109373898B (en) | A system and method for pose estimation of complex parts based on 3D measurement point cloud | |
CN109990701B (en) | A mobile measurement system and method for a large-scale complex surface three-dimensional topography robot | |
Koide et al. | General hand–eye calibration based on reprojection error minimization | |
CN105066884B (en) | A kind of robot end's deviations bearing calibration and system | |
US6816755B2 (en) | Method and apparatus for single camera 3D vision guided robotics | |
CN102155923B (en) | Splicing measuring method and system based on three-dimensional target | |
US8520067B2 (en) | Method for calibrating a measuring system | |
JP4021413B2 (en) | Measuring device | |
JP5922572B2 (en) | Practical 3D vision system apparatus and method | |
US11403780B2 (en) | Camera calibration device and camera calibration method | |
CN110555889A (en) | CALTag and point cloud information-based depth camera hand-eye calibration method | |
JP6324025B2 (en) | Information processing apparatus and information processing method | |
WO2018043525A1 (en) | Robot system, robot system control device, and robot system control method | |
JP2005515910A (en) | Method and apparatus for single camera 3D vision guide robotics | |
CN106056587A (en) | Full-view linear laser scanning 3D imaging calibration device and full-view linear laser scanning 3D imaging calibration method | |
CN111127568A (en) | A camera pose calibration method based on spatial point information | |
JP6855491B2 (en) | Robot system, robot system control device, and robot system control method | |
JP2010172986A (en) | Robot vision system and automatic calibration method | |
JPH10253322A (en) | Method and apparatus for designating position of object in space | |
KR101972432B1 (en) | A laser-vision sensor and calibration method thereof | |
CN113890955A (en) | Scanning method, device and system of multiple sets of photographing scanners | |
CN108180834A (en) | A kind of industrial robot is the same as three-dimensional imaging instrument position orientation relation scene real-time calibration method | |
CN113281723A (en) | Calibration method for structural parameters between 3D laser radar and camera based on AR tag | |
CN106323286A (en) | Transforming method of robot coordinate system and three-dimensional measurement coordinate system | |
CN115284292A (en) | Mechanical arm hand-eye calibration method and device based on laser camera |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
GR01 | Patent grant | ||
EE01 | Entry into force of recordation of patent licensing contract |
Application publication date: 20190222 Assignee: WUHAN POWER3D TECHNOLOGY Ltd. Assignor: HUAZHONG University OF SCIENCE AND TECHNOLOGY Contract record no.: X2022420000110 Denomination of invention: A pose estimation system and method for complex parts based on 3d measurement point cloud Granted publication date: 20200710 License type: Common License Record date: 20220930 |
EE01 | Entry into force of recordation of patent licensing contract |