CN116758160A - Pose detection method and assembly method for an optical element assembly process based on an orthogonal vision system

Publication number: CN116758160A
Authority: CN (China)
Prior art keywords: optical element, coordinate system, camera, global, cameras
Legal status: Granted; currently active
Application number: CN202310735104.6A
Other languages: Chinese (zh)
Other versions: CN116758160B
Inventors: 陈冠华, 刘国栋, 陈凤东
Current and original assignee: Harbin Institute of Technology Shenzhen
Application filed by Harbin Institute of Technology Shenzhen
Priority to CN202310735104.6A
Publication of CN116758160A; application granted; publication of CN116758160B

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/70 Determining position or orientation of objects or cameras
    • G06T 7/73 Determining position or orientation of objects or cameras using feature-based methods
    • G06T 7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • Y02P 90/30 Computing systems specially adapted for manufacturing (enabling technologies with a potential contribution to greenhouse gas emissions mitigation)

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

A pose detection method and an assembly method for an optical element assembly process, based on an orthogonal vision system, belonging to the field of machine vision inspection for industrial assembly. The invention addresses the problem that, in existing optical element assembly, the edge of the optical element tends to collide with the assembly frame, causing assembly errors. The method comprises: S1, constructing the vision inspection system, which consists of one global camera and two side-view cameras; S2, jointly calibrating the three cameras of the vision inspection system, with the camera coordinate system of the global camera serving as the unified global coordinate system; S3, solving the pose of the optical element: the three cameras synchronously capture a top view and two side views of the optical element, analytical expressions of the element's edges are extracted from the three cameras' images in their respective pixel coordinate systems, and, using the joint calibration data, the edges from the different pixel coordinate systems are aligned into the unified global coordinate system; the pose of the optical element is determined from the analytical expressions of these edges.

Description

Pose detection method and assembly method for an optical element assembly process based on an orthogonal vision system

Technical field

The invention relates to the field of machine vision inspection for industrial assembly, and specifically to a method for solving the six-degree-of-freedom pose of optical elements during automated industrial assembly of optical elements.

Background

During automated assembly of optical elements, a manipulator grips the optical element and places it into a mechanical frame. The process demands high accuracy in detecting the relative positions and attitudes between the optical element, the manipulator, and the assembly frame. Existing direct visual inspection methods locate transparent optical elements unreliably and with limited accuracy, and both the optical elements and the assembly frames carry non-negligible manufacturing tolerances. As a result, during optical element assembly on the FOA (final optics assembly) platform, the edge of the optical element tends to collide with the assembly frame, causing assembly errors.

Summary of the invention

To address the problem that, in existing optical element assembly, the edge of the optical element tends to collide with the assembly frame and cause assembly errors, the present invention provides a visual inspection method and an assembly method for optical element assembly based on an orthogonal vision inspection system. Cameras deployed at orthogonal positions detect the edge positions of the optical element; combined with the joint calibration data, the method solves for part of the element's geometric dimensions and for its spatial position and attitude, achieving reliable and accurate localization of the optical element and guiding the manipulator in adjusting its pose.

The pose detection method for the optical element assembly process based on an orthogonal vision system according to the present invention comprises the following steps:

S1. Constructing the vision inspection system.

The vision inspection system comprises one global camera and two side-view cameras; strip light sources placed in front of the two side-view cameras provide grazing illumination of the optical element.

S2. Jointly calibrating the three cameras of the vision inspection system.

The camera coordinate system of the global camera serves as the unified global coordinate system. The intrinsic and extrinsic parameters of the three cameras are calibrated, establishing the spatial mapping from each camera's pixel coordinate system to the unified global coordinate system.

S3. Solving the pose of the optical element.

The three cameras synchronously capture a top view and two side views of the optical element. Analytical expressions of the element's edges are extracted from each camera's image in its own pixel coordinate system; using the joint calibration data, the edges from the different pixel coordinate systems are aligned into the unified global coordinate system, and the pose of the optical element is determined from the analytical expressions of these edges.

Preferably, the optical element to be inspected is placed horizontally on the FOA assembly platform, and the global camera is mounted above the platform; the two cameras located to the sides of the optical element during inspection are side-view camera No. 1 and side-view camera No. 2.

The attitudes of the three cameras are mutually orthogonal, and their optical axes intersect at a point on the upper surface of the optical element. The following coordinate systems are established:

The global coordinate system of the global camera is O_G-X_GY_GZ_G; the optical axis of the global camera is perpendicular to the horizontal plane.

The coordinate system of side-view camera No. 1 is O_C1-X_C1Y_C1Z_C1, and that of side-view camera No. 2 is O_C2-X_C2Y_C2Z_C2; the optical axes of both cameras are parallel to the horizontal plane.

The positive X axis of the global camera coordinate system O_G points in the same direction as the positive Z_C2 axis of the side-view camera No. 2 coordinate system O_C2.

The model coordinate system O_M of the optical element is O_M-X_MY_MZ_M, with its origin at vertex A, the upper-left corner of the element's upper surface. The Z_M axis is perpendicular to the upper surface with its positive direction pointing upward, and the positive X_M axis points from the upper-left vertex toward the lower-left vertex of the upper surface.

Preferably, the joint calibration of step S2 proceeds as follows:

First, Zhang Zhengyou's calibration method is used to calibrate the intrinsic matrix of each of the three cameras, establishing the mapping from each two-dimensional pixel coordinate system to the corresponding three-dimensional camera coordinate system.
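
For illustration, this intrinsic calibration step might look like the sketch below, which uses OpenCV's implementation of Zhang's method; the checkerboard geometry, square pitch, and image folder are assumptions made for the example, not values from the patent.

```python
import cv2
import numpy as np
import glob

# Assumed checkerboard: 9x6 inner corners, 20 mm squares (not specified in the patent).
PATTERN = (9, 6)
SQUARE_MM = 20.0

# Object points of the board in its own plane (Z = 0).
objp = np.zeros((PATTERN[0] * PATTERN[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:PATTERN[0], 0:PATTERN[1]].T.reshape(-1, 2) * SQUARE_MM

obj_pts, img_pts = [], []
for path in glob.glob("calib_cam1/*.png"):  # hypothetical image folder
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, PATTERN)
    if not found:
        continue
    corners = cv2.cornerSubPix(
        gray, corners, (11, 11), (-1, -1),
        (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3))
    obj_pts.append(objp)
    img_pts.append(corners)

# K is the 3x3 intrinsic matrix; dist holds the lens distortion coefficients.
rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
    obj_pts, img_pts, gray.shape[::-1], None, None)
print("RMS reprojection error:", rms)
```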

Then, with the camera coordinate system of the global camera as the unified global coordinate system, the coordinate transformations from the camera coordinate systems of the two side-view cameras to the global coordinate system are calibrated separately, yielding the extrinsic matrices of the two side-view cameras.

The extrinsic matrices comprise: the transformation T_C1 = (R_C1, t_C1) from the side-view camera No. 1 coordinate system O_C1 to the global coordinate system O_G, where R_C1 is the rotation matrix and t_C1 the translation vector from O_C1 to O_G;

and the transformation T_C2 = (R_C2, t_C2) from the side-view camera No. 2 coordinate system O_C2 to the global coordinate system O_G, where R_C2 is the rotation matrix and t_C2 the translation vector from O_C2 to O_G.

Preferably, a stereo calibration block combined with the AprilTag algorithm is used to calibrate the extrinsic matrices of the two side-view cameras. The stereo calibration block is a cuboid; its four side faces and top face are defined in order as surface 0 through surface 4, which carry 36h11 AprilTag patterns with ids 0 through 4 respectively. The origin of the block's coordinate system O_B is defined at the center of the AprilTag pattern (id = 0) on surface 0; the X and Y axes are parallel to that surface, with positive directions running from the pattern's upper-left vertex to its upper-right vertex and from its upper-left vertex to its lower-left vertex respectively; the Z axis points into the block.

The procedure for calibrating the extrinsic matrices with the AprilTag algorithm is as follows:

With surface 4 of the calibration block kept horizontal, one of surfaces 0-3 is chosen arbitrarily. The block is positioned so that the AprilTag pattern on the chosen surface lies 280 mm from the optical center of side-view camera No. 1 along its optical axis and the pattern on surface 4 lies 1200 mm from the optical center of the global camera along its optical axis, with the corresponding AprilTag pattern appearing in full in each camera's field of view.

Once the calibration block, the global camera, and side-view camera No. 1 satisfy these requirements, an external trigger drives the two cameras to capture images within a time difference of no more than 15 ms, and the AprilTag algorithm computes the pose of each calibration pattern in the field of view relative to its camera. The transformations between the coordinate systems of all patterns on the block and the block coordinate system O_B are known from the CAD data, so the pose of the block relative to the global camera and relative to side-view camera No. 1 at the capture instant can be derived. Solving for the transformation between these two poses gives T_C1 = (R_C1, t_C1), the transformation from O_C1 to O_G, i.e. the extrinsic matrix of side-view camera No. 1 in O_G.

The extrinsic matrix of side-view camera No. 2 in O_G is obtained by the same procedure.
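
A minimal sketch of this extrinsic chaining, using the `pupil_apriltags` detector; the tag size, the camera intrinsics, the input images, and the CAD transforms from each tag frame to O_B are placeholders, not values from the patent.

```python
import numpy as np
from pupil_apriltags import Detector

detector = Detector(families="tag36h11")
TAG_SIZE_M = 0.05  # AprilTag edge length in meters (placeholder, not from the patent)

def to_homogeneous(R, t):
    """Pack a rotation matrix and translation vector into a 4x4 transform."""
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, np.ravel(t)
    return T

def tag_pose(gray, cam_params, want_id):
    """4x4 pose of tag `want_id` in this camera's frame; cam_params = (fx, fy, cx, cy)."""
    for det in detector.detect(gray, estimate_tag_pose=True,
                               camera_params=cam_params, tag_size=TAG_SIZE_M):
        if det.tag_id == want_id:
            return to_homogeneous(det.pose_R, det.pose_t)
    raise RuntimeError("tag %d not visible" % want_id)

def side_camera_extrinsics(img_global, params_global, img_side, params_side,
                           T_tag4_to_B, T_side_tag_to_B, side_tag_id):
    """T_C1 = (R_C1, t_C1): transform from the side camera frame to O_G."""
    T_G_B = tag_pose(img_global, params_global, want_id=4) @ T_tag4_to_B
    T_C1_B = tag_pose(img_side, params_side, want_id=side_tag_id) @ T_side_tag_to_B
    return T_G_B @ np.linalg.inv(T_C1_B)   # chain through the block frame O_B
```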

Preferably, the pose of the optical element in step S3 is solved as follows:

S31. A fixture places the optical element at the predetermined position on the FOA platform; the three cameras synchronously capture a top-view image I_M and two side-view images I_SL and I_SF of the element, and the three images are preprocessed into grayscale images.

S32. The LSD algorithm (Line Segment Detector) extracts line segments from the three grayscale images; these segments are processed to obtain the element's edges in each camera's pixel coordinate system.

S33. Analytical expressions of the element's edges are obtained in the three camera coordinate systems. Specifically, based on the matching result of step S32, when an edge in the top-view image I_M and an edge in side-view image I_SL or I_SF both correspond to the same edge of the CAD model, the edges in those two images are taken to correspond to the same physical edge of the optical element. Following this principle, the analytical expressions of the edges in the three grayscale images are obtained in the respective camera coordinate systems.

S34. From the camera intrinsic matrices, the extrinsic matrices, and the analytical expressions in the three camera coordinate systems, the analytical expressions of the element's edges in the global coordinate system are obtained.

S35. From the analytical expressions of the element's edges in the global coordinate system, the PnP algorithm solves the established correspondences for the position and attitude of the optical element and the dimensions of its upper surface.

Preferably, the edges of the optical element in step S32 are obtained as follows:

The LSD algorithm extracts all line segments from the three grayscale images, which are then processed as follows:

Step 1. Discard over-short segments with L < 25 pixels, where the segment length L is

L = sqrt((x_2 - x_1)^2 + (y_2 - y_1)^2)

and (x_1, y_1), (x_2, y_2) are the pixel coordinates of the segment's two endpoints.

Step 2. Denote the set of segments not yet fused by l_Ori and the set of fused segments by l_meg; the segments remaining after step 1 are first placed in l_Ori.

Step 3. Fuse segments:

Search l_Ori for the segment with the largest L, denoted l_0. Then search l_Ori for segments satisfying both of the following conditions: the Euclidean distance dis from the segment's midpoint to the line containing l_0 is less than 3 pixels, and the angle ang between the segment and l_0 is less than 0.5°. Remove l_0 and the qualifying segments from l_Ori.

The Euclidean distance dis and the included angle ang between two segments are computed as

dis = |k·x - y + (b_1 - k·a_1)| / sqrt(k^2 + 1)

ang = arccos( |Δx·Δa + Δy·Δb| / (sqrt(Δx^2 + Δy^2) · sqrt(Δa^2 + Δb^2)) )

where:

(a_1, b_1), (a_2, b_2) are the endpoint coordinates of l_0,

k = Δb/Δa is the slope of l_0,

x = (x_1 + x_2)/2

y = (y_1 + y_2)/2

Δx = x_2 - x_1

Δy = y_2 - y_1

Δa = a_2 - a_1

Δb = b_2 - b_1

Step 4. For l_0 and the qualifying segments, the SMBR algorithm computes the minimum bounding rectangle of these segments' endpoints; the pixel coordinates of the midpoints of the rectangle's two short sides become the two endpoints of the new fused segment, which is placed in the fused set l_meg.

Repeat steps 3 and 4 until l_Ori is empty, then proceed to step 5.

Step 5. Delete erroneous segments from l_meg:

First, delete segments with L < 1200 pixels.

Second, prior knowledge from the element's CAD model is used to identify the segments that belong to the element's edges. Specifically, the optical element held by the fixture is returned to the predetermined position, and the element's CAD model in that pose is projected into the captured image. Since valid segments come from the element's edges, a segment representing an edge is selected by the following conditions: the Euclidean distance dis from the segment's midpoint to some model edge segment extracted from the CAD projection is less than 15 pixels, and the included angle ang is less than 2°. Segments in l_meg that fail these conditions are deleted, and the remaining segments in l_meg are taken as the element's edges.
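
A compact sketch of steps 1 through 5, assuming OpenCV's LSD implementation and treating `cv2.minAreaRect` as the SMBR step; the list of CAD-projected edge segments is a placeholder input.

```python
import cv2
import numpy as np

def seg_length(s):
    return np.hypot(s[2] - s[0], s[3] - s[1])

def midpoint(s):
    return np.array([(s[0] + s[2]) / 2.0, (s[1] + s[3]) / 2.0])

def angle_deg(s, t):
    u = np.array([s[2] - s[0], s[3] - s[1]], float)
    v = np.array([t[2] - t[0], t[3] - t[1]], float)
    c = abs(u @ v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return np.degrees(np.arccos(np.clip(c, -1.0, 1.0)))

def point_line_dist(p, s):
    a = np.array(s[:2], float)
    d = np.array(s[2:], float) - a
    return abs(d[0] * (p[1] - a[1]) - d[1] * (p[0] - a[0])) / np.linalg.norm(d)

def extract_edges(gray, cad_edges):
    """Steps 1-5: extract, fuse, and filter LSD segments (thresholds from the patent)."""
    lsd = cv2.createLineSegmentDetector()
    segs = lsd.detect(gray)[0].reshape(-1, 4)
    l_ori = [s for s in segs if seg_length(s) >= 25]            # step 1
    l_meg = []
    while l_ori:                                                # steps 3-4
        l0 = max(l_ori, key=seg_length)
        group = [s for s in l_ori
                 if point_line_dist(midpoint(s), l0) < 3
                 and angle_deg(s, l0) < 0.5]                    # l0 itself qualifies
        l_ori = [s for s in l_ori if not any(s is g for g in group)]
        pts = np.array([s[:2] for s in group] + [s[2:] for s in group], np.float32)
        box = cv2.boxPoints(cv2.minAreaRect(pts))               # SMBR corners
        sides = [np.linalg.norm(box[i] - box[(i + 1) % 4]) for i in range(4)]
        i = int(np.argmin(sides))                               # index of a short side
        p1 = (box[i] + box[(i + 1) % 4]) / 2                    # midpoints of the
        p2 = (box[(i + 2) % 4] + box[(i + 3) % 4]) / 2          # two short sides
        l_meg.append(np.hstack([p1, p2]))
    return [s for s in l_meg                                    # step 5
            if seg_length(s) >= 1200
            and any(point_line_dist(midpoint(s), m) < 15 and angle_deg(s, m) < 2
                    for m in cad_edges)]
```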

Preferably, the analytical expressions of the element's edges in the global coordinate system are obtained in step S34 as follows:

Using the known camera intrinsic and extrinsic matrices, the straight lines in the two images that correspond to the same edge of the optical element are back-projected into planes in the respective camera coordinate systems, giving the three-dimensional parameters of the plane containing each line; each such plane passes through its camera's optical center O. The two images are one top view and one side view.

Suppose an edge segment l_1 from the global camera and an edge segment l_2 from one side-view camera correspond to the same straight line in space. l_1 and the optical center of the global camera determine a plane P_1, and l_2 and the optical center of the side-view camera determine a plane P_2. Transforming P_1 and P_2 into the global coordinate system via the intrinsic and extrinsic matrices gives their analytical expressions in the global coordinate system:

P_1: a_1·X + b_1·Y + c_1·Z + d_1 = 0

P_2: a_2·X + b_2·Y + c_2·Z + d_2 = 0

where the vector (a_1, b_1, c_1) is the normal of plane P_1 and d_1 is the constant offset that places all the plane's points on P_1; the vector (a_2, b_2, c_2) is the normal of plane P_2 and d_2 its offset; c_1 = c_2 = 1.

Since the matched lines in the two cameras' images correspond to the same line in three-dimensional space, planes P_1 and P_2 necessarily intersect, and their intersection is the target line, i.e. the line containing the optical element's edge:

P = p_0 + V·t

where p_0 = (X_0, Y_0, Z_0) is a known point on the target line, V is the scalar line parameter, and t is the unit direction vector of the target line, of the form t = iX + jY + kZ with coefficients satisfying i^2 + j^2 + k^2 = 1.
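
A sketch of this back-projection and plane-plane intersection, assuming (consistently with step S2) that an extrinsic pair (R, t_cam) maps camera coordinates into global coordinates; K is the camera's intrinsic matrix.

```python
import numpy as np

def line_to_global_plane(K, R, t_cam, u1, u2):
    """Back-project an image line (pixel endpoints u1, u2) to the plane that
    contains it and the optical center, as (n, d) with n . X + d = 0 in the
    global frame."""
    Kinv = np.linalg.inv(K)
    r1 = Kinv @ np.array([u1[0], u1[1], 1.0])   # viewing rays in the camera frame
    r2 = Kinv @ np.array([u2[0], u2[1], 1.0])
    n_cam = np.cross(r1, r2)                    # plane normal; plane passes through O
    n = R @ n_cam                               # rotate the normal into the global frame
    d = -n @ t_cam                              # the plane contains the optical center
    return n, d

def intersect_planes(n1, d1, n2, d2):
    """Intersection line of two planes, returned as (p0, t) with P = p0 + V*t."""
    t = np.cross(n1, n2)
    t = t / np.linalg.norm(t)                   # unit direction of the target line
    A = np.vstack([n1, n2])                     # any point on both planes:
    b = -np.array([d1, d2])                     # least-squares (min-norm) anchor
    p0 = np.linalg.lstsq(A, b, rcond=None)[0]
    return p0, t
```

Applying `line_to_global_plane` to the matched segments from two cameras and then `intersect_planes` to the two resulting planes reproduces the parametric edge line above.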

Repeating this method yields the analytical expressions, in the global coordinate system, of the lines containing the two adjacent upper-surface edges nearest the two side-view cameras:

L_1: P = p_1 + V·t_1

L_2: P = p_2 + V·t_2

where p_1 is a known point on line L_1 and t_1 is its unit direction vector, and p_2 is a known point on line L_2 and t_2 is its unit direction vector.

The point A = (X_M, Y_M, Z_M) minimizing the sum of its distances to L_1 and L_2 is then solved for; it is the upper-left vertex of the element's upper surface. Next, a point B on L_1 and a point C on L_2 are each taken at a distance of 410 mm from A, and A, B, and C together determine a plane:

P_0: a_0·X + b_0·Y + Z + c_0 = 0

which is the analytical expression of the element's upper surface in the global coordinate system. Solving, by the method above, for the intersection lines of the upper surface with the planes containing the other two edges gives the spatial analytical expressions, in the global coordinate system, of the lines containing the remaining two edges:

L_3: P = p_3 + V·t_3

L_4: P = p_4 + V·t_4

where p_3 is a known point on line L_3 and t_3 is its unit direction vector, and p_4 is a known point on line L_4 and t_4 is its unit direction vector.
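
The vertex A admits a closed-form least-squares solution, and B, C, and the upper-surface plane follow directly. A sketch: the 410 mm offset is from the text, while the sign convention chosen for B and C is an assumption.

```python
import numpy as np

def nearest_point_to_lines(points, dirs):
    """Point minimizing the summed squared distances to lines (p_i, t_i)."""
    M = np.zeros((3, 3))
    b = np.zeros(3)
    for p, t in zip(points, dirs):
        P = np.eye(3) - np.outer(t, t)   # projector orthogonal to the line direction
        M += P
        b += P @ p
    return np.linalg.solve(M, b)

def upper_surface_plane(p1, t1, p2, t2, offset_mm=410.0):
    """Vertex A plus the plane through A and points B, C taken 410 mm along L1, L2."""
    A = nearest_point_to_lines([p1, p2], [t1, t2])
    B = A + offset_mm * t1               # sign chosen so B, C lie on the surface
    C = A + offset_mm * t2
    n = np.cross(B - A, C - A)
    n = n / np.linalg.norm(n)
    return A, n, -n @ A                  # plane: n . X + (-n . A) = 0
```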

Preferably, the position and attitude of the optical element and the dimensions of its upper surface are solved in step S35 as follows:

For the attitude from the element's model coordinate system O_M to the global coordinate system O_G, the positive Z axis is defined as t_2 × t_1 and the positive X axis as t_2. The PnP correspondences are constructed as follows:

the two-dimensional point (0, 0) corresponds to the three-dimensional point A = (X_M, Y_M, Z_M);

the two-dimensional point (1, 0) corresponds to the three-dimensional point A + t_2;

the two-dimensional point (0, 1) corresponds to the three-dimensional point A + Norm(t_2 × t_1) × t_2;

the two-dimensional point (1, 1) corresponds to the three-dimensional point A + Norm(t_2 × t_1) × t_2 + t_2;

where Norm(t) denotes normalization of the vector t, and × is the vector cross product.

By establishing the PnP relations for these four point pairs, the transformation (R, t) from the element's model coordinate system O_M to the global coordinate system O_G can be solved. (R, t) is the 6-degree-of-freedom pose of the optical element, where R is the rotation matrix and t the translation vector from O_M to O_G.

The position vector is t = A^T = (X_M, Y_M, Z_M)^T.

The coordinates of the remaining vertices of the element's upper surface in the global coordinate system are obtained by the same method used for vertex A; clockwise, the remaining vertices are B, C, and D in order.

The length of the optical element is L = ||((A + B) - (D + C))/2||_2 and its width is W = ||((A + D) - (B + C))/2||_2.
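
Because the four correspondences encode an orthonormal model frame, the resulting pose can equivalently be assembled directly from the measured edge directions; the sketch below does so and also computes the upper-surface dimensions (a restatement of the construction above, assuming t_1 and t_2 are unit vectors).

```python
import numpy as np

def pose_from_edges(A, t1, t2):
    """6-DoF pose (R, t) of the model frame O_M expressed in the global frame O_G."""
    x = t2 / np.linalg.norm(t2)          # X_M axis
    z = np.cross(t2, t1)
    z = z / np.linalg.norm(z)            # Z_M axis = Norm(t2 x t1)
    y = np.cross(z, x)                   # Y_M completes the right-handed frame
    R = np.column_stack([x, y, z])       # columns are the model axes in O_G
    return R, np.asarray(A)              # translation is vertex A

def surface_size(A, B, C, D):
    """Length and width of the upper surface from its four vertices (clockwise)."""
    L = np.linalg.norm((A + B - D - C) / 2)
    W = np.linalg.norm((A + D - B - C) / 2)
    return L, W
```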

The invention further provides another technical solution: an optical element assembly method based on an orthogonal vision system. During assembly, the pose detection method for the optical element assembly process based on an orthogonal vision system according to claims 1-8 acquires the pose of the optical element in real time, the deviation between the element's current pose and its ideal pose is computed, and the manipulator is guided to adjust the pose accordingly.
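
One common way to express that correction is the relative transform between the measured and ideal poses; a minimal sketch (the angle-axis conversion uses OpenCV's Rodrigues function).

```python
import cv2
import numpy as np

def pose_deviation(R_cur, t_cur, R_ideal, t_ideal):
    """Relative transform taking the element's current pose to its ideal pose."""
    R_err = R_ideal @ R_cur.T                 # residual rotation
    t_err = t_ideal - R_err @ t_cur           # residual translation
    rvec, _ = cv2.Rodrigues(R_err)            # rotation error as an angle-axis vector
    return rvec.ravel(), t_err                # both are sent to the manipulator
```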

Beneficial effects of the invention:

(1) High-accuracy localization: a vision inspection system of three cameras with grazing illumination images the edges of the optical element sharply and selectively, effectively improving the localization accuracy for transparent optical elements.

(2) Reduced influence of manufacturing tolerances: re-measuring the length and width of the element's upper surface reduces the influence of manufacturing tolerances on the assembly process.

(3) Accurate measurement of the six-degree-of-freedom pose: computer vision algorithms applied to the images from the three cameras measure the element's six-degree-of-freedom pose accurately, providing path guidance for the subsequent assembly process.

Description of the drawings

Figure 1 is a structural schematic for the pose detection method of the optical element assembly process based on an orthogonal vision system according to the present invention;

Figure 2 shows the stereo calibration block: Figure 2(a) is a schematic of surface 4, Figure 2(b) shows the block's three-dimensional structure, Figure 2(c) surface 3, Figure 2(d) surface 0, Figure 2(e) surface 1, and Figure 2(f) surface 2;

Figure 3 illustrates the principle of back-projecting a line segment into three-dimensional space.

Reference numerals in the figures: 1, global camera; 2, side-view camera No. 1; 3, side-view camera No. 2; 4, FOA assembly platform; 5, optical element; 6, top view; 7, side view captured by side-view camera No. 1; 8, side view captured by side-view camera No. 2.

Detailed description of the embodiments

The technical solutions in the embodiments of the present invention are described below clearly and completely with reference to the accompanying drawings. Evidently, the described embodiments are only some, not all, of the embodiments of the invention. All other embodiments obtained from them by a person of ordinary skill in the art without creative effort fall within the scope of protection of the invention.

It should be noted that, provided they do not conflict, the embodiments of the invention and the features within them may be combined with one another.

The invention is further described below with reference to the drawings and specific embodiments, which are not to be taken as limiting the invention.

Embodiment 1. This embodiment is described with reference to Figures 1 to 3. The pose detection method for the optical element assembly process based on an orthogonal vision system according to this embodiment comprises the following steps:

S1. Constructing the vision inspection system.

The vision inspection system comprises one global camera and two side-view cameras; strip light sources placed in front of the two side-view cameras provide grazing illumination of the optical element.

S2. Jointly calibrating the three cameras of the vision inspection system.

The camera coordinate system of the global camera serves as the unified global coordinate system. The intrinsic and extrinsic parameters of the three cameras are calibrated, establishing the spatial mapping from each camera's pixel coordinate system to the unified global coordinate system.

S3. Solving the pose of the optical element.

The three cameras synchronously capture a top view and two side views of the optical element. Analytical expressions of the element's edges are extracted from each camera's image in its own pixel coordinate system; using the joint calibration data, the edges from the different pixel coordinate systems are aligned into the unified global coordinate system, and the pose of the optical element is determined from the analytical expressions of these edges.

Regarding step S1, see Figure 1. The method of this embodiment is implemented on an orthogonal vision system. To image the optical element sharply from several angles at once and to provide image information for the subsequent pose detection, the vision inspection system is designed with three cameras: one global camera and two side-view cameras, with strip light sources in front of the two side-view cameras providing grazing illumination of the optical element.

The optical element to be inspected is placed horizontally on the FOA assembly platform, and the global camera is mounted above the platform; the two cameras located to the sides of the element during inspection are side-view camera No. 1 and side-view camera No. 2. The working distance of the global camera is configured as 1200 mm, and the working distance of side-view cameras No. 1 and No. 2 as 280 mm.

The attitudes of the three cameras are mutually orthogonal, and their optical axes intersect at a point on the upper surface of the optical element. The following coordinate systems are established:

The global coordinate system of the global camera is O_G-X_GY_GZ_G; the optical axis of the global camera is perpendicular to the horizontal plane.

The coordinate system of side-view camera No. 1 is O_C1-X_C1Y_C1Z_C1, and that of side-view camera No. 2 is O_C2-X_C2Y_C2Z_C2; the optical axes of both cameras are parallel to the horizontal plane.

The positive X axis of the global camera coordinate system O_G points in the same direction as the positive Z_C2 axis of the side-view camera No. 2 coordinate system O_C2.

The model coordinate system O_M of the optical element is O_M-X_MY_MZ_M, with its origin at vertex A, the upper-left corner of the element's upper surface. The Z_M axis is perpendicular to the upper surface with its positive direction pointing upward, and the positive X_M axis points from the upper-left vertex toward the lower-left vertex of the upper surface.

Regarding the joint calibration of step S2: to calibrate the intrinsic and extrinsic parameters of the three cameras of the vision inspection system and establish the spatial mapping from each pixel coordinate system to the unified global three-dimensional coordinate system, the internal parameters of the three cameras are first calibrated separately, establishing the spatial mapping from each pixel coordinate system to its own camera coordinate system; the camera coordinate system of the global camera is then defined as the unified global coordinate system, and the coordinate transformations from the camera coordinate systems of the two side-view cameras to the global coordinate system, i.e. the extrinsics of the two cameras, are calibrated separately.

For the intrinsic matrices, Zhang Zhengyou's calibration method is used to calibrate the intrinsic matrices of the three cameras, establishing the mapping from each two-dimensional pixel coordinate system to the corresponding three-dimensional camera coordinate system.

For the extrinsic matrices, the camera coordinate system of the global camera serves as the unified global coordinate system, and the coordinate transformations from the camera coordinate systems of the two side-view cameras to the global coordinate system are calibrated separately, yielding the extrinsic matrices of the two side-view cameras.

The extrinsic matrices comprise: the transformation T_C1 = (R_C1, t_C1) from the side-view camera No. 1 coordinate system O_C1 to the global coordinate system O_G, where R_C1 is the rotation matrix and t_C1 the translation vector from O_C1 to O_G;

and the transformation T_C2 = (R_C2, t_C2) from the side-view camera No. 2 coordinate system O_C2 to the global coordinate system O_G, where R_C2 is the rotation matrix and t_C2 the translation vector from O_C2 to O_G.

For the extrinsic calibration, see Figure 2. Because the global camera's coordinate system is defined as the unified global coordinate system, only the external parameters of the two side-view cameras need to be calibrated. For this orthogonal vision inspection system, a stereo calibration block was purpose-designed and is used together with the AprilTag algorithm. The block is 3D printed in resin as a cuboid with the dimensions shown in Figure 2 and a machining accuracy of ±0.05 mm. Its four side faces and top face are defined in order as surface 0 through surface 4, which carry 36h11 AprilTag patterns with ids 0 through 4 respectively. The origin of the block's coordinate system O_B is defined at the center of the AprilTag pattern (id = 0) on surface 0; the X and Y axes are parallel to that surface, with positive directions running from the pattern's upper-left vertex to its upper-right vertex and from its upper-left vertex to its lower-left vertex respectively; the Z axis points into the block.

The procedure for calibrating the extrinsic matrices with the AprilTag algorithm is as follows:

When calibrating the side-view cameras' extrinsics, the cuboid calibration block is arranged and placed so as to satisfy the cameras' working-distance requirements for inspecting the optical element. With surface 4 kept horizontal, one of surfaces 0-3 is chosen arbitrarily; the block is positioned so that the AprilTag pattern on the chosen surface lies 280 mm from the optical center of side-view camera No. 1 along its optical axis and the pattern on surface 4 lies 1200 mm from the optical center of the global camera along its optical axis, with the corresponding AprilTag pattern appearing in full in each camera's field of view.

Once the calibration block, the global camera, and side-view camera No. 1 satisfy these requirements, an external trigger drives the two cameras to capture images within a time difference of no more than 15 ms, and the AprilTag algorithm computes the pose of each calibration pattern in the field of view relative to its camera. The transformations between the coordinate systems of all patterns on the block and the block coordinate system O_B are known from the CAD data, so the pose of the block relative to the global camera and relative to side-view camera No. 1 at the capture instant can be derived. Solving for the transformation between these two poses gives T_C1 = (R_C1, t_C1), the transformation from O_C1 to O_G, i.e. the extrinsic matrix of side-view camera No. 1 in O_G.

The extrinsic matrix of side-view camera No. 2 in O_G is obtained by the same procedure; the transformation from the side-view camera No. 2 coordinate system O_C2 to the global coordinate system O_G is T_C2 = (R_C2, t_C2).

Regarding step S3, the pose of the optical element is solved as follows:

S31. A fixture places the optical element at the predetermined position on the FOA platform; the three cameras synchronously capture a top-view image I_M and two side-view images I_SL and I_SF of the element, and the three images are preprocessed into grayscale images.

Specifically, the manipulator first clamps the optical element at the predetermined position, the grazing light sources are switched on, and the illumination angle is adjusted so that the element's edges are lit in the cameras' fields of view. The camera exposure time is adjusted according to the mean pixel gray level of a preset RoI so that the pixels along the element's edges have gray levels (on a 0-255 scale) between 220 and 240. The three cameras are controlled by an external trigger to capture images synchronously within 150 ms. The captured images are the global camera image I_M, at a resolution of 5120 × 5120, and the side-view camera images I_SL and I_SF, each at a resolution of 4200 × 2160.
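
A sketch of that exposure adjustment; the camera handle and its `get_frame`/`get_exposure_us`/`set_exposure_us` methods are hypothetical stand-ins for whatever SDK drives the cameras, and only the 220-240 gray-level target comes from the text.

```python
import numpy as np

def tune_exposure(cam, roi, lo=220, hi=240, max_iters=20):
    """Nudge exposure until the mean gray level of the edge RoI lies in [lo, hi].

    `cam` is a hypothetical camera handle with get_frame() -> 2D uint8 array
    and get/set_exposure_us(); `roi` is (row0, row1, col0, col1).
    """
    r0, r1, c0, c1 = roi
    for _ in range(max_iters):
        mean = float(np.mean(cam.get_frame()[r0:r1, c0:c1]))
        if lo <= mean <= hi:
            return mean
        target = (lo + hi) / 2
        # proportional update: scale exposure by the gray-level ratio
        cam.set_exposure_us(cam.get_exposure_us() * target / max(mean, 1.0))
    return mean
```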

S32. The LSD algorithm extracts line segments from the three grayscale images; these segments are processed to obtain the element's edges in each camera's pixel coordinate system.

The edges of the optical element are obtained as follows:

The LSD algorithm extracts all line segments from the three grayscale images, which are then processed as follows:

Step 1. Discard over-short segments with L < 25 pixels, where the segment length L is

L = sqrt((x_2 - x_1)^2 + (y_2 - y_1)^2)

and (x_1, y_1), (x_2, y_2) are the pixel coordinates of the segment's two endpoints.

Step 2. Denote the set of segments not yet fused by l_Ori and the set of fused segments by l_meg; the segments remaining after step 1 are first placed in l_Ori.

Step 3. Fuse segments:

Search l_Ori for the segment with the largest L, denoted l_0. Then search l_Ori for segments satisfying both of the following conditions: the Euclidean distance dis from the segment's midpoint to the line containing l_0 is less than 3 pixels, and the angle ang between the segment and l_0 is less than 0.5°. Remove l_0 and the qualifying segments from l_Ori.

The Euclidean distance dis and the included angle ang between two segments are computed as

dis = |k·x - y + (b_1 - k·a_1)| / sqrt(k^2 + 1)

ang = arccos( |Δx·Δa + Δy·Δb| / (sqrt(Δx^2 + Δy^2) · sqrt(Δa^2 + Δb^2)) )

where:

(a_1, b_1), (a_2, b_2) are the endpoint coordinates of l_0,

k = Δb/Δa is the slope of l_0,

x = (x_1 + x_2)/2

y = (y_1 + y_2)/2

Δx = x_2 - x_1

Δy = y_2 - y_1

Δa = a_2 - a_1

Δb = b_2 - b_1

Step 4. For l_0 and the qualifying segments, the SMBR algorithm computes the minimum bounding rectangle of these segments' endpoints; the pixel coordinates of the midpoints of the rectangle's two short sides become the two endpoints of the new fused segment, which is placed in the fused set l_meg.

Repeat steps 3 and 4 until l_Ori is empty, then proceed to step 5.

Step 5. Delete erroneous segments from l_meg:

First, delete segments with L < 1200 pixels; in the captured images the segments along the optical element's edges are generally longer than 1200 pixels, so removing segments with L < 1200 pixels removes the over-short segments.

Second, prior knowledge from the element's CAD model is used to identify the segments that belong to the element's edges. Specifically, the optical element held by the fixture is returned to the predetermined position, and the element's CAD model in that pose is projected into the captured image. Since valid segments come from the element's edges, a segment that correctly represents an edge must satisfy both of the following conditions (i.e. segments representing the element's edges are selected by these conditions): the Euclidean distance dis from the segment's midpoint to some model edge segment extracted from the CAD projection is less than 15 pixels, and the included angle ang is less than 2°. Segments in l_meg that fail these conditions are deleted, and the remaining segments in l_meg are taken as the element's edges.

S33. Analytical expressions of the element's edges are obtained in the three camera coordinate systems. Specifically, based on the matching result of step S32, when an edge in the top-view image I_M and an edge in side-view image I_SL or I_SF both correspond to the same edge of the CAD model, the edges in those two images are taken to correspond to the same physical edge of the optical element. Following this principle, the analytical expressions of the edges in the three grayscale images are obtained in the respective camera coordinate systems.

S34. From the camera intrinsic matrices, the extrinsic matrices, and the analytical expressions in the three camera coordinate systems, the analytical expressions of the element's edges in the global coordinate system are obtained.

The analytical expressions of the element's edges in the global coordinate system are obtained as follows:

Referring to Figure 3, the principle of projecting an edge from the pixel coordinate system into three-dimensional space is explained first. Using the known camera intrinsic and extrinsic matrices, the straight lines in the two images that correspond to the same edge of the optical element are back-projected into planes in the camera coordinate system, giving the three-dimensional parameters of the plane containing each line; each such plane passes through its camera's optical center O. The two images are one top view and one side view. As shown in Figure 3, x_1 and x_2 are two points on a line in the image and are known quantities; X_1 and X_2 are their corresponding points in three-dimensional space. The spatial coordinates of X_1 and X_2 are unknown and cannot be recovered even using the camera's calibration data; however, from the camera intrinsics, the projection constrains the spatial points X_1 and X_2 to lie on the plane through O, x_1, and x_2. That is, a line segment (straight line) in the camera's pixel plane corresponds to a plane in three-dimensional space.

Suppose an edge segment l_1 from the global camera and an edge segment l_2 from one side-view camera correspond to the same straight line in space. l_1 and the optical center of the global camera determine a plane P_1, and l_2 and the optical center of the side-view camera determine a plane P_2. Transforming P_1 and P_2 into the global coordinate system via the intrinsic and extrinsic matrices gives their analytical expressions in the global coordinate system:

P_1: a_1·X + b_1·Y + c_1·Z + d_1 = 0

P_2: a_2·X + b_2·Y + c_2·Z + d_2 = 0

where the vector (a_1, b_1, c_1) is the normal of plane P_1 and d_1 is the constant offset that places all the plane's points on P_1; the vector (a_2, b_2, c_2) is the normal of plane P_2 and d_2 its offset; c_1 = c_2 = 1.

Since the matched lines in the two cameras' images correspond to the same line in three-dimensional space, planes P_1 and P_2 necessarily intersect, and their intersection is the target line, i.e. the line containing the optical element's edge:

P = p_0 + V·t

where p_0 = (X_0, Y_0, Z_0) is a known point on the target line, V is the scalar line parameter, and t is the unit direction vector of the target line, of the form t = iX + jY + kZ with coefficients satisfying i^2 + j^2 + k^2 = 1.

From this principle, the analytical expressions, in the global coordinate system, of the lines containing the two adjacent upper-surface edges nearest the two side-view cameras can be obtained:

L_1: P = p_1 + V·t_1

L_2: P = p_2 + V·t_2

where p_1 is a known point on line L_1 and t_1 is its unit direction vector, and p_2 is a known point on line L_2 and t_2 is its unit direction vector.

The point A = (X_M, Y_M, Z_M) minimizing the sum of its distances to L_1 and L_2 is then solved for; it is the upper-left vertex of the element's upper surface. Next, a point B on L_1 and a point C on L_2 are each taken at a distance of 410 mm from A, and A, B, and C together determine a plane:

P_0: a_0·X + b_0·Y + Z + c_0 = 0

which is the analytical expression of the element's upper surface in the global coordinate system. Solving, by the method above, for the intersection lines of the upper surface with the planes containing the other two edges gives the spatial analytical expressions, in the global coordinate system, of the lines containing the remaining two edges:

L_3: P = p_3 + V·t_3

L_4: P = p_4 + V·t_4

where p_3 is a known point on line L_3 and t_3 is its unit direction vector, and p_4 is a known point on line L_4 and t_4 is its unit direction vector.

S35. According to the analytical formulas of the optical element edges in the global coordinate system, the PnP algorithm (Perspective-n-Point, an algorithm that solves for motion from 3D-to-2D point correspondences) is used to calculate the position and attitude of the optical element and the dimensions of its upper surface from the correspondences.

The position, attitude and upper-surface dimensions of the optical element are solved as follows:

For the attitude from the optical element model coordinate system OM to the global coordinate system OG, the positive Z-axis direction is defined as t2 × t1 and the positive X-axis direction as t2. The PnP correspondences are constructed as:

the two-dimensional point (0, 0) corresponds to the three-dimensional point A = (XM, YM, ZM);

the two-dimensional point (1, 0) corresponds to the three-dimensional point A + t2;

the two-dimensional point (0, 1) corresponds to the three-dimensional point A + Norm(t2 × t1) × t2;

the two-dimensional point (1, 1) corresponds to the three-dimensional point A + Norm(t2 × t1) × t2 + t2;

where Norm(t) denotes the normalization operation on t;

By establishing the PnP correspondences of the above four point pairs, the transformation (R, t) from the model coordinate system OM of the optical element to the global coordinate system OG is solved; (R, t) serves as the 6-degree-of-freedom pose of the optical element, where R is the rotation matrix from OM to OG and t is the translation vector from OM to OG;

The position vector is t = A^T = (XM, YM, ZM)^T;

The coordinates, in the global coordinate system, of the remaining vertices of the optical element's upper surface are obtained by the same method used for point A; clockwise, the remaining vertices are B, C and D.

The length of the optical element is L = ||((A + B) - (D + C))/2||2 and the width is W = ||((A + D) - (B + C))/2||2.
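Because the model axes are fully fixed by t1 and t2 (Z = Norm(t2 × t1), X = t2, Y = Z × X) and the origin by A, the transform that the four-point PnP correspondence encodes can equivalently be assembled in closed form. A sketch under that reading, not the patent's PnP solver itself (assumes unit vectors t1, t2 and vertices A, B, C, D as numpy arrays in the global frame):

import numpy as np

def model_to_global(A, t1, t2):
    z = np.cross(t2, t1)
    z = z / np.linalg.norm(z)        # Z_M axis expressed in O_G
    x = t2 / np.linalg.norm(t2)      # X_M axis
    y = np.cross(z, x)               # Y_M = Z_M x X_M
    R = np.column_stack([x, y, z])   # rotation O_M -> O_G
    return R, np.asarray(A, float)   # translation t = A

R, t = model_to_global(A, t1, t2)
L = np.linalg.norm((A + B) - (D + C)) / 2.0   # length of the upper surface
W = np.linalg.norm((A + D) - (B + C)) / 2.0   # width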

Specific Embodiment 2: This embodiment is described with reference to Fig. 1. The orthogonal-vision-system-based optical element assembly method of this embodiment is implemented on the basis of the detection method of Embodiment 1: the six-degree-of-freedom pose solved in Embodiment 1 is applied to the subsequent assembly process. During assembly, the pose detection method for the optical element assembly process based on the orthogonal vision system described in Embodiment 1 acquires the pose of the optical element in real time; combining the hand-eye calibration information of the manipulator with the ideal pose the optical element should reach, the deviation between the current pose and the ideal pose is calculated, and this deviation is used to guide the manipulator's pose adjustment.
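A hedged sketch of this correction step, with poses as 4×4 homogeneous matrices (T_cur from the detection method above, T_ideal from process planning, X a hand-eye transform; all names are illustrative):

import numpy as np

def pose_deviation(T_cur, T_ideal):
    # Left-multiplying correction satisfying T_delta @ T_cur = T_ideal.
    return T_ideal @ np.linalg.inv(T_cur)

T_delta = pose_deviation(T_cur, T_ideal)
# The manipulator increment would be T_delta mapped into the robot frame
# through the hand-eye calibration X, e.g. np.linalg.inv(X) @ T_delta @ X.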

Although the present invention is described herein with reference to specific embodiments, it should be understood that these embodiments are merely examples of the principles and applications of the invention. It should therefore be understood that many modifications may be made to the exemplary embodiments, and that other arrangements may be devised, without departing from the spirit and scope of the invention as defined by the appended claims. It should be understood that features of different dependent claims and features described herein may be combined in ways other than those described in the original claims. It will also be understood that features described in connection with an individual embodiment may be used in other described embodiments.

Claims (9)

1. A method for detecting the pose of an optical element assembly process based on an orthogonal vision system, characterized by comprising the following steps:
S1, constructing a visual detection system;
the visual detection system comprises a global camera and two side-view cameras, with strip light sources in front of the two side-view cameras providing grazing illumination of the optical element;
S2, a joint calibration step of the three cameras in the visual detection system;
taking the camera coordinate system of the global camera as a unified global coordinate system; the intrinsic and extrinsic parameters of the three cameras are calibrated, and a spatial mapping relation from the pixel coordinate system of each camera to the unified global coordinate system is established;
S3, calculating the pose of the optical element;
the three cameras synchronously acquire a top view and two side views of the optical element; the analytical expressions of the optical element edges in the respective pixel coordinate systems are extracted from the images of the three cameras and, by combining the joint calibration data, the edges in the pixel coordinate systems of the different cameras are aligned into the unified global coordinate system; the pose of the optical element is determined from the analytical expressions of these edges.
2. The method for detecting the pose of an optical element assembly process based on an orthogonal vision system according to claim 1, wherein the optical element to be detected is arranged horizontally on an FOA assembly platform, the global camera is arranged above the FOA assembly platform, and the two cameras positioned at the sides of the optical element during detection are a No. 1 side-view camera and a No. 2 side-view camera;
the attitudes of the three cameras are mutually orthogonal, the optical axes of the three cameras intersect at one point on the upper surface of the optical element, and the following coordinate systems are established:
the global coordinate system of the global camera is OG-XGYGZG, and the optical axis direction of the global camera is perpendicular to the horizontal plane;
the No. 1 side-view camera coordinate system is OC1-XC1YC1ZC1 and the No. 2 side-view camera coordinate system is OC2-XC2YC2ZC2; the optical axis directions of these two cameras are parallel to the horizontal plane;
the positive X-axis direction of the global camera coordinate system OG coincides with the positive ZC2-axis direction of the No. 2 side-view camera coordinate system OC2;
the model coordinate system OM of the optical element is OM-XMYMZM, with its origin defined at the vertex A of the upper-left corner of the optical element's upper surface; the ZM axis is perpendicular to the upper surface with its positive direction upward, and the positive XM-axis direction points from the upper-left vertex of the upper surface toward the lower-left vertex of the upper surface.
3. The method for detecting the pose of an optical element assembly process based on an orthogonal vision system according to claim 2, wherein the joint calibration process of step S2 is as follows:
firstly, the intrinsic matrices of the three cameras are calibrated respectively by the Zhang Zhengyou calibration method, establishing the mapping relation from each two-dimensional pixel coordinate system to the corresponding three-dimensional camera coordinate system;
then, the camera coordinate system of the global camera is used as the unified global coordinate system, and the transformations from the camera coordinate systems of the two side-view cameras to the global coordinate system are calibrated respectively, obtaining the extrinsic matrices of the two side-view cameras;
the extrinsic matrices comprise: the conversion relation TC1 = (RC1, tC1) from the No. 1 side-view camera coordinate system OC1 to the global coordinate system OG, where RC1 is the rotation matrix from OC1 to OG and tC1 is the translation vector from OC1 to OG;
and the conversion relation TC2 = (RC2, tC2) from the No. 2 side-view camera coordinate system OC2 to the global coordinate system OG, where RC2 is the rotation matrix from OC2 to OG and tC2 is the translation vector from OC2 to OG.
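For the intrinsic step, a conventional OpenCV sketch of Zhang's method with a planar checkerboard (pattern size, square size and the image list are assumptions, not values from the patent):

import cv2
import numpy as np

pattern = (9, 6)                                   # inner-corner grid (assumed)
objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2) * 20.0  # 20 mm squares
obj_pts, img_pts = [], []
for fname in calib_images:                         # per-camera list of board views (assumed)
    gray = cv2.imread(fname, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, pattern)
    if found:
        obj_pts.append(objp)
        img_pts.append(corners)
rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
    obj_pts, img_pts, gray.shape[::-1], None, None)   # K is the intrinsic matrix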
4. The method for detecting the pose of an optical element assembly process based on an orthogonal vision system according to claim 2 or 3, wherein a three-dimensional calibration block is adopted in combination with the AprilTag algorithm to calibrate the extrinsic matrices of the two side-view cameras; the calibration block is a cuboid whose four side surfaces and top surface are defined in turn as surfaces 0-4, and surfaces 0-4 carry 36h11 AprilTag patterns with ids 0-4 in turn; the origin of the calibration-block coordinate system OB is defined at the center of the AprilTag pattern (id = 0) on surface 0; the X and Y axes are parallel to that surface, with positive directions running from the top-left vertex to the top-right vertex and from the top-left vertex to the bottom-left vertex of the pattern respectively, and the Z-axis direction points toward the inner side of the model;
the process of calibrating the extrinsic matrices with the AprilTag algorithm comprises the following steps:
surface 4 of the calibration block is kept horizontal, and one surface is arbitrarily selected from surfaces 0-3; the distance from the AprilTag pattern of the selected surface to the optical center of the No. 1 side-view camera along the optical axis direction is 280 mm, the distance from the AprilTag pattern of surface 4 to the optical center of the global camera along the optical axis direction is 1200 mm, and the corresponding AprilTag patterns are ensured to appear completely within the fields of view of the two cameras;
when the calibration block, the global camera and the No. 1 side-view camera meet these requirements, an external trigger is used to make the two cameras capture images within a time difference of no more than 15 ms, and the AprilTag algorithm calculates the position and attitude of each calibration pattern in the field of view relative to each camera; the transformations between the coordinate systems of all patterns on the calibration block and the calibration-block coordinate system OB are known from CAD data, so the poses of the calibration block relative to the global camera and the No. 1 side-view camera at the shooting moment are deduced, and the transformation TC1 = (RC1, tC1) between the two poses, namely the conversion relation from OC1 to OG, is calculated, thereby acquiring the extrinsic matrix of the No. 1 side-view camera in OG;
the extrinsic matrix of the No. 2 side-view camera in OG is acquired by the same process.
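A sketch of the per-camera tag-pose measurement with OpenCV's AprilTag 36h11 dictionary; the tag side length, intrinsics (K, dist), the image gray and the CAD transforms T_block_tag0 / T_block_tag4 are assumed inputs, and the final chaining is indicated as a comment:

import cv2
import numpy as np

dic = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_APRILTAG_36h11)
detector = cv2.aruco.ArucoDetector(dic)
corners, ids, _ = detector.detectMarkers(gray)        # one image per camera
s = tag_side / 2.0                                    # half side length, from CAD
obj = np.array([[-s, s, 0], [s, s, 0], [s, -s, 0], [-s, -s, 0]], np.float32)
ok, rvec, tvec = cv2.solvePnP(obj, corners[0].reshape(-1, 2), K, dist)
R, _ = cv2.Rodrigues(rvec)
T_cam_tag = np.eye(4)
T_cam_tag[:3, :3], T_cam_tag[:3, 3] = R, tvec.ravel() # tag pose in this camera
# With T_G_tag4 (global camera) and T_C1_tag0 (No. 1 side-view camera):
# T_C1_to_G = T_G_tag4 @ inv(T_block_tag4) @ T_block_tag0 @ inv(T_C1_tag0)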
5. The method for detecting the pose of an optical element assembly process based on an orthogonal vision system according to claim 3, wherein the pose calculation process for the optical element in step S3 is as follows:
S31, placing the optical element at a preset position of the FOA assembly platform using a clamp, so that the three cameras synchronously acquire a top-view image IM of the optical element and two side-view images ISL, ISF, and preprocessing the three images to generate gray-scale images;
S32, extracting line segments from the three gray-scale images with an LSD algorithm, and processing the segments to obtain the optical element edges of the three gray-scale images in the respective camera pixel coordinate systems;
S33, acquiring the analytical expressions of the optical element edges in the three camera coordinate systems; specifically, according to the matching result of step S32, when an edge in the top-view image IM and an edge in the side-view image ISL or ISF simultaneously correspond to the same edge of the CAD model, the edges in the two images are considered to correspond to the same edge of the optical element; by this principle, the analytical expressions of the three gray-scale images in their respective camera coordinate systems are obtained;
S34, obtaining the analytical expressions of the optical element edges in the global coordinate system from the camera intrinsic matrices, the extrinsic matrices and the analytical expressions in the three camera coordinate systems;
S35, calculating the position and attitude of the optical element and the dimensions of the upper surface from the analytical expressions of the optical element edges in the global coordinate system, using the PnP algorithm according to the correspondences.
6. The method for detecting the pose of an optical element assembly process based on an orthogonal vision system according to claim 5, wherein the process of obtaining the optical element edges in step S32 is as follows:
all line segments are extracted from the three gray-scale images with an LSD algorithm, and the following processing is carried out:
step one, eliminating excessively short line segments with L < 25 pixels, where the length L of a line segment is obtained according to:
L = √((x2 - x1)² + (y2 - y1)²)
where (x1, y1), (x2, y2) are the pixel coordinates of the two endpoints of the line segment;
step two, the set of unfused line segments is denoted lOri and the set of fused line segments is denoted lmeg; the segments remaining after the short segments are removed in step one are first placed in the set lOri;
step three, fusing line segments:
in the set lOri, search for the line segment with the largest L and denote it l0; then find, in the set lOri, the line segments meeting the conditions that the Euclidean distance dis from the segment midpoint to the line of l0 is smaller than 3 pixels and the included angle ang between the segment and l0 is smaller than 0.5°; remove l0 and the segments meeting these conditions from the set lOri;
the Euclidean distance dis and the included angle ang of two line segments are calculated according to:
dis = |k(x - a1) - (y - b1)| / √(k² + 1)
ang = |arctan(Δy/Δx) - arctan(Δb/Δa)|
where:
(a1, b1), (a2, b2) are the endpoint coordinates of l0,
k is the slope of l0,
x = (x1 + x2)/2
y = (y1 + y2)/2
Δx = x2 - x1
Δy = y2 - y1
Δa = a2 - a1
Δb = b2 - b1
step four, calculating the minimum bounding rectangle of the endpoints of l0 and of the removed segments with the SMBR algorithm; the midpoint pixel coordinates of the two short sides of the bounding rectangle are taken as the pixel coordinates of the two endpoints of the new fused segment, and the fused segment is placed in the fused set lmeg;
repeating steps three and four until the set lOri is empty, and then executing step five;
step five, deleting erroneous segments from the set lmeg:
first, deleting line segments with L < 1200 pixels;
second, using prior knowledge of the CAD model of the optical element to identify the segments belonging to the optical element edges; specifically, the optical element is clamped by the clamp and reset to the preset position, and the CAD model of the optical element at that moment is projected into the acquired image; since the extracted line segments come from the optical element edges, the segments representing the edges are screened as follows: the Euclidean distance dis from the segment midpoint to some model edge segment extracted from the CAD is smaller than 15 pixels, and the included angle ang is smaller than 2°; segments in lmeg not meeting these conditions are deleted, and the remaining segments of lmeg serve as the optical element edges.
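A compact sketch of this extraction-and-screening pipeline (OpenCV's LSD, available in recent builds; the thresholds are the claim's; the SMBR step is approximated here by OpenCV's minimum-area rectangle, which is an assumption):

import cv2
import numpy as np

lsd = cv2.createLineSegmentDetector()
segs = lsd.detect(gray)[0].reshape(-1, 4)          # rows: (x1, y1, x2, y2); gray is assumed
lens = np.hypot(segs[:, 2] - segs[:, 0], segs[:, 3] - segs[:, 1])
l_ori = list(segs[lens >= 25])                     # step one: drop L < 25 px

def fuse(group):
    # Step four: endpoints of the merged segment from the minimum-area
    # bounding rectangle of all endpoints in the group.
    pts = np.asarray(group, np.float32).reshape(-1, 2)
    box = cv2.boxPoints(cv2.minAreaRect(pts))      # 4 rectangle corners
    edges = [np.linalg.norm(box[i] - box[(i + 1) % 4]) for i in range(4)]
    i = int(np.argmin(edges))                      # index of one short side
    m1 = (box[i] + box[(i + 1) % 4]) / 2           # midpoints of the two
    m2 = (box[(i + 2) % 4] + box[(i + 3) % 4]) / 2 # short sides
    return np.hstack([m1, m2])
# Steps three and five (grouping by dis < 3 px and ang < 0.5 deg, then the
# length and CAD-projection screens) iterate over l_ori as in the claim.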
7. The method for detecting the pose of an optical element assembly process based on an orthogonal vision system according to claim 6, wherein the step S34 process of obtaining the analytical expressions of the optical element edges in the global coordinate system is:
according to the known camera intrinsic and extrinsic matrices, the straight lines corresponding to the same optical element edge in the two images are converted by back-projection into planes in the camera coordinate systems, obtaining the three-dimensional parameters of the plane containing each straight line, where each such plane passes through the camera optical center O; the two images are a top view and a side view;
assume the edge segment l1 from the global camera and the edge segment l2 from one side-view camera correspond to the same straight line in space; l1 and the optical center of the global camera determine the plane P1, and l2 and the optical center of the side-view camera determine the plane P2; planes P1 and P2 are transformed into the global coordinate system according to the intrinsic and extrinsic matrices, giving their analytical formulas in the global coordinate system:
P1: a1X + b1Y + c1Z + d1 = 0
P2: a2X + b2Y + c2Z + d2 = 0
where the vector (a1, b1, c1) is the normal vector of plane P1 and d1 is the constant term that makes every point of the plane satisfy the equation of P1; the vector (a2, b2, c2) is the normal vector of plane P2 and d2 is its constant term; c1 = c2 = 1;
because the matched straight lines in the imaging of the two cameras correspond to the same straight line in three-dimensional space, planes P1 and P2 necessarily intersect, and the intersection line is the target line; the analytical formula of the line containing the optical element edge is:
P = p0 + t·V
where p0 = (X0, Y0, Z0) is a known point on the target line, V is the scalar line parameter, and t is the unit direction vector of the target line, of the form t = iX + jY + kZ with coefficients i, j, k satisfying i² + j² + k² = 1;
by repeating this method, the analytical formulas, in the global coordinate system, of the lines containing the two adjacent edges of the optical element's upper surface nearest to the two side-view cameras are obtained:
L1: P = p1 + t1·V
L2: P = p2 + t2·V
where p1 is a known point on line L1, t1 is the unit direction vector of L1, p2 is a known point on line L2, and t2 is the unit direction vector of L2;
solving for the point A = (XM, YM, ZM) that minimizes the sum of the distances to L1 and L2 gives the top-left vertex of the optical element's upper surface; a point B on L1 and a point C on L2 are then taken, each at a distance of 410 mm from A, and A, B, C jointly determine a plane:
P0: a0X + b0Y + Z + c0 = 0
which is the analytical formula of the optical element's upper surface in the global coordinate system; the intersection lines of the upper surface with the planes containing the other two edges are solved by the above method, giving the spatial analytical formulas, in the global coordinate system, of the lines containing the remaining two edges:
L3: P = p3 + t3·V
L4: P = p4 + t4·V
where p3 is a known point on line L3, t3 is the unit direction vector of L3, p4 is a known point on line L4, and t4 is the unit direction vector of L4.
8. The method for detecting the pose of an optical element assembly process based on an orthogonal vision system according to claim 7, wherein the step S35 process of calculating the position and attitude of the optical element and the dimensions of the upper surface is:
for the attitude from the optical element model coordinate system OM to the global coordinate system OG, the positive Z-axis direction is defined as t2 × t1 and the positive X-axis direction as t2; the PnP correspondences are constructed as:
the two-dimensional point (0, 0) corresponds to the three-dimensional point A = (XM, YM, ZM);
the two-dimensional point (1, 0) corresponds to the three-dimensional point A + t2;
the two-dimensional point (0, 1) corresponds to the three-dimensional point A + Norm(t2 × t1) × t2;
the two-dimensional point (1, 1) corresponds to the three-dimensional point A + Norm(t2 × t1) × t2 + t2;
where Norm(t) is the normalization operation on t;
by establishing the PnP correspondences of the above four point pairs, the transformation (R, t) from the model coordinate system OM of the optical element to the global coordinate system OG is solved; (R, t) serves as the 6-degree-of-freedom pose of the optical element, where R is the rotation matrix from OM to OG and t is the translation vector from OM to OG;
the position vector is t = A^T = (XM, YM, ZM)^T;
the coordinates of the remaining vertices of the optical element's upper surface in the global coordinate system are acquired by the solving method used for point A, the remaining vertices being B, C and D in clockwise order;
the length of the optical element is L = ||((A + B) - (D + C))/2||2 and the width is W = ||((A + D) - (B + C))/2||2.
9. A method for assembling an optical element based on an orthogonal vision system, characterized in that, during assembly, the pose of the optical element is acquired in real time using the pose detection method for the optical element assembly process based on the orthogonal vision system according to any one of claims 1-8, the deviation between the current pose of the optical element and the ideal pose is calculated, and the manipulator is guided to adjust the pose.
CN202310735104.6A 2023-06-20 2023-06-20 Method for detecting pose of optical element assembly process based on orthogonal vision system and assembly method Active CN116758160B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310735104.6A CN116758160B (en) 2023-06-20 2023-06-20 Method for detecting pose of optical element assembly process based on orthogonal vision system and assembly method

Publications (2)

Publication Number Publication Date
CN116758160A (en) 2023-09-15
CN116758160B (en) 2024-04-26

Family

ID=87958590

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310735104.6A Active CN116758160B (en) 2023-06-20 2023-06-20 Method for detecting pose of optical element assembly process based on orthogonal vision system and assembly method

Country Status (1)

Country Link
CN (1) CN116758160B (en)



Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2018153910A (en) * 2016-12-22 2018-10-04 セイコーエプソン株式会社 Control device, robot, and robot system
CN108470370A (en) * 2018-03-27 2018-08-31 北京建筑大学 The method that three-dimensional laser scanner external camera joint obtains three-dimensional colour point clouds
CN111009014A (en) * 2019-11-25 2020-04-14 天津大学 Calibration Method of Orthogonal Spectroscopic Imaging Pose Sensor with Universal Imaging Model
CN113870358A (en) * 2021-09-17 2021-12-31 聚好看科技股份有限公司 Method and equipment for joint calibration of multiple 3D cameras
CN115953483A (en) * 2022-12-30 2023-04-11 深圳云天励飞技术股份有限公司 Parameter calibration method and device, computer equipment and storage medium

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
安宁; 翟晓彤; 赵华东; 徐博凡: "Application of a vision system in toy car assembly", 机械设计与制造 (Machinery Design & Manufacture), no. 10, 8 October 2020 (2020-10-08), pages 256-260 *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117934630A (en) * 2024-01-11 2024-04-26 哈尔滨工业大学 A method for extrinsic calibration of multi-view visual inspection system
CN118781207A (en) * 2024-09-11 2024-10-15 深圳市方度电子有限公司 Method, device and equipment for determining target based on multiple cameras
CN118781207B (en) * 2024-09-11 2024-12-24 深圳市方度电子有限公司 Target determining method, device and equipment based on multiple cameras

Also Published As

Publication number Publication date
CN116758160B (en) 2024-04-26


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant