CN114963981B - A non-contact measurement method for cylindrical parts docking based on monocular vision - Google Patents

A non-contact measurement method for cylindrical parts docking based on monocular vision

Info

Publication number
CN114963981B
CN114963981B (application CN202210527443.0A)
Authority
CN
China
Prior art keywords
coordinate system
image
camera
hole
edge
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202210527443.0A
Other languages
Chinese (zh)
Other versions
CN114963981A (en)
Inventor
薛善良
郑祖闯
岳松
张明
陈琪玮
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nanjing University of Aeronautics and Astronautics
Original Assignee
Nanjing University of Aeronautics and Astronautics
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nanjing University of Aeronautics and Astronautics filed Critical Nanjing University of Aeronautics and Astronautics
Priority to CN202210527443.0A priority Critical patent/CN114963981B/en
Publication of CN114963981A publication Critical patent/CN114963981A/en
Application granted granted Critical
Publication of CN114963981B publication Critical patent/CN114963981B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 Measuring arrangements characterised by the use of optical techniques
    • G01B11/002 Measuring arrangements characterised by the use of optical techniques for measuring two or more coordinates
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00 Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/30 Computing systems specially adapted for manufacturing

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

A non-contact measurement method for the docking of cylindrical parts based on monocular vision, comprising the following steps: 1. selecting two cameras of the same manufacturer and model, and calibrating and registering the cameras; 2. photographing and measuring the end faces of the fixed part and the docking part; 3. denoising the captured images; 4. extracting the processing region containing the end-face features by threshold segmentation, extracting the hole edges with an edge detection algorithm, determining the positions of the hole centers in the image by screening and ellipse fitting, mapping the hole centers and axes of the two parts into the same coordinate system by coordinate transformation, and calculating the roll angle of the docking part relative to the fixed part. The invention offers a high degree of automation and a high measurement speed. It achieves automatic measurement without placing targets on the part end faces and without manual intervention, improving working efficiency.

Description

A non-contact measurement method for the docking of cylindrical parts based on monocular vision

Technical field

The invention relates to the field of intelligent assembly, in particular to docking technology for large cylindrical parts, and specifically to a non-contact measurement method for the docking of cylindrical parts based on monocular vision. When the coaxiality of the docking part and the fixed part already meets the docking requirement, the method measures the relative rotation angle between the two parts so that they can be docked.

Background art

With the continuous development of science and technology, market competition is becoming increasingly fierce. Fast, efficient and reliable production has become the main direction and characteristic of industrial development today. To achieve these goals, every industry faces the problems of raising production efficiency, improving product quality and reducing production cost. In the production of some large cylindrical parts, high-quality automated docking measurement equipment is particularly important. Fast and accurate measurement of the spatial attitude, centering and positioning of the parts plays an important role in shortening assembly time and improving docking efficiency.

Existing attitude measurement methods fall into two categories: contact measurement and non-contact measurement. Non-contact measurement mainly uses visual measurement or laser scanning measurement. Contact measurement mainly consists of a three-coordinate mechanical mechanism and a probe, and the probe must touch the measured part before a measurement can be taken. To guarantee measurement accuracy and avoid damaging the instrument, the probe must approach the measured part slowly. In addition, because the parts to be inspected have many spatial dimensions, many points must be measured, which makes the measurement time long and seriously affects production efficiency. Laser scanning measurement needs to scan the whole outer shape of the workpiece, generate a point cloud and process a large amount of model data, which also takes a long time. Visual measurement, by contrast, uses a simple measurement system that is easy to move, collects data quickly and conveniently, is easy to operate and has a low measurement cost; it is especially suitable for inspecting three-dimensional point positions, dimensions or the contours of large workpieces. Visual measurement is further divided into monocular vision and binocular vision. Compared with binocular vision, a monocular vision system has a simpler structure, does not require mutual adjustment of the cameras, and is easier to install and use, so the present invention uses monocular vision for measurement. Moreover, this non-contact measurement method avoids damage to the measured object and suits situations where the object cannot be touched, such as high temperature, high pressure, fluids or hazardous environments. A machine vision system can also measure several dimensions at the same time, so the measurement work is completed quickly; measurement of small dimensions is a particular strength of machine vision, since a high-magnification lens can enlarge the measured object so that measurement accuracy reaches the micron level or better.

Summary of the invention

The purpose of the present invention is to address the problems that existing cylindrical-part measurement has a long measurement cycle and requires processing large amounts of data, which easily leads to a long assembly cycle and affects production efficiency, by providing a non-contact measurement method for the docking of cylindrical parts based on monocular vision.

The technical solution of the present invention is as follows:

A non-contact measurement method for the docking of cylindrical parts based on monocular vision, characterized in that it comprises the following steps:

Step 1: Select two cameras of the same manufacturer and model and install them as shown in Figure 1; one camera photographs the end face of the fixed part and the other photographs the end face of the docking part. Then calibrate and register the two cameras as follows:

Step 1.1: Establish the world coordinate system, camera coordinate system, image coordinate system and pixel coordinate system; the positional relationship of the four coordinate systems is shown in Figure 2. Take the camera optical axis as the Z axis and establish the camera coordinate system (Xc, Yc, Zc) according to the right-hand rule; take the center of the photograph as the origin to establish the image coordinate system (x, y); take the first element in the upper-left corner of the photograph as the origin to establish the pixel coordinate system (u, v); and take the camera coordinate system of the first photograph taken during camera calibration as the world coordinate system (Xw, Yw, Zw).
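As an illustrative aid only (not part of the original disclosure), the chain from world coordinates to camera, image and finally pixel coordinates under the standard pinhole model can be sketched as follows; the intrinsic matrix K, rotation R and translation t below are assumed placeholder values.

import numpy as np

# Assumed placeholder intrinsics (focal lengths and principal point in pixels)
# and extrinsics (world -> camera rotation R and translation t, here in mm).
K = np.array([[2500.0,    0.0, 1024.0],
              [   0.0, 2500.0,  768.0],
              [   0.0,    0.0,    1.0]])
R = np.eye(3)
t = np.array([0.0, 0.0, 1500.0])

def world_to_pixel(p_world):
    # World coordinates -> camera coordinates (Xc, Yc, Zc).
    p_cam = R @ p_world + t
    # Perspective division onto the image plane, then the intrinsics give (u, v).
    uvw = K @ (p_cam / p_cam[2])
    return uvw[:2]

print(world_to_pixel(np.array([10.0, -5.0, 0.0])))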

Step 1.2: Use a standard-sized chessboard pattern as the calibration object, photograph it from several directions and feed the photographs into the MATLAB monocular camera calibration model to obtain the transformation matrix from the camera coordinate system to the world coordinate system (the camera extrinsics), the camera distortion model, and the intrinsics including the focal length, the size of a single pixel, the image width and height, and the coordinates of the image center in the image coordinate system; from these, compute the mathematical models of the two cameras.
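The patent performs this step with the MATLAB monocular calibration model; purely as an illustrative sketch of the same checkerboard procedure, an equivalent calibration with OpenCV could look as follows (the pattern size, square size and image folder are assumed values).

import glob
import cv2
import numpy as np

pattern = (9, 6)          # inner corners of the checkerboard (assumed)
square = 25.0             # square size in mm (assumed)
objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2) * square

obj_pts, img_pts, size = [], [], None
for path in glob.glob("calibration_images/*.png"):     # hypothetical folder
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    size = gray.shape[::-1]
    found, corners = cv2.findChessboardCorners(gray, pattern)
    if found:
        corners = cv2.cornerSubPix(
            gray, corners, (11, 11), (-1, -1),
            (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3))
        obj_pts.append(objp)
        img_pts.append(corners)

# K: intrinsic matrix, dist: distortion model, rvecs/tvecs: per-view extrinsics.
rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(obj_pts, img_pts, size, None, None)
print("reprojection error:", rms)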

Step 1.3: Place the two cameras symmetrically on the registration frame and, according to the distance between each camera and the part end face, set the angle between the camera axis and the photographed end face to 45 degrees. Separate the parts that were fully docked and mated so that the end faces of the docking part and the fixed part are at equal distances from the midpoint of the frame, and then register and calibrate the cameras. The structure of the system model is shown in Figure 1.

Step 2: Photograph and measure the end faces of the fixed part and the docking part.

Step 3: Using the relevant constraints, preprocess the measurement images obtained in Step 2 as follows:

Step 3.1: Reproject the measurement images. Using the camera intrinsics and extrinsics obtained in Step 1, convert the images taken obliquely to the end face into images as if taken perpendicular to the end face.
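A minimal sketch of such a reprojection, assuming four reference points on the end-face plane whose positions are known both in the oblique photograph and in the desired fronto-parallel view; in practice these correspondences would come from the calibrated camera pose rather than the made-up example values and file names below.

import cv2
import numpy as np

# Pixel positions of four end-face reference points in the oblique photo (assumed)
src = np.float32([[412, 288], [1620, 301], [1598, 1390], [430, 1375]])
# Their positions in the desired perpendicular (fronto-parallel) view (assumed)
dst = np.float32([[0, 0], [1200, 0], [1200, 1200], [0, 1200]])

H, _ = cv2.findHomography(src, dst)                    # plane-to-plane homography
oblique = cv2.imread("end_face_oblique.png")           # hypothetical file name
frontal = cv2.warpPerspective(oblique, H, (1200, 1200))
cv2.imwrite("end_face_frontal.png", frontal)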

Step 3.2: Apply a logarithmic transform to the captured images, mapping the narrow range of low gray values in the source image to a wider gray range while mapping the wide range of high gray values to a narrower range. This expands the values of dark pixels, compresses the high gray values and enhances the low-gray detail in the image.
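A sketch of this logarithmic transform (the scale constant is chosen here so that the output stays within the 8-bit range; the input file name is assumed):

import cv2
import numpy as np

img = cv2.imread("end_face_frontal.png", cv2.IMREAD_GRAYSCALE).astype(np.float64)
c = 255.0 / np.log(1.0 + img.max())      # scale constant of the s = c*log(1 + r) mapping
log_img = np.uint8(c * np.log(1.0 + img))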

Step 3.3: Replace the value of each point in the image with the median of the values in a neighborhood of that point, removing salt-and-pepper noise and filling points left without a gray value after reprojection.
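The median replacement described here corresponds to a standard median filter; a sketch continuing from the previous snippet, with an assumed 5x5 neighborhood:

import cv2

# Each pixel is replaced by the median of its 5x5 neighborhood, removing
# salt-and-pepper noise and isolated empty pixels left by the reprojection.
denoised = cv2.medianBlur(log_img, 5)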

Step 4: Perform threshold segmentation and edge detection on the preprocessed images and extract the positions of the hole centers and the part axes as follows:

Step 4.1: According to the end-face docking features, set the gray threshold to 50 and use threshold segmentation to screen and extract independent connected regions. Among the selected regions, further screen out the regions containing the holes by their roundness and area. Erode and dilate each such region separately and intersect the results to obtain the region containing the hole edge.
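An illustrative sketch of this screening, continuing from the preprocessing snippets above (the area and roundness limits are assumed values). The band around the selected region is obtained below as its dilation minus its erosion (the morphological gradient), one way of realizing the region containing the hole edge that the step describes.

import cv2
import numpy as np

_, binary = cv2.threshold(denoised, 50, 255, cv2.THRESH_BINARY)   # gray threshold 50

n, labels, stats, _ = cv2.connectedComponentsWithStats(binary)
hole_mask = np.zeros_like(binary)
for i in range(1, n):                                   # label 0 is the background
    area = stats[i, cv2.CC_STAT_AREA]
    comp = np.uint8(labels == i) * 255
    contours, _ = cv2.findContours(comp, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    perimeter = cv2.arcLength(contours[0], True)
    roundness = 4.0 * np.pi * area / (perimeter ** 2 + 1e-9)
    if 500 < area < 50000 and roundness > 0.8:          # assumed screening limits
        hole_mask = cv2.bitwise_or(hole_mask, comp)

kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (9, 9))
edge_band = cv2.morphologyEx(hole_mask, cv2.MORPH_GRADIENT, kernel)  # dilation - erosion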

Step 4.2: Intersect the region obtained in the previous step with the original image to select the sub-image containing the hole edge from the original image, further reducing the area of the image to be processed.

Step 4.3: Extract the hole edges with an edge detection algorithm, screen them by shape and fit ellipses to the screened results to determine the positions of the hole centers and the part axis in the image; then, through the camera model, compute the positions of the measured hole centers and part axis in the real coordinate system of the corresponding camera.
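A sketch of the edge extraction and ellipse fitting, continuing from the previous snippets (the Canny thresholds, minimum contour length and axis-ratio filter are assumed values; cv2.fitEllipse performs a least-squares ellipse fit):

import cv2
import numpy as np

roi = cv2.bitwise_and(denoised, denoised, mask=edge_band)   # restrict to the edge band
edges = cv2.Canny(roi, 40, 120)

hole_centers = []
contours, _ = cv2.findContours(edges, cv2.RETR_LIST, cv2.CHAIN_APPROX_NONE)
for c in contours:
    if len(c) >= 20:                                  # enough points for a stable fit
        (cx, cy), (w, h), angle = cv2.fitEllipse(c)
        if min(w, h) / max(w, h) > 0.5:               # shape screening: reject thin arcs
            hole_centers.append((cx, cy))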

Step 4.4: Map the hole centers and axes of the fixed part and the docking part into the same world coordinate system and calculate the deflection angle that needs to be adjusted.

The beneficial effects of the present invention are:

The invention avoids damage to the measured object and suits situations where the object cannot be touched, such as high temperature, high pressure, fluids or hazardous environments. The machine vision system can measure several dimensions at the same time, so the measurement work is completed quickly; measurement of small dimensions is a particular strength of machine vision, since a high-magnification lens can enlarge the measured object so that measurement accuracy reaches the micron level or better.

The invention provides a simple, automated method for measuring part attitude; the measured quantity is the relative rotation angle between the parts. The invention offers a high degree of automation and a high measurement speed. It achieves automatic measurement without placing targets on the part end faces and without manual intervention, improving working efficiency.

Brief description of the drawings

Figure 1 is a structural diagram of the system model of the present invention.

Figure 2 is a schematic diagram of the positional relationship of the four coordinate systems used in the present invention.

Figure 3 is a schematic structural view of the non-contact measuring device used in the present invention.

Figure 4 is a flow chart of the non-contact measurement process of the present invention.

Detailed description of the embodiments

To make the technical solution and implementation steps of the present invention clearer, the invention is further described below with reference to the accompanying drawings and embodiments.

As shown in Figures 1 to 4.

A non-contact measurement method for the docking of cylindrical parts based on monocular vision, used to extract the positioning holes and the part axes on the docking part and the fixed part and thereby measure the relative rotation angle between the two parts.

The non-contact measurement method for part docking provided by the present invention uses the measurement system shown in Figure 3, which comprises a non-contact measuring device, communication module 1, communication module 2 and an industrial control computer system.

Figure 4 is a flow chart of the non-contact measurement process of the present invention, which comprises the following steps:

Step 1: Select two cameras of the same manufacturer and model and install them as shown in Figure 1; camera A photographs the end face of the fixed part and camera B photographs the end face of the docking part. Then calibrate and register the two cameras. The structure of the system model of the present invention is shown in Figure 1; the specific steps are as follows:

Step 1.1: Establish the world coordinate system, camera coordinate system, image coordinate system and pixel coordinate system; the positional relationship of the four coordinate systems is shown in Figure 2. Take the camera optical axis as the Z axis and establish the camera coordinate system (Xc, Yc, Zc) according to the right-hand rule; take the center of the photograph as the origin to establish the image coordinate system (x, y); take the first element in the upper-left corner of the photograph as the origin to establish the pixel coordinate system (u, v); and take the camera coordinate system of the first photograph taken during camera calibration as the world coordinate system (Xw, Yw, Zw).

Step 1.2: Use a standard-sized chessboard pattern as the calibration object, photograph it from several directions and feed the photographs into the MATLAB monocular camera calibration model to obtain the camera distortion model, the transformation matrix from the camera coordinate system to the world coordinate system, and the intrinsics including the focal length, the size of a single pixel, the image width and height, and the coordinates of the image center in the image coordinate system; from these, compute the mathematical models of the two cameras.

Step 1.3: Place the two cameras symmetrically on the frame and, according to the distance between each camera and the part end face, set the angle between the camera axis and the photographed end face to 45 degrees. Separate the parts that were fully docked and mated so that the end faces of the docking part and the fixed part are at equal distances from the midpoint of the frame. Photograph the positioning holes on the end faces and extract the hole-center coordinate pairs on the fixed end face and the docking end face, Pak(xk, yk) and Pbk(xk, yk), k = 1, 2, 3, ..., N, obtaining two different sets of registration point pairs. Taking the camera of the fixed part as the reference camera, assume that the two point sets can be registered through a transformation matrix T, i.e. Pbk = T·Pak in homogeneous coordinates, and from this compute the transfer matrix T between the world coordinate systems of the fixed part and the docking part in physical space.
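As an illustration only (the patent's own formula is the relation referred to above), a least-squares rotation-plus-translation between the two hole-center point sets can be estimated as follows; the coordinate values are made-up example data.

import cv2
import numpy as np

# Hole-center coordinates Pa_k and Pb_k extracted from the two cameras (example values).
Pa = np.float32([[102.3,  41.7], [ 58.9, 120.4], [143.6, 133.0], [ 99.8, 180.2]])
Pb = np.float32([[100.9,  43.1], [ 57.2, 121.8], [142.0, 134.6], [ 98.1, 181.5]])

# 2x3 transfer matrix [R | t] (with uniform scale) mapping the fixed-part frame
# onto the docking-part frame in the least-squares sense.
T, inliers = cv2.estimateAffinePartial2D(Pa, Pb)
print(T)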

Step 2: Photograph and measure the end faces of the fixed part and the docking part.

Step 3: Using the relevant constraints, preprocess the measurement images obtained in Step 2 as follows:

Step 3.1: Reproject the measurement images. Using the camera intrinsics and extrinsics obtained in Step 1, convert the images taken obliquely to the end face into images as if taken perpendicular to the end face.

Step 3.2: Apply a logarithmic transform s = c·log(1 + r), where c is a constant, to the captured images, mapping the narrow range of low gray values in the source image to a wider gray range while mapping the wide range of high gray values to a narrower range. This expands the values of dark pixels, compresses the high gray values and enhances the low-gray detail in the image.

Step 3.3: Replace the value of each point in the image with the median of the values in a neighborhood of that point, removing salt-and-pepper noise and filling points left without a gray value after reprojection.

Step 4: Perform threshold segmentation and edge detection on the preprocessed images and extract the positions of the hole centers and the part axes as follows:

Step 4.1: Set a gray threshold suited to the current conditions and use threshold segmentation to extract independent connected regions. Among the selected regions, further screen out the regions containing the holes by their roundness and area. Erode and dilate each such region separately and intersect the results to obtain the region containing the hole edge. Intersect the obtained region with the original image to select the sub-image containing the hole edge from the original image, further reducing the area of the image to be processed.

Step 4.2: Apply Gaussian filtering to the result of the previous step to smooth the image. For a pixel at position (m, n), its gray value (only the binary image is considered here) is f(m, n); after Gaussian filtering the gray value becomes the weighted sum over the neighborhood, g(m, n) = Σi Σj w(i, j)·f(m + i, n + j), where w is the Gaussian kernel. Then compute the edge gradient value and gradient direction of the filtered image; an edge is the set of pixels where the gray value changes sharply. In the image, the gradient expresses the degree and direction of the change in gray value; the gradient value and gradient direction are computed as G = sqrt(Gx² + Gy²) and θ = arctan(Gy / Gx), where Gx and Gy are the gray-value differences in the horizontal and vertical directions.

During Gaussian filtering, edges may be broadened. Therefore, rules are set to filter out points that are not edges so that the edge width is kept to one pixel as far as possible: if a pixel belongs to an edge, its gradient value along the gradient direction is a local maximum; otherwise it is not an edge and its gray value is set to 0. Upper and lower thresholds are then used to detect edges: pixels above the upper threshold are classified as edges and pixels below the lower threshold as non-edges; a pixel between the two thresholds is classified as an edge if it is adjacent to a pixel already determined to be an edge, and otherwise as a non-edge.
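A sketch of these edge-detection stages on a grayscale image (the kernel size, sigma, thresholds and file name are assumed values); the explicit gradient computation mirrors the magnitude and direction formulas above, and cv2.Canny bundles the remaining non-maximum suppression and double-threshold hysteresis stages.

import cv2
import numpy as np

img = cv2.imread("end_face_frontal.png", cv2.IMREAD_GRAYSCALE)   # hypothetical file
smoothed = cv2.GaussianBlur(img, (5, 5), 1.4)                    # Gaussian filtering
gx = cv2.Sobel(smoothed, cv2.CV_64F, 1, 0, ksize=3)              # horizontal gradient Gx
gy = cv2.Sobel(smoothed, cv2.CV_64F, 0, 1, ksize=3)              # vertical gradient Gy
magnitude = np.hypot(gx, gy)                                     # G = sqrt(Gx^2 + Gy^2)
direction = np.arctan2(gy, gx)                                   # theta = arctan(Gy / Gx)
# Non-maximum suppression thins edges to one pixel; the double threshold with
# hysteresis keeps strong edges plus weak edges connected to them:
edges = cv2.Canny(smoothed, 40, 120)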

Step 4.3: Screen the results by shape and fit ellipses using the least-squares method to determine the positions of the hole centers and the part axis in the image; then, through the camera model, compute the positions of the measured hole centers and part axis in the real coordinate system of the corresponding camera.

Step 4.4: Map the hole centers and axes of the fixed part and the docking part into the same world coordinate system. Let the axis coordinates be Hab0 = (x0, y0) and the hole-center coordinate sets be Ha = (xm, ym) and Hb = (xm, ym), m = 1, 2, 3, ..., N. From the straight lines determined, in the same coordinate system, by the axis and the hole centers of the two parts, calculate by trigonometric functions the minimum deflection angle α that needs to be adjusted.
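The patent's trigonometric formula itself is the one referred to above; purely as an illustration of the idea, the signed angle between the axis-to-hole-center lines of the two parts, and the smallest such correction, could be computed as follows with made-up coordinates in the shared coordinate system.

import numpy as np

axis = np.array([0.0, 0.0])                               # common axis point Hab0
holes_a = np.array([[ 80.0,   0.0], [  0.0,  80.0],
                    [-80.0,   0.0], [  0.0, -80.0]])      # fixed-part hole centers
holes_b = np.array([[ 77.3,  20.7], [-20.7,  77.3],
                    [-77.3, -20.7], [ 20.7, -77.3]])      # docking-part hole centers

def line_angle(p):
    # Orientation of the line from the axis point to a hole center.
    return np.arctan2(p[1] - axis[1], p[0] - axis[0])

# Signed difference between every pair of lines, wrapped to (-pi, pi];
# the smallest magnitude is the minimum deflection angle alpha to adjust.
diffs = [np.arctan2(np.sin(line_angle(b) - line_angle(a)),
                    np.cos(line_angle(b) - line_angle(a)))
         for a in holes_a for b in holes_b]
alpha = min(diffs, key=abs)
print(np.degrees(alpha))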

Parts of the present invention that are not described are the same as the prior art or can be implemented using the prior art.

Claims (2)

1. A non-contact measurement method for the docking of cylindrical parts based on monocular vision, characterized in that it comprises the following steps:

Step 1. Select two cameras of the same manufacturer and model, and calibrate and register the cameras; the steps of calibrating and registering the cameras are as follows:

Step 1.1: Establish four coordinate systems: the world coordinate system, the camera coordinate system, the image coordinate system and the pixel coordinate system. Take the camera optical axis as the Z axis and establish the camera coordinate system (Xc, Yc, Zc) according to the right-hand rule; take the center of the photograph as the origin to establish the image coordinate system (x, y); take the first element in the upper-left corner of the photograph as the origin to establish the pixel coordinate system (u, v); and take the camera coordinate system of the first photograph taken during camera calibration as the world coordinate system (Xw, Yw, Zw);

Step 1.2: Use a standard-sized chessboard pattern as the calibration object, photograph it from several directions and feed the photographs into the MATLAB monocular camera calibration model to obtain the transformation matrix from the camera coordinate system to the world coordinate system (the camera extrinsics), the camera distortion model, and the intrinsics including the focal length, the size of a single pixel, the image width and height, and the coordinates of the image center in the image coordinate system; from these, compute the mathematical models of the two cameras;

Step 1.3: Place the two cameras symmetrically on the registration frame and, according to the distance between each camera and the part end face, set the angle between the camera axis and the photographed end face to 45 degrees; separate the parts that were fully docked and mated so that the end faces of the docking part and the fixed part are at equal distances from the midpoint of the frame, and register and calibrate the cameras;

Step 2. Photograph and measure the end faces of the fixed part and the docking part, and preprocess the obtained measurement images; the specific steps are as follows:

Step 2.1: Reproject the measurement images: using the camera intrinsics and extrinsics obtained in Step 1, convert the images taken obliquely to the end face into images as if taken perpendicular to the end face;

Step 2.2: Apply a logarithmic transform to the captured images, mapping the narrow range of low gray values in the source image to a wider gray range while mapping the wide range of high gray values to a narrower range, thereby expanding the values of dark pixels, compressing the high gray values and enhancing the low-gray detail in the image;

Step 2.3: Replace the value of each point in the image with the median of the values in a neighborhood of that point, removing salt-and-pepper noise and filling points left without a gray value after reprojection;

Step 3. Denoise the captured images;

Step 4. Extract the processing region containing the end-face features by threshold segmentation, extract the hole edges with an edge detection algorithm, determine the positions of the hole centers in the image by screening and ellipse fitting, map the hole centers and axes of the two parts into the same coordinate system by coordinate transformation, and calculate the roll angle of the docking part relative to the fixed part;

perform threshold segmentation and edge detection on the preprocessed images and extract the positions of the hole centers and the part axes; the specific steps are as follows:

Step 3.1: According to the end-face docking features, set the gray threshold to 50 and use threshold segmentation to screen and extract independent connected regions; among the selected regions, further screen out the regions containing the holes by their roundness and area; erode and dilate each such region separately and intersect the results to obtain the region containing the hole edge;

Step 3.2: Intersect the region obtained in the previous step with the original image to select the sub-image containing the hole edge from the original image, further reducing the area of the image to be processed;

Step 3.3: Extract the hole edges with an edge detection algorithm, screen them by shape and fit ellipses to the screened results to determine the positions of the hole centers and the part axis in the image; then, through the camera model, compute the positions of the measured hole centers and part axis in the real coordinate system of the corresponding camera;

Step 3.4: Map the hole centers and axes of the fixed part and the docking part into the same world coordinate system and calculate the deflection angle that needs to be adjusted;

Gaussian filtering is used to smooth the image: for a pixel at position (m, n) its gray value is f(m, n), and after Gaussian filtering the gray value becomes the weighted sum over the neighborhood; the edge gradient value and gradient direction of the filtered image are then computed, an edge being the set of pixels where the gray value changes sharply; in the image, the gradient expresses the degree and direction of the change in gray value, the gradient value and gradient direction being computed as G = sqrt(Gx² + Gy²) and θ = arctan(Gy / Gx);

during Gaussian filtering, rules are set to filter out points that are not edges so that the edge width is kept to one pixel as far as possible: if a pixel belongs to an edge, its gradient value along the gradient direction is a local maximum, otherwise it is not an edge and its gray value is set to 0; upper and lower thresholds are used to detect edges, pixels above the upper threshold being classified as edges and pixels below the lower threshold as non-edges; a pixel between the two thresholds is classified as an edge if it is adjacent to a pixel already determined to be an edge, and otherwise as a non-edge.

2. The method according to claim 1, characterized in that in Step 3.4 the hole centers and axes of the fixed part and the docking part are mapped into the same world coordinate system, the axis coordinates being Hab0 = (x0, y0) and the hole-center coordinate sets being Ha = (xm, ym) and Hb = (xm, ym), m = 1, 2, 3, ..., N; from the straight lines determined, in the same coordinate system, by the axis and the hole centers of the two parts, the minimum deflection angle α that needs to be adjusted is calculated by trigonometric functions.
CN202210527443.0A 2022-05-16 2022-05-16 A non-contact measurement method for cylindrical parts docking based on monocular vision Active CN114963981B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210527443.0A CN114963981B (en) 2022-05-16 2022-05-16 A non-contact measurement method for cylindrical parts docking based on monocular vision

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210527443.0A CN114963981B (en) 2022-05-16 2022-05-16 A non-contact measurement method for cylindrical parts docking based on monocular vision

Publications (2)

Publication Number Publication Date
CN114963981A CN114963981A (en) 2022-08-30
CN114963981B true CN114963981B (en) 2023-08-15

Family

ID=82970874

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210527443.0A Active CN114963981B (en) 2022-05-16 2022-05-16 A non-contact measurement method for cylindrical parts docking based on monocular vision

Country Status (1)

Country Link
CN (1) CN114963981B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116140987A (en) * 2023-04-17 2023-05-23 广东施泰德测控与自动化设备有限公司 Visual quick docking device and docking method for axle test board
CN118840369B (en) * 2024-09-20 2025-01-21 东海实验室 A method, system and storage medium for detecting double-axis docking deviation

Citations (7)

Publication number Priority date Publication date Assignee Title
CN108562274A (en) * 2018-04-20 2018-09-21 南京邮电大学 A kind of noncooperative target pose measuring method based on marker
CN109190628A (en) * 2018-08-15 2019-01-11 东北大学 A kind of plate camber detection method based on machine vision
CN110146038A (en) * 2019-06-08 2019-08-20 西安电子科技大学 Distributed Monocular Camera Laser Measuring Device and Method for Assembly Rotation Angle of Cylindrical Parts
CN112362034A (en) * 2020-11-11 2021-02-12 上海电器科学研究所(集团)有限公司 Solid engine multi-cylinder section butt joint guiding measurement algorithm based on binocular vision
CN112686920A (en) * 2020-12-31 2021-04-20 天津理工大学 Visual measurement method and system for geometric dimension parameters of circular part
CN113295171A (en) * 2021-05-19 2021-08-24 北京航空航天大学 Monocular vision-based attitude estimation method for rotating rigid body spacecraft
WO2021208231A1 (en) * 2020-04-15 2021-10-21 上海工程技术大学 Gap measuring system and measuring method

Family Cites Families (1)

Publication number Priority date Publication date Assignee Title
CN111243032B (en) * 2020-01-10 2023-05-12 大连理工大学 A fully automatic checkerboard corner detection method

Non-Patent Citations (1)

Title
Design of a high-precision measurement and assembly system based on machine vision; Jiao Liang et al.; Computer Measurement & Control; 2016-07-25 (No. 07); full text *

Also Published As

Publication number Publication date
CN114963981A (en) 2022-08-30


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant