WO2020114134A1 - A visual processing method for identifying emery particles - Google Patents

A visual processing method for identifying emery particles

Info

Publication number
WO2020114134A1
WO2020114134A1, PCT/CN2019/112854, CN2019112854W
Authority
WO
WIPO (PCT)
Prior art keywords
image
projection transformation
emery
projection
coordinate
Prior art date
Application number
PCT/CN2019/112854
Other languages
English (en)
French (fr)
Inventor
尹章芹
张冶
周奇
王杰高
Original Assignee
南京埃斯顿机器人工程有限公司 (Nanjing Estun Robotics Engineering Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 南京埃斯顿机器人工程有限公司 (Nanjing Estun Robotics Engineering Co., Ltd.)
Publication of WO2020114134A1

Classifications

    • G06T7/0004: Image analysis; inspection of images, e.g. flaw detection; industrial image inspection
    • G06T7/70: Image analysis; determining position or orientation of objects or cameras
    • G06T2207/10004: Image acquisition modality; still image; photographic image
    • G06T2207/20056: Special algorithmic details; transform-domain processing; discrete and fast Fourier transform (DFT, FFT)
    • G06T2207/30108: Subject of image; industrial image inspection

Definitions

  • the invention relates to a visual processing method for identifying emery particles, in particular to an image processing method for extracting emery particles on the surface of an emery wire based on computer vision, which belongs to the field of machine vision applications.
  • Emery wire is a common cutting or grinding tool made by electroplating emery particles onto the surface of a thin steel wire. It is mainly used in industries with high material and precision requirements, such as slicing silicon raw material in the photovoltaic industry. The quantity and density of the emery particles on the wire surface are an important indicator of emery-wire quality, and industry commonly judges the quality of the produced emery wire by this indicator.
  • traditional emery-particle inspection methods include manual observation and counting under a microscope, and chemical or physical (particle-separation) methods that isolate and count the particles per unit length.
  • the microscope observation method requires manual sampling, observation and counting and is inefficient; the particle-separation method is cumbersome and damages the emery wire itself.
  • although existing visual detection methods have some effect on emery-particle extraction, they rely on basic morphological operations during image processing, which significantly impairs image precision and limits accuracy.
  • moreover, since emery particles tend to adhere to one another, existing visual inspection methods mainly count particles by extracting characteristic concave points of the target region; this approach is easily affected by light intensity, and when the concave points are not distinct it is difficult to extract the emery particles accurately.
  • the purpose of the present invention is to overcome the shortcomings of the prior art and provide a visual processing method for identifying emery particles: for common emery wire, a frequency-domain analysis method is used to find abrupt points and, while preserving image precision, the peripheral region of the target is extracted and uniformly positioned by projection, achieving fast and accurate extraction of the emery particles on the surface of the emery wire.
  • the visual processing method for identifying emery particles of the present invention includes the following steps:
  • Step 1 Design a frequency-domain Gaussian filter: using the difference principle, create two frequency-domain Gaussian filters with standard deviations c1 and c2 according to the size of the acquired image, and subtract them to obtain a difference filter.
  • the values of the standard deviations c1 and c2 of the two filters are determined by analyzing the frequency band occupied by the target region in the frequency-domain map.
  • in the acquired image the height of the target region is 1/n of the image height; the width and height of the frequency-domain Gaussian filter are set to 1/n of the width and height of the entire image, respectively, c1 equals the image width, and c2 equals n, where
  • n = W/W1
  • W is the camera's field of view and W1 is the emery wire width, both in millimeters.
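Step 1 can be sketched as follows. This is a minimal numpy sketch (the function name and the centered layout are my own choices, not from the patent): two frequency-domain Gaussians with standard deviations c1 and c2 are built on the same grid and subtracted, giving a band-pass difference filter. The embodiment's dimensions (640x120 filter, c1 = 640, c2 = 4) are used for illustration.

```python
import numpy as np

def difference_of_gaussians_filter(width, height, c1, c2):
    """Step 1 sketch: build two centered frequency-domain Gaussian filters
    with standard deviations c1 and c2 and subtract them, yielding a
    band-pass difference filter."""
    y, x = np.mgrid[0:height, 0:width]
    cy, cx = (height - 1) / 2.0, (width - 1) / 2.0  # filter center (DC term)
    d2 = (x - cx) ** 2 + (y - cy) ** 2              # squared distance to center
    g1 = np.exp(-d2 / (2.0 * c1 ** 2))              # broad Gaussian, std c1
    g2 = np.exp(-d2 / (2.0 * c2 ** 2))              # narrow Gaussian, std c2
    return g1 - g2

# Embodiment values: 640x480 image, target occupies about 1/4 of the height,
# so the filter is 640 wide and 120 high with c1 = 640 and c2 = 4.
dog = difference_of_gaussians_filter(640, 120, 640.0, 4.0)
```

Near the center (low frequencies) the two Gaussians nearly cancel, so the DC component is suppressed while a band of higher frequencies passes.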
  • Step 2 Perform mean filtering and binarization: apply mean filtering to the original image and binarize it with the maximum between-class variance (Otsu) method to obtain an image of the peripheral region of the target where the emery wire lies.
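The two operations of Step 2, mean filtering and maximum between-class variance (Otsu) binarization, can be sketched in plain numpy; the function names are illustrative, and a production pipeline would normally call library routines instead.

```python
import numpy as np

def mean_filter(img, k=5):
    """Simple k x k mean filter using edge padding (Step 2, first half)."""
    pad = k // 2
    p = np.pad(img.astype(float), pad, mode="edge")
    out = np.zeros(img.shape, dtype=float)
    for dy in range(k):                    # accumulate the k*k shifted copies
        for dx in range(k):
            out += p[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (k * k)

def otsu_threshold(img):
    """Maximum between-class variance (Otsu) threshold of an 8-bit image."""
    hist = np.bincount(img.ravel().astype(np.uint8), minlength=256).astype(float)
    prob = hist / hist.sum()
    best_t, best_var = 0, -1.0
    for t in range(1, 256):
        w0, w1 = prob[:t].sum(), prob[t:].sum()   # class probabilities
        if w0 == 0 or w1 == 0:
            continue
        mu0 = (np.arange(t) * prob[:t]).sum() / w0
        mu1 = (np.arange(t, 256) * prob[t:]).sum() / w1
        var = w0 * w1 * (mu0 - mu1) ** 2          # between-class variance
        if var > best_var:
            best_var, best_t = var, t
    return best_t
```

Thresholding the mean-filtered image at `otsu_threshold(...)` yields the binary image of the target's peripheral region.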
  • Step 3 Perform projection processing on the image: from the target peripheral-region image obtained in step 2, compute the image projection transformation matrix and apply the projection transformation to the captured original image, transforming the target peripheral region to a fixed image position to obtain a target peripheral-region image of uniform size and position; this specifically includes the following steps:
  • Step 301 Calculation of the projection matrix:
  • assume the row- and column-coordinate vectors of the four vertices of the image before projection transformation, in top-left, top-right, bottom-right, bottom-left order, are:
  • Px = (px1, px2, px3, px4)^T, Py = (py1, py2, py3, py4)^T   (1)
  • where px1...px4 and py1...py4 are the row and column coordinates of the four vertices (hereinafter "vertices"), and Px, Py are the vertex row- and column-coordinate vectors.
  • assume the corresponding coordinates after projection transformation are:
  • Qx = (qx1, qx2, qx3, qx4)^T, Qy = (qy1, qy2, qy3, qy4)^T   (2)
  • where qx1...qx4 and qy1...qy4 are the vertex row and column coordinates after projection transformation, and Qx, Qy are the corresponding coordinate vectors.
  • the projection transformation matrix is then:
  • MatH = (Qx, Qy, 1, 1)·(Px, Py, 1, 1)^(-1)   (3)
  • and the inverse projection transformation matrix:
  • MatH^(-1) = {(Qx, Qy, 1, 1)·(Px, Py, 1, 1)^(-1)}^(-1)   (4)
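The patent writes the projection matrix compactly as MatH = (Qx,Qy,1,1)·(Px,Py,1,1)^(-1); taken literally, a 4x4 stack with two identical rows of ones is not invertible, so the sketch below uses the standard equivalent construction instead: solving the 3x3 projective transform from the four vertex correspondences (direct linear transform). Function names are mine, and this is a sketch of the conventional technique, not the patent's exact formula.

```python
import numpy as np

def homography_from_points(src, dst):
    """Solve the 3x3 projective transform mapping four source vertices to
    four destination vertices (the role of MatH in Step 301).
    src and dst are lists of four (row, col) points."""
    A, b = [], []
    for (x, y), (u, v) in zip(src, dst):
        # each correspondence contributes two linear equations in the
        # eight unknown entries of H (H[2,2] is fixed to 1)
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y]); b.append(u)
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y]); b.append(v)
    h = np.linalg.solve(np.array(A, float), np.array(b, float))
    return np.append(h, 1.0).reshape(3, 3)

def project(H, x, y):
    """Apply H to one point (the role of equation (5))."""
    u, v, w = H @ np.array([x, y, 1.0])
    return u / w, v / w
```

With the embodiment's vertices (0,0), (0,640), (504,640), (504,0) mapped to a fixed rectangle, `project` carries every source pixel to its fixed position, and the inverse matrix (equation (4)) is simply `np.linalg.inv(H)`.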
  • Step 302 Projective transformation of the image: determine the pixel value f(x, y) at the transformed image coordinate point (x, y) by the pixel-weighted interpolation method.
  • the specific steps are:
  • Step 3021 Traverse in sequence the coordinate point (x0, y0) of each pixel in the original image before projection transformation, and determine the transformed coordinate point (x, y) from the projection transformation matrix:
  • (x, y, 1, 1) = MatH·(x0, y0, 1, 1)   (5)
  • which determines the coordinate point (x, y) after projection transformation.
  • Step 3022 Calculate the weighted interpolation of pixels, the specific steps are:
  • Step 30221 Using the two-point form of the line equation, compute the analytical expressions f1 and f2 of the two diagonals of the four pixels surrounding the coordinate point (x0, y0) before projection transformation:
  • f1: y = y1 + ((y3 - y1)/(x3 - x1))·(x - x1)   (6)
  • f2: y = y2 + ((y4 - y2)/(x4 - x2))·(x - x2)   (7)
  • where x1...x4 and y1...y4 are the row and column coordinates of the vertices surrounding (x0, y0) before projection transformation.
  • Step 30222 Using the point-slope form of the line equation, compute the analytical expressions g1 and g2 of the two lines through (x0, y0) that are perpendicular to the two diagonals:
  • g1: y = y0 - ((x3 - x1)/(y3 - y1))·(x - x0)   (8)
  • g2: y = y0 - ((x4 - x2)/(y4 - y2))·(x - x0)   (9)
  • Step 30223 Solve the simultaneous equations of f1 with g1 and of f2 with g2 to obtain the intersection (projection) points j1 and j2;
  • Step 30224 Project the Euclidean distances between the coordinate point (x0, y0) before the projection transformation and the four surrounding pixels onto the corresponding diagonals;
  • Step 30225 Compute the pixel value at the transformed coordinate point (x, y) as a weighted combination of the surrounding pixel values (equation (10), rendered as an image in the original):
  • the four pixel values around (x0, y0), in top-left, top-right, bottom-right, bottom-left order, are T1(x1, y1), T2(x2, y2), T3(x3, y3), T4(x4, y4); u and v are the projected distances, obtained in step 30224, from (x0, y0) to T1 and T2 along the diagonals.
  • the diagonal length L in equation (10) is defined as:
  • L = sqrt((x_i - x_(i-2))^2 + (y_i - y_(i-2))^2)   (11)
  • where i takes the value 3 or 4.
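Steps 30221-30225 can be sketched as below. Equation (10) is only an image in the source, so the final combination of the two diagonal interpolations (an equal-weight average here) is an assumption; the rest follows the described geometry: drop a perpendicular from the point onto each diagonal, measure the projected distance along that diagonal, and interpolate linearly between the diagonal's endpoint values.

```python
import numpy as np

def foot_of_perpendicular(p, a, b):
    """Project point p onto the line through a and b (steps 30223-30224);
    returns the foot and the signed distance from a to the foot."""
    p, a, b = (np.asarray(v, float) for v in (p, a, b))
    d = (b - a) / np.linalg.norm(b - a)   # unit direction of the diagonal
    t = np.dot(p - a, d)                  # projected distance from a
    return a + t * d, t

def weighted_interpolate(p, corners, values):
    """Diagonal-projection interpolation of steps 30221-30225.
    corners are T1..T4 in top-left, top-right, bottom-right, bottom-left
    order; the equal weighting of the two diagonal interpolations is an
    assumption (equation (10) is an image in the source)."""
    T1, T2, T3, T4 = [np.asarray(c, float) for c in corners]
    _, u = foot_of_perpendicular(p, T1, T3)   # projection onto diagonal T1-T3
    _, v = foot_of_perpendicular(p, T2, T4)   # projection onto diagonal T2-T4
    L3 = np.linalg.norm(T3 - T1)              # diagonal length, i = 3
    L4 = np.linalg.norm(T4 - T2)              # diagonal length, i = 4
    f13 = (1 - u / L3) * values[0] + (u / L3) * values[2]
    f24 = (1 - v / L4) * values[1] + (v / L4) * values[3]
    return 0.5 * (f13 + f24)                  # assumed equal-weight combination
```

For the center of a unit pixel square this reduces to the plain average of the four corner values, as one would expect of any symmetric interpolation scheme.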
  • Step 4 Perform the fast Fourier transform on the image: apply the fast Fourier transform to the projection-transformed image to obtain the spectrogram in the complex domain.
  • Step 5 Perform the convolution calculation on the image: use the frequency-domain difference filter created in Step 1 to convolve the spectrogram, enhancing the features, and obtain the spectrum image of the target region where the emery lies.
  • Step 6 Apply the inverse Fourier transform to the target-region spectrum image to obtain the real-valued image of the target region.
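Steps 4-6 form a standard frequency-domain filtering pipeline: forward FFT, pointwise multiplication with the filter (convolution in the spatial domain), inverse FFT. A minimal numpy sketch, assuming the Step 1 filter is laid out with its DC term at the center (the function name is mine):

```python
import numpy as np

def frequency_filter(img, filt):
    """Steps 4-6 sketch: FFT the image, multiply by a frequency-domain
    filter, and inverse-FFT back to a real-valued image.  filt is assumed
    centered, so it is shifted to match np.fft's corner-DC layout."""
    spec = np.fft.fft2(img)             # Step 4: complex spectrogram
    spec *= np.fft.ifftshift(filt)      # Step 5: pointwise product = convolution
    return np.real(np.fft.ifft2(spec))  # Step 6: back to the real domain
```

With the difference filter of Step 1 as `filt`, the background (DC and very low frequencies) is suppressed and the particle-scale detail is kept.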
  • Step 7 Apply spatial-domain Gaussian filtering to the real-valued target-region image to obtain a noise-reduced target-region map.
  • Step 8 Perform dynamic thresholding on the image: using the improved dynamic threshold method with a set threshold d, obtain the bright channel of the target-region image and thereby the position map of the emery particles; the specific steps are:
  • Step 801. Perform median filtering on the image.
  • Step 802. Subtract the image obtained in step 801 from the image obtained in step 7 to obtain the deviation image g(x, y) of the two images.
  • Step 803. Extract the bright channel of the image according to the local gray-value deviation offset between the images before and after the filtering of step 801; the set of bright-channel pixels is:
  • B = {(x, y) | offset(x, y) ≥ d}   (12)
  • where, in equation (12), (x, y) is a transformed coordinate point and d is the set threshold.
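Steps 801-803 can be sketched as follows: a naive median filter, a deviation image against it, and the equation (12) mask. This is a minimal numpy sketch (function name mine); real code would use a library median filter.

```python
import numpy as np

def bright_channel(img, d, k=3):
    """Step 8 sketch: median-filter the image (Step 801), take each pixel's
    deviation from its local median (Step 802), and keep pixels whose
    positive deviation is at least the threshold d (Step 803, eq. (12))."""
    pad = k // 2
    p = np.pad(img.astype(float), pad, mode="edge")
    # naive k x k median filter: stack the shifted windows, take the median
    stack = [p[dy:dy + img.shape[0], dx:dx + img.shape[1]]
             for dy in range(k) for dx in range(k)]
    med = np.median(np.stack(stack), axis=0)
    offset = img.astype(float) - med      # deviation image g(x, y)
    return offset >= d                    # bright-channel mask B
```

Because the threshold is applied to the deviation from the local median rather than to absolute gray values, slow brightness variation across the wire does not disturb the particle mask.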
  • Step 9 Perform inverse projection transformation on the image, perform inverse projection transformation on the position map of the emery particles, and restore the position of the target area in the image.
  • the present invention transforms the target peripheral region obtained by mean filtering and maximum between-class variance thresholding to a fixed image position, which both reduces the area requiring subsequent image processing, improving the execution speed of the image-processing algorithms, and achieves uniform positioning of the randomly varying target region, preventing a single image from taking too long to process because of changes in the image background texture and degrading system efficiency.
  • the present invention applies the fast Fourier transform to the projected image and converts it into a spectrogram in the complex domain for difference filtering and feature extraction, which effectively avoids the image-noise interference that is common in the spatial domain.
  • the present invention adopts an improved dynamic threshold method with a set threshold d: considering local image characteristics, it obtains the image bright channel satisfying the gray-level condition from the relative gray difference of the local region and extracts the target region containing the emery particles, effectively avoiding interference from uneven image brightness and image noise in feature extraction.
  • FIG. 1 is a flowchart of a visual processing method for identifying emery particles of the present invention.
  • Fig. 2 is a schematic diagram of projection transformation of an image.
  • Figure 3 is a camera acquisition diagram of emery.
  • Figure 4 is a picture of the results of emery identification.
  • the image processed in this embodiment is acquired by a CMOS grayscale industrial camera.
  • the image is a 640x480 grayscale image; the camera's field of view is about 4 mm, and the emery wire width is about 1 mm.
  • Step 1. The standard deviations of the Gaussian filters are c1 = 640 and c2 = 4; the filter width is 640 and the height is 120.
  • Step 2 Perform image mean filtering on the collected original image and binarize the maximum inter-class variance of the image to obtain an image of the target peripheral area where the emery line is located.
  • Step 3 From the target peripheral-region image obtained in step 2, establish the projection matrix, calculate the image projection transformation matrix, and apply the projection transformation to the captured original image, transforming the target peripheral region to a fixed image position to obtain a target peripheral-region image of uniform size and position; the specific steps are as follows:
  • Step 301 Calculation of the projection matrix:
  • the row- and column-coordinate vectors of the top-left, top-right, bottom-right and bottom-left vertices of the image before projection transformation are:
  • Px = (px1, px2, px3, px4)^T = (0, 0, 504, 504)^T, Py = (py1, py2, py3, py4)^T = (0, 640, 640, 0)^T   (1)
  • where px1...px4 and py1...py4 are the row and column coordinates of the four vertices (hereinafter "vertices"), and Px, Py are the vertex row- and column-coordinate vectors; px1 and px2 are the row coordinates of the region containing the first target, px3 and px4 the row coordinates of the last target; py1 and py4 are 0, and py2 and py3 equal the width.
  • the corresponding vectors after projection transformation are Qx = (0, 0, 504, 504)^T and Qy = (0, 640, 640, 0)^T   (2)
  • Step 302 Projection transformation of the image, using pixel weighted interpolation to determine the pixel value f(x,y) of the image coordinate point (x,y) after projection transformation.
  • the specific steps are:
  • Step 3021 Traverse in sequence the coordinate point (x0, y0) of each pixel in the original image before projection transformation, and determine the transformed coordinate point (x, y) from the projection transformation matrix:
  • (x, y, 1, 1) = MatH·(x0, y0, 1, 1)   (5)
  • which determines the coordinate point (x, y) after projection transformation.
  • Step 3022 Calculate the weighted interpolation of pixels, the specific steps are:
  • Step 30221 As shown in FIG. 2, using the two-point form of the line equation, compute the analytical expressions f1 and f2 of the two diagonals of the four pixels surrounding the coordinate point (x0, y0) before projection transformation:
  • f1: y = y1 + ((y3 - y1)/(x3 - x1))·(x - x1)   (6)
  • f2: y = y2 + ((y4 - y2)/(x4 - x2))·(x - x2)   (7)
  • where x1...x4 and y1...y4 are the row and column coordinates of the vertices surrounding (x0, y0) before projection transformation (see FIG. 2).
  • Step 30222 As shown in FIG. 2, using the point-slope form of the line equation, compute the analytical expressions g1 and g2 of the two lines through (x0, y0) perpendicular to the two diagonals:
  • g1: y = y0 - ((x3 - x1)/(y3 - y1))·(x - x0)   (8)
  • g2: y = y0 - ((x4 - x2)/(y4 - y2))·(x - x0)   (9)
  • Step 30223 Solve the simultaneous equations of f1 with g1 and of f2 with g2 to obtain the intersection (projection) points, shown as j1 and j2 in FIG. 2;
  • Step 30224 Project the Euclidean distances between the coordinate point (x0, y0) before the projection transformation and the four surrounding pixels onto their corresponding diagonals (see FIG. 2);
  • Step 30225 Compute the pixel value at the transformed coordinate point (x, y) as a weighted combination of the surrounding pixel values (equation (10), rendered as an image in the original):
  • the four pixel values around (x0, y0), in top-left, top-right, bottom-right, bottom-left order, are T1(x1, y1), T2(x2, y2), T3(x3, y3), T4(x4, y4); u and v are the projected distances, obtained in step 30224, from (x0, y0) to T1 and T2 along the diagonals.
  • the diagonal length L in equation (10) is defined as L = sqrt((x_i - x_(i-2))^2 + (y_i - y_(i-2))^2)   (11), with i = 3 or 4; in this experiment i = 3 was used.
  • Step 4 Perform fast Fourier transform on the projected image, transform the image from the spatial domain to the frequency domain, and obtain the spectrogram in the complex domain.
  • Step 5 Use the frequency domain differential filter created in Step 1 to convolve the image of the spectrogram, filter out the background and noise, and obtain the spectral image of the target area where the emery is located.
  • Step 6 Use the inverse Fourier transform of the image to inverse transform the target area spectrum image to obtain the target area spatial image information.
  • Step 7 Using the Gaussian filtering in the spatial domain, the real-number image of the target area is processed through the Gaussian filtering in the spatial domain to obtain a target area map with reduced noise.
  • Step 8 Using dynamic thresholding with the threshold set to d = 12, obtain the bright channel of the target-region image and thereby the position map of the emery particles; the specific steps are:
  • Step 801. Perform median filtering on the image.
  • Step 802. Subtract the image obtained in step 801 from the image obtained in step 7 to obtain the deviation image g(x, y) of the two images.
  • Step 803. Extract the bright channel of the image according to the local gray-value deviation offset between the images before and after the filtering of step 801; the set of bright-channel pixels is:
  • B = {(x, y) | offset(x, y) ≥ (d = 12)}   (12)
  • where, in equation (12), (x, y) is a transformed coordinate point and d is the set threshold.
  • Step 9 Using the inverse projection transformation module, perform inverse projection transformation on the position map of the emery particles, restore the position of the target area in the image and output position information.
  • the extraction effect of emery particles under this embodiment is shown in FIG. 4.

Abstract

The invention discloses a visual processing method for identifying emery particles, which solves problems such as the low efficiency of manual inspection and the low accuracy of morphology-based detection methods under varying illumination. For common emery wire, a frequency-domain analysis method is used to find the abrupt points of the emery particles; while preserving image precision, the peripheral region of the target is extracted and uniformly positioned by projection, achieving fast and accurate extraction of the emery particles on the surface of the emery wire.

Description

A visual processing method for identifying emery particles
Technical field
The invention relates to a visual processing method for identifying emery particles, in particular an image-processing method based on computer vision for extracting the emery particles on the surface of emery wire, and belongs to the field of machine-vision applications.
Background
Emery wire is a common cutting or grinding tool made by electroplating emery particles onto the surface of a thin steel wire. It is mainly used in industries with high material and precision requirements, such as slicing silicon raw material in the photovoltaic industry. The quantity and density of the emery particles on the wire surface are an important indicator of emery-wire quality, and industry commonly judges the quality of the produced emery wire by this indicator.
Traditional emery-particle inspection methods include manual observation and counting under a microscope, and chemical or physical (particle-separation) methods that isolate and count the particles per unit length. Microscope observation requires manual sampling, observation and counting, and is inefficient; the particle-separation method is cumbersome and damages the emery wire itself.
In recent years, automatic inspection based on machine vision has gradually developed. Although existing visual detection methods have some effect on emery-particle extraction, they use basic morphological operations during image processing, which significantly impairs image precision and limits accuracy. Moreover, since emery particles tend to adhere to one another, existing visual inspection methods mainly count particles by extracting characteristic concave points of the target region; this approach is easily affected by light intensity, and when the concave points are not distinct it is difficult to extract the emery particles accurately.
Summary of the invention
The purpose of the present invention is to overcome the defects of the prior art and provide a visual processing method for identifying emery particles: for common emery wire, a frequency-domain analysis method is used to find abrupt points and, while preserving image precision, the peripheral region of the target is extracted and uniformly positioned by projection, achieving fast and accurate extraction of the emery particles on the surface of the emery wire.
The visual processing method for identifying emery particles of the present invention comprises the following steps:
Step 1. Design a frequency-domain Gaussian filter: using the difference principle, create two frequency-domain Gaussian filters with standard deviations c1 and c2 according to the size of the acquired image, and subtract them to obtain a difference filter.
The values of the standard deviations c1 and c2 of the two filters are determined by analyzing the frequency band occupied by the target region in the frequency-domain map. In the acquired image the height of the target region is 1/n of the image height; to speed up filtering while keeping the image proportions, the width and height of the frequency-domain Gaussian filter are set to 1/n of the width and height of the entire image, respectively, c1 equals the image width, and c2 equals n.
n = W/W1
where W is the camera's field of view and W1 is the emery wire width, both in millimeters.
Step 2. Mean filtering and binarization: apply mean filtering to the captured original image and binarize it with the maximum between-class variance method to obtain an image of the peripheral region of the target where the emery wire lies.
Step 3. Projection processing: from the target peripheral-region image obtained in step 2, compute the image projection transformation matrix and apply the projection transformation to the captured original image, transforming the target peripheral region to a fixed image position to obtain a target peripheral-region image of uniform size and position. The specific steps are:
Step 301. Calculation of the projection matrix:
Assume the row- and column-coordinate vectors of the four vertices of the image before projection transformation, in top-left, top-right, bottom-right, bottom-left order, are:
Px = (px1, px2, px3, px4)^T, Py = (py1, py2, py3, py4)^T   (1)
In equation (1), px1...px4 and py1...py4 are the row and column coordinates of the four vertices (hereinafter "vertices"); Px and Py are the vertex row- and column-coordinate vectors.
Assume the corresponding coordinates after projection transformation are:
Qx = (qx1, qx2, qx3, qx4)^T, Qy = (qy1, qy2, qy3, qy4)^T   (2)
In equation (2), qx1...qx4 and qy1...qy4 are the vertex row and column coordinates after projection transformation; Qx and Qy are the corresponding coordinate vectors.
The projection transformation matrix is then:
MatH = (Qx, Qy, 1, 1)·(Px, Py, 1, 1)^(-1)   (3)
and the inverse projection transformation matrix:
MatH^(-1) = {(Qx, Qy, 1, 1)·(Px, Py, 1, 1)^(-1)}^(-1)   (4)
Step 302. Projective transformation of the image: determine the pixel value f(x, y) at the transformed image coordinate point (x, y) by the pixel-weighted interpolation method. The specific steps are:
Step 3021. Traverse in sequence the coordinate point (x0, y0) of each pixel in the original image before projection transformation, and determine the transformed coordinate point (x, y) from the projection transformation matrix:
(x, y, 1, 1) = MatH·(x0, y0, 1, 1)   (5)
which determines the coordinate point (x, y) after projection transformation.
Step 3022. Weighted interpolation of pixel values, in the following steps:
Step 30221. Using the two-point form of the line equation, compute the analytical expressions f1 and f2 of the two diagonals of the four pixels surrounding the coordinate point (x0, y0) before projection transformation:
f1: y = y1 + ((y3 - y1)/(x3 - x1))·(x - x1)   (6)
f2: y = y2 + ((y4 - y2)/(x4 - x2))·(x - x2)   (7)
In equations (6) and (7), x1...x4 and y1...y4 are the row and column coordinates of the vertices surrounding (x0, y0) before projection transformation.
Step 30222. Using the point-slope form of the line equation, compute the analytical expressions g1 and g2 of the two lines through (x0, y0) that are perpendicular to the two diagonals:
g1: y = y0 - ((x3 - x1)/(y3 - y1))·(x - x0)   (8)
g2: y = y0 - ((x4 - x2)/(y4 - y2))·(x - x0)   (9)
Step 30223. Solve the simultaneous equations of f1 with g1 and of f2 with g2 to obtain the intersection (projection) points j1 and j2.
Step 30224. Project the Euclidean distances between the coordinate point (x0, y0) before the projection transformation and the four surrounding pixel points onto their corresponding diagonals.
Step 30225. Compute the pixel value at the transformed coordinate point (x, y) as a weighted combination of the surrounding pixel values (equation (10), rendered as an image in the original).
In equation (10), the four pixel values around (x0, y0), in top-left, top-right, bottom-right, bottom-left order, are T1(x1, y1), T2(x2, y2), T3(x3, y3), T4(x4, y4); u and v are the projected distances, obtained in step 30224, from (x0, y0) to T1 and T2 along the diagonals.
In addition, in equation (10) the diagonal length L is defined as:
L = sqrt((x_i - x_(i-2))^2 + (y_i - y_(i-2))^2)   (11)
In equation (11), i takes the value 3 or 4.
Step 4. Apply the fast Fourier transform to the projection-transformed image to obtain the spectrogram in the complex domain.
Step 5. Convolve the spectrogram with the frequency-domain difference filter created in step 1 to enhance the features, obtaining the spectrum image of the target region where the emery lies.
Step 6. Apply the inverse Fourier transform to the target-region spectrum image to obtain the real-valued image of the target region.
Step 7. Apply spatial-domain Gaussian filtering to the real-valued target-region image to obtain a noise-reduced target-region map.
Step 8. Dynamic thresholding: using the improved dynamic threshold method with a set threshold d, obtain the bright channel of the target-region image and thereby the position map of the emery particles; the specific steps are:
Step 801. Perform median filtering on the image.
Step 802. Subtract the image obtained in step 801 from the image obtained in step 7 to obtain the deviation image g(x, y) of the two images.
Step 803. Extract the bright channel of the image according to the local gray-value deviation offset between the images before and after the filtering of step 801; the set of bright-channel pixels is:
B = {(x, y) | offset(x, y) ≥ d}   (12)
In equation (12), (x, y) is a transformed coordinate point and d is the set threshold.
Step 9. Apply the inverse projection transformation to the emery-particle position map to restore the position of the target region in the image.
Advantages and beneficial effects of the invention
1. By computing the image projection transformation matrix and applying the projection transformation, the invention moves the target peripheral region obtained by mean filtering and maximum between-class variance thresholding to a fixed image position. This both shrinks the area requiring subsequent image processing, raising the execution speed of the image-processing algorithms, and achieves uniform positioning of the randomly varying target region, preventing a single image from taking too long to process because of changes in the image background texture and degrading system efficiency.
2. The pixel-weighted interpolation used in the image projection transformation realizes image repositioning and scaling while restoring image detail as far as possible within limited computational complexity (below that of higher-order linear interpolation); it solves the problem of blurred interpolation at image edges and preserves image precision.
3. The invention applies the fast Fourier transform to the projected image and converts it into a spectrogram in the complex domain for difference filtering and feature extraction, effectively avoiding the image-noise interference common in the spatial domain.
4. In the final extraction of the emery particles, the invention adopts an improved dynamic threshold method with a set threshold d: considering local image characteristics, it obtains the image bright channel satisfying the gray-level condition from the relative gray difference of the local region and extracts the target region containing the emery particles, effectively avoiding interference from uneven image brightness and image noise in feature extraction.
Brief description of the drawings
FIG. 1 is a flowchart of the visual processing method for identifying emery particles of the invention.
FIG. 2 is a schematic diagram of the projection transformation of an image.
FIG. 3 is a camera-acquired image of the emery wire.
FIG. 4 shows the result of emery identification.
Detailed description
The method of the invention is described in further detail below with reference to the embodiment and the drawings.
Embodiment:
The image processed in this embodiment is acquired by a CMOS grayscale industrial camera; it is a 640x480 grayscale image, the camera's field of view is about 4 mm, and the emery wire width is about 1 mm.
Step 1. The standard deviations of the Gaussian filters are c1 = 640 and c2 = 4; the filter width is 640 and the height is 120.
Step 2. Apply mean filtering to the captured original image and binarize it with the maximum between-class variance method to obtain an image of the peripheral region of the target where the emery wire lies.
Step 3. From the target peripheral-region image obtained in step 2, establish the projection matrix, calculate the image projection transformation matrix, and apply the projection transformation to the captured original image, transforming the target peripheral region to a fixed image position to obtain a target peripheral-region image of uniform size and position; the specific steps are:
Step 301. Calculation of the projection matrix:
The row- and column-coordinate vectors of the top-left, top-right, bottom-right and bottom-left vertices of the image before projection transformation are:
Px = (px1, px2, px3, px4)^T = (0, 0, 504, 504)^T
Py = (py1, py2, py3, py4)^T = (0, 640, 640, 0)^T   (1)
In equation (1), px1...px4 and py1...py4 are the row and column coordinates of the four vertices (hereinafter "vertices"); Px and Py are the vertex row- and column-coordinate vectors. px1 and px2 are the row coordinates of the region containing the first target, px3 and px4 the row coordinates of the last target; py1 and py4 are 0, and py2 and py3 equal the width.
The corresponding row- and column-coordinate vectors after projection transformation are:
Qx = (0, 0, 504, 504)^T
Qy = (0, 640, 640, 0)^T   (2)
The projection transformation matrix MatH of equation (3) and the inverse matrix MatH^(-1) of equation (4) then follow (their numeric values are rendered as images in the original).
Step 302. Projective transformation of the image: determine the pixel value f(x, y) at the transformed image coordinate point (x, y) by the pixel-weighted interpolation method. The specific steps are:
Step 3021. Traverse in sequence the coordinate point (x0, y0) of each pixel in the original image before projection transformation, and determine the transformed coordinate point (x, y) from the projection transformation matrix:
(x, y, 1, 1) = MatH·(x0, y0, 1, 1)   (5)
which determines the coordinate point (x, y) after projection transformation.
Step 3022. Weighted interpolation of pixel values, in the following steps:
Step 30221. As shown in FIG. 2, using the two-point form of the line equation, compute the analytical expressions f1 and f2 of the two diagonals of the four pixels surrounding (x0, y0) before projection transformation:
f1: y = y1 + ((y3 - y1)/(x3 - x1))·(x - x1)   (6)
f2: y = y2 + ((y4 - y2)/(x4 - x2))·(x - x2)   (7)
In equations (6) and (7), x1...x4 and y1...y4 are the row and column coordinates of the vertices surrounding (x0, y0) before projection transformation (see FIG. 2).
Step 30222. As shown in FIG. 2, using the point-slope form of the line equation, compute the analytical expressions g1 and g2 of the two lines through (x0, y0) perpendicular to the two diagonals:
g1: y = y0 - ((x3 - x1)/(y3 - y1))·(x - x0)   (8)
g2: y = y0 - ((x4 - x2)/(y4 - y2))·(x - x0)   (9)
In equations (8) and (9), x1...x4 and y1...y4 are the row and column coordinates of the vertices surrounding (x0, y0) before projection transformation (see FIG. 2).
Step 30223. Solve the simultaneous equations of f1 with g1 and of f2 with g2 to obtain the intersection (projection) points, shown as j1 and j2 in FIG. 2.
Step 30224. Project the Euclidean distances between (x0, y0) and the four surrounding pixel points onto their corresponding diagonals (see FIG. 2).
Step 30225. Compute the pixel value at the transformed coordinate point (x, y) as a weighted combination of the surrounding pixel values (equation (10), rendered as an image in the original).
In equation (10), the four pixel values around (x0, y0) (see FIG. 2), in top-left, top-right, bottom-right, bottom-left order, are T1(x1, y1), T2(x2, y2), T3(x3, y3), T4(x4, y4); u and v are the projected distances, obtained in step 30224, from (x0, y0) to T1 and T2 along the diagonals.
In addition, in equation (10) and FIG. 2, the diagonal length L is defined as:
L = sqrt((x_i - x_(i-2))^2 + (y_i - y_(i-2))^2)   (11)
In equation (11), i takes the value 3 or 4; in this experiment i = 3 was used.
Step 4. Apply the fast Fourier transform to the projection-transformed image to convert it from the spatial domain to the frequency domain, obtaining the spectrogram in the complex domain.
Step 5. Convolve the spectrogram with the frequency-domain difference filter created in step 1 to filter out background and noise, obtaining the spectrum image of the target region where the emery lies.
Step 6. Apply the inverse Fourier transform to the target-region spectrum image to obtain the spatial image information of the target region.
Step 7. Apply spatial-domain Gaussian filtering to the real-valued target-region image to obtain a noise-reduced target-region map.
Step 8. Using dynamic thresholding with the threshold set to d = 12, obtain the bright channel of the target-region image and thereby the position map of the emery particles; the specific steps are:
Step 801. Perform median filtering on the image.
Step 802. Subtract the image obtained in step 801 from the image obtained in step 7 to obtain the deviation image g(x, y) of the two images.
Step 803. Extract the bright channel of the image according to the local gray-value deviation offset between the images before and after the filtering of step 801; the set of bright-channel pixels is:
B = {(x, y) | offset(x, y) ≥ (d = 12)}   (12)
In equation (12), (x, y) is a transformed coordinate point and d is the set threshold.
Step 9. Using the inverse projection transformation module, apply the inverse projection transformation to the emery-particle position map, restore the position of the target region in the image, and output the position information. The extraction result for the emery particles in this embodiment is shown in FIG. 4.

Claims (5)

  1. A visual processing method for identifying emery particles, the steps of which are:
    Step 1. Design a frequency-domain Gaussian filter: using the difference principle, create two frequency-domain Gaussian filters with standard deviations c1 and c2 according to the size of the acquired image, and subtract them to obtain a difference filter;
    Step 2. Mean filtering and binarization: apply mean filtering to the captured original image and binarize it with the maximum between-class variance method to obtain an image of the peripheral region of the target where the emery wire lies;
    Step 3. Projection processing: from the target peripheral-region image obtained in step 2, compute the image projection transformation matrix and apply the projection transformation to the captured original image, transforming the target peripheral region to a fixed image position to obtain a target peripheral-region image of uniform size and position;
    Step 4. The image-processing software calls the fast-Fourier-transform module and applies the fast Fourier transform to the projection-transformed image to obtain the spectrogram in the complex domain;
    Step 5. The image-processing software calls the image-convolution module and convolves the spectrogram with the frequency-domain difference filter created in step 1 to enhance the features, obtaining the spectrum image of the target region where the emery lies;
    Step 6. Then apply the inverse Fourier transform to the target-region spectrum image to obtain the real-valued image of the target region;
    Step 7. The image-processing software calls the spatial-domain Gaussian-filtering module and processes the real-valued target-region image with spatial-domain Gaussian filtering to obtain a noise-reduced target-region map;
    Step 8. The image-processing software calls the dynamic-threshold module and, using the improved dynamic threshold method with a set threshold d, obtains the bright channel of the target-region image and thereby the position map of the emery particles;
    Step 9. The image-processing software calls the inverse-projection-transformation module and applies the inverse projection transformation to the emery-particle position map to restore the position of the target region in the image.
  2. The visual processing method for identifying emery particles according to claim 1, characterized in that:
    the values of the standard deviations c1 and c2 of the two frequency-domain Gaussian filters are determined from the frequency band occupied by the target region in the frequency-domain map; in the acquired image the height of the target region is 1/n of the image height, the width and height of the frequency-domain Gaussian filter are set to 1/n of the image width and height respectively, c1 equals the image width, and c2 equals n;
    n = W/W1
    where W is the camera's field of view and W1 is the emery wire width, both in millimeters.
  3. The visual processing method for identifying emery particles according to claim 1, characterized in that step 3 proceeds as follows:
    Step 301. Calculation of the projection matrix:
    Assume the row- and column-coordinate vectors of the four vertices of the image before projection transformation, in top-left, top-right, bottom-right, bottom-left order, are:
    Px = (px1, px2, px3, px4)^T, Py = (py1, py2, py3, py4)^T
    where px1...px4 and py1...py4 are the row and column coordinates of the four vertices (hereinafter "vertices"); Px and Py are the vertex row- and column-coordinate vectors;
    Assume the corresponding coordinates after projection transformation are:
    Qx = (qx1, qx2, qx3, qx4)^T, Qy = (qy1, qy2, qy3, qy4)^T
    where qx1...qx4 and qy1...qy4 are the vertex row and column coordinates after projection transformation; Qx and Qy are the corresponding coordinate vectors;
    The projection transformation matrix is then:
    MatH = (Qx, Qy, 1, 1)·(Px, Py, 1, 1)^(-1)
    and the inverse projection transformation matrix:
    MatH^(-1) = {(Qx, Qy, 1, 1)·(Px, Py, 1, 1)^(-1)}^(-1)
    Step 302. Projective transformation of the image: determine the pixel value f(x, y) at the transformed image coordinate point (x, y) by the pixel-weighted interpolation method.
  4. The visual processing method for identifying emery particles according to claim 3, characterized in that the image projection transformation of step 302 comprises:
    Step 3021. Traverse in sequence the coordinate point (x0, y0) of each pixel in the original image before projection transformation, and determine the transformed coordinate point (x, y) from the projection transformation matrix:
    (x, y, 1, 1) = MatH·(x0, y0, 1, 1)
    which determines the coordinate point (x, y) after projection transformation;
    Step 3022. Weighted interpolation of pixel values, in the following steps:
    Step 30221. Using the two-point form of the line equation, compute the analytical expressions f1 and f2 of the two diagonals of the four pixels surrounding (x0, y0) before projection transformation:
    f1: y = y1 + ((y3 - y1)/(x3 - x1))·(x - x1)
    f2: y = y2 + ((y4 - y2)/(x4 - x2))·(x - x2)
    where x1...x4 and y1...y4 are the row and column coordinates of the vertices surrounding (x0, y0) before projection transformation;
    Step 30222. Using the point-slope form of the line equation, compute the analytical expressions g1 and g2 of the two lines through (x0, y0) perpendicular to the two diagonals:
    g1: y = y0 - ((x3 - x1)/(y3 - y1))·(x - x0)
    g2: y = y0 - ((x4 - x2)/(y4 - y2))·(x - x0)
    Step 30223. Solve the simultaneous equations of f1 with g1 and of f2 with g2 to obtain the intersection (projection) points j1 and j2;
    Step 30224. Project the Euclidean distances between (x0, y0) and the four surrounding pixel points onto their corresponding diagonals;
    Step 30225. Compute the pixel value at the transformed coordinate point (x, y) as a weighted combination of the surrounding pixel values (the combining formula is rendered as an image in the original):
    where the four pixel values around (x0, y0), in top-left, top-right, bottom-right, bottom-left order, are T1(x1, y1), T2(x2, y2), T3(x3, y3), T4(x4, y4); u and v are the projected distances, obtained in step 30224, from (x0, y0) to T1 and T2 along the diagonals;
    L is the diagonal length:
    L = sqrt((x_i - x_(i-2))^2 + (y_i - y_(i-2))^2)
    where i takes the value 3 or 4.
  5. The visual processing method for identifying emery particles according to claim 1, characterized in that step 8 specifically comprises:
    Step 801. Perform median filtering on the image;
    Step 802. Subtract the image obtained in step 801 from the image obtained in step 7 to obtain the deviation image g(x, y) of the two images;
    Step 803. Extract the bright channel of the image according to the local gray-value deviation offset between the images before and after the filtering of step 801; the set of bright-channel pixels is:
    B = {(x, y) | offset(x, y) ≥ d}
    where (x, y) is a transformed coordinate point and d is the set threshold.
PCT/CN2019/112854 2018-12-07 2019-10-23 A visual processing method for identifying emery particles WO2020114134A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201811494388.X 2018-12-07
CN201811494388.XA CN109636785A (zh) 2018-12-07 A visual processing method for identifying emery particles

Publications (1)

Publication Number Publication Date
WO2020114134A1 true WO2020114134A1 (zh) 2020-06-11

Family

ID=66071960

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/112854 WO2020114134A1 (zh) 2018-12-07 2019-10-23 A visual processing method for identifying emery particles

Country Status (2)

Country Link
CN (1) CN109636785A (zh)
WO (1) WO2020114134A1 (zh)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109636785A (zh) * 2018-12-07 2019-04-16 南京埃斯顿机器人工程有限公司 A visual processing method for identifying emery particles
CN113063705B (zh) * 2021-03-22 2022-09-27 陕西科技大学 Machine-vision-based quality inspection method for emery particles on the surface of diamond wire
CN113409266A (zh) * 2021-06-17 2021-09-17 陕西科技大学 Method and system for detecting and counting emery particles

Citations (3)

Publication number Priority date Publication date Assignee Title
CN103177458A (zh) * 2013-04-17 2013-06-26 北京师范大学 Region-of-interest detection method for visible-light remote-sensing images based on frequency-domain analysis
CN107767385A (zh) * 2017-08-28 2018-03-06 江苏理工学院 Machine-vision-based method and device for counting particles on emery wire
CN109636785A (zh) * 2018-12-07 2019-04-16 南京埃斯顿机器人工程有限公司 A visual processing method for identifying emery particles

Family Cites Families (12)

Publication number Priority date Publication date Assignee Title
CN1202490C (zh) * 2003-03-19 2005-05-18 上海交通大学 Iris texture normalization method
CN101093538B (zh) * 2006-06-19 2011-03-30 电子科技大学 Iris recognition method based on zero-crossing representation of the wavelet transform
DE102009014080B4 (de) * 2009-03-23 2011-12-15 Baumer Innotec Ag Device for determining particle sizes
CN103020920B (zh) * 2013-01-10 2015-03-25 厦门大学 Low-illumination image enhancement method
CN103077504B (zh) * 2013-01-10 2015-08-05 厦门大学 Image dehazing method based on adaptive illumination computation
CN105046681A (zh) * 2015-05-14 2015-11-11 江南大学 SoC-based image saliency region detection method
KR101767564B1 (ko) * 2015-11-12 2017-08-11 성균관대학교산학협력단 Image analysis method for rod-shaped particle images
CN108171244A (zh) * 2016-12-07 2018-06-15 北京深鉴科技有限公司 Object recognition method and system
CN106846263B (zh) * 2016-12-28 2019-11-29 中国科学院长春光学精密机械与物理研究所 Sky-immune image dehazing method based on fused channels
CN107478657A (zh) * 2017-06-20 2017-12-15 广东工业大学 Stainless-steel surface defect detection method based on machine vision
CN108875731B (zh) * 2017-12-28 2022-12-09 北京旷视科技有限公司 Target recognition method, apparatus, system and storage medium
CN108226159B (zh) * 2017-12-29 2019-11-22 钢铁研究总院 Full-field quantitative statistical distribution characterization method for precipitated-phase particles in metallic materials


Also Published As

Publication number Publication date
CN109636785A (zh) 2019-04-16

Similar Documents

Publication Publication Date Title
CN108921176B (zh) Machine-vision-based pointer-instrument positioning and recognition method
CN114972329B (zh) Image enhancement method and system for a surface-defect detector based on image processing
CN107808378B (zh) Method for detecting potential defects in complex-structure castings based on vertical and horizontal line contour features
US20220292645A1 Method for restoring video data of drainage pipe based on computer vision
CN111243032B (zh) Fully automatic detection method for checkerboard corner points
CN106650770B (zh) Mura defect detection method based on sample learning and human-eye visual characteristics
CN107845087B (zh) Method and system for detecting luminance-nonuniformity defects of a liquid-crystal panel
WO2020114134A1 (zh) A visual processing method for identifying emery particles
CN109816652B (zh) Complex casting defect recognition method based on grayscale saliency
CN107678192B (zh) Mura defect detection method based on machine vision
WO2020133046A1 (zh) Defect detection method and device
CN107478657A (zh) Stainless-steel surface defect detection method based on machine vision
CN102974551A (zh) Machine-vision-based method for detecting and sorting polycrystalline-silicon solar cells
CN105139391B (zh) Edge detection method for traffic images in hazy weather
CN108921813A (zh) Machine-vision-based crack recognition method for UAV inspection of bridge structures
CN110599552A (zh) Computer-vision-based pH test-paper detection method
CN110648330B (zh) Defect detection method for camera glass
CN104899888A (zh) Sub-pixel image edge detection method based on Legendre moments
CN114331986A (zh) Dam crack recognition and measurement method based on UAV vision
CN107388991A (zh) Method for measuring the fillet radius of shaft parts with multiple end-face fillets
CN112489042A (zh) Detection method for printing defects and surface damage of metal products based on super-resolution reconstruction
CN113705564B (zh) Pointer-instrument recognition and reading method
CN107748897B (zh) Profile-quality inspection method for large curved parts based on pattern recognition
Zhao et al. Analysis of image edge checking algorithms for the estimation of pear size
TWI543117B (zh) Object recognition and localization method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19893814

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19893814

Country of ref document: EP

Kind code of ref document: A1


32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 14.01.2022)
