WO2020114134A1 - A visual processing method for identifying emery particles - Google Patents
A visual processing method for identifying emery particles
- Publication number
- WO2020114134A1 PCT/CN2019/112854
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- projection transformation
- emery
- projection
- coordinate
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0004—Industrial image inspection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10004—Still image; Photographic image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20048—Transform domain processing
- G06T2207/20056—Discrete and fast Fourier transform, [DFT, FFT]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30108—Industrial image inspection
Definitions
- the invention relates to a visual processing method for identifying emery particles, in particular to an image processing method for extracting emery particles on the surface of an emery wire based on computer vision, which belongs to the field of machine vision applications.
- Emery wire is a common cutting or grinding tool composed of emery particles electroplated onto the surface of thin steel wire. It is mainly used in industries with high requirements for materials and precision, such as the slicing of silicon raw materials in the photovoltaic industry. The quantity and density of emery particles on the surface of the emery wire are an important indicator for judging the quality of the wire, and the industry often uses this indicator to judge the quality of the emery wire produced.
- the traditional emery particle detection methods include manual observation and counting with a microscope, as well as chemical or physical methods (the particle separation method) that separate and count the emery particles per unit length.
- the microscope observation method requires manual sampling, observation and calculation, and its efficiency is low; the particle separation method is cumbersome and damages the emery wire itself.
- although the existing visual detection method has a certain effect on the extraction of emery particles, it relies on basic morphological operations during image processing, which significantly impair image accuracy, so its precision is not high.
- the existing visual inspection method mainly counts particles by extracting characteristic concave points of the target area, but this method is easily affected by light intensity, and when the concave points are not obvious it is difficult to extract the emery particles accurately.
- the purpose of the present invention is to overcome the shortcomings of the existing technology and provide a visual processing method for identifying emery particles.
- a frequency domain analysis method is used to find abrupt points; on the premise of ensuring image accuracy, the peripheral area of the target is extracted and unified projection positioning is performed to realize rapid and accurate extraction of the emery particles on the surface of the emery wire.
- the visual processing method for identifying emery particles of the present invention includes the following steps:
- Step 1 Design a frequency domain Gaussian filter: using the difference principle, create two frequency domain Gaussian filters with standard deviations c 1 and c 2 according to the size of the acquired image, and take their difference to obtain a difference filter image.
- the values of the standard deviations c 1 and c 2 of the two frequency domain Gaussian filters created are determined according to the analysis of the frequency band on the frequency domain map where the target area is located.
- the height of the target area accounts for 1/n of the height of the entire image; the width and height of the frequency domain Gaussian filter are set to 1/n of the width and height of the entire image, respectively; c 1 is equal to the width of the image, and c 2 is equal to n.
- n = W/W 1 , where W is the field of view of the camera and W 1 is the line width of the emery wire, both in millimeters.
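The Step 1 difference filter can be sketched as follows; this is a minimal NumPy illustration, assuming the two Gaussians are centred in a window of 1/n of the image size, with c 1 defaulting to the image width and c 2 to n as the text states. The function name and the exact Gaussian form are illustrative, not from the patent.

```python
import numpy as np

def difference_filter(img_w, img_h, n, c1=None, c2=None):
    """Frequency-domain difference-of-Gaussians filter from Step 1.

    The filter window is (img_w/n) x (img_h/n); per the text, c1 defaults
    to the image width and c2 to n. Centring is an assumption.
    """
    w, h = img_w // n, img_h // n
    # coordinate grid centred on the filter window
    y, x = np.mgrid[-h // 2:h - h // 2, -w // 2:w - w // 2]
    c1 = img_w if c1 is None else c1
    c2 = n if c2 is None else c2
    g1 = np.exp(-(x**2 + y**2) / (2.0 * c1**2))
    g2 = np.exp(-(x**2 + y**2) / (2.0 * c2**2))
    return g1 - g2  # difference of the two Gaussians
```

Because c 1 is much larger than c 2, the result is a band-pass window that is zero at the spectrum centre and rises toward the band where the wire's periodic structure lives.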
- Step 2 Perform mean filtering and binarization on the image, perform mean filtering on the original image and binarize the maximum inter-class variance of the image to obtain the image of the target peripheral area where the emery line is located.
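Step 2 combines mean filtering with maximum between-class variance (Otsu) binarization. A plain-NumPy sketch of both operations follows; the 3x3 kernel size is an assumption, since the patent does not state one.

```python
import numpy as np

def mean_filter(img, k=3):
    """Box (mean) filter via padded sliding sums -- Step 2's smoothing."""
    pad = k // 2
    p = np.pad(img.astype(np.float64), pad, mode='edge')
    out = np.zeros(img.shape, dtype=np.float64)
    for dy in range(k):
        for dx in range(k):
            out += p[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (k * k)

def otsu_threshold(img):
    """Maximum between-class variance (Otsu) threshold on an 8-bit image."""
    hist = np.bincount(img.astype(np.uint8).ravel(), minlength=256).astype(np.float64)
    prob = hist / hist.sum()
    cum_p = np.cumsum(prob)
    cum_m = np.cumsum(prob * np.arange(256))
    mean_total = cum_m[-1]
    best_t, best_var = 0, -1.0
    for t in range(255):
        w0 = cum_p[t]
        w1 = 1.0 - w0
        if w0 == 0 or w1 == 0:
            continue
        m0 = cum_m[t] / w0
        m1 = (mean_total - cum_m[t]) / w1
        var = w0 * w1 * (m0 - m1) ** 2  # between-class variance
        if var > best_var:
            best_var, best_t = var, t
    return best_t
```

Binarizing with `img > otsu_threshold(img)` then yields the mask of the target peripheral area.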
- Step 3 Perform projection processing on the image: calculate the image projection transformation matrix according to the target peripheral area image obtained in step 2, and perform projection transformation on the collected original image to transform the target peripheral area to a fixed position of the image, obtaining a target peripheral area image of uniform size and position; this specifically includes the following steps:
- Step 301 Calculation of the projection matrix:
- px 1 ...px 4 , py 1 ...py 4 are the row coordinates and column coordinates of the above four vertices (hereinafter referred to as vertices); Px and Py are the row coordinate vector and column coordinate vector of the vertices, respectively.
- qx 1 ...qx 4 , qy 1 ...qy 4 are the row and column coordinates of the vertices after projection transformation; Qx and Qy are the row coordinate vector and column coordinate vector of the vertices after projection transformation, respectively.
- the projection transformation matrix is MatH = (Qx,Qy,1,1)·(Px,Py,1,1) -1 , and the inverse projection transformation matrix is MatH -1 = {(Qx,Qy,1,1)·(Px,Py,1,1) -1 } -1 (4)
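The patent expresses MatH through the compact vector product above. A conventional, equivalent way to recover a four-point projective mapping is the direct linear transform (DLT); the sketch below uses that standard formulation under the assumption that MatH acts as an ordinary 3x3 homography (the function name and SVD solution are illustrative, not a literal transcription of equation (4)).

```python
import numpy as np

def homography_from_points(src, dst):
    """3x3 projective matrix mapping four src vertices to four dst vertices
    via the direct linear transform (a standard stand-in for the patent's
    compressed MatH = Q . P^-1 notation)."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    # the homography is the null vector of the 8x9 constraint matrix
    _, _, vt = np.linalg.svd(np.asarray(A, dtype=np.float64))
    H = vt[-1].reshape(3, 3)
    return H / H[2, 2]
```

With the four detected vertices as `src` and the fixed target rectangle as `dst`, this gives the forward transform; its matrix inverse plays the role of MatH -1 .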
- Step 302 Projective transformation of the image.
- the specific step is to determine the pixel value f(x,y) of the image coordinate point (x,y) after the projection transformation by using pixel weighted interpolation method.
- the specific steps are:
- Step 3021 Traverse the coordinate points (x 0 , y 0 ) corresponding to each pixel in the original image before projection transformation in sequence, and determine the coordinate points (x, y) after projection transformation according to the projection transformation matrix;
- the coordinate point (x, y) after projection transformation is determined by (x, y, 1, 1) = MatH·(x 0 , y 0 , 1, 1).
- Step 3022 Calculate the weighted interpolation of pixels, the specific steps are:
- Step 30221 The analytical formulas f 1 and f 2 of the two diagonal lines of the four pixels around the coordinate point (x 0 , y 0 ) before the projection transformation are calculated according to the calculation formula of the two-point linear analytical formula:
- x 1 ... x 4 and y 1 ... y 4 are the row and column coordinates of the vertex around the coordinate point (x 0 , y 0 ) before projection transformation, respectively.
- Step 30222 Calculate, according to the point-slope line formula, the analytical expressions g 1 and g 2 of the two lines that pass through the coordinate point (x 0 , y 0 ) before projection transformation and are perpendicular to the two diagonals, respectively:
- Step 30223 Solve the simultaneous equations of f 1 with g 1 and of f 2 with g 2 to obtain their intersections (projection points) j 1 and j 2 ;
- Step 30224 Project the Euclidean distance between the coordinate point (x 0 , y 0 ) before the projection transformation and the four surrounding pixels onto the corresponding diagonal lines;
- Step 30225 Calculate the pixel value at the coordinate point (x, y) after projection transformation:
- the four pixel values around the coordinate point (x 0 , y 0 ) before the projection transformation are, in the order of upper left, upper right, lower right, and lower left, T 1 (x 1 , y 1 ), T 2 (x 2 , y 2 ), T 3 (x 3 , y 3 ), T 4 (x 4 , y 4 ).
- u and v are the Euclidean distances between the coordinate point (x 0 , y 0 ) before projection transformation and the coordinate points T 1 and T 2 , respectively, obtained from the projection distances of step 30224.
- L is the diagonal length, in which i takes a value of 3 or 4.
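Steps 30221-30225 can be approximated as follows. The patent's closing weighting formula is not reproduced in the text, so this sketch makes two labelled assumptions: each diagonal contributes a linear interpolation weighted by the distance projected onto it, and the two diagonal estimates are averaged.

```python
import numpy as np

def diagonal_weighted_value(p, corners, values):
    """Diagonal-projection interpolation sketch for Steps 30221-30225.

    p is (x0, y0); corners/values are the four surrounding pixels in
    top-left, top-right, bottom-right, bottom-left order. p is projected
    onto diagonal T1-T3 and diagonal T2-T4; averaging the two diagonal
    estimates is an ASSUMPTION, since the patent's final formula is not
    reproduced in the text.
    """
    p = np.asarray(p, dtype=np.float64)
    T = [np.asarray(c, dtype=np.float64) for c in corners]
    V = values

    def diag_estimate(a_idx, b_idx):
        a, b = T[a_idx], T[b_idx]
        d = b - a
        L = np.linalg.norm(d)          # diagonal length L
        u = np.dot(p - a, d) / L       # distance of p projected along the diagonal
        u = np.clip(u, 0.0, L)
        # linear interpolation along the diagonal, weighted by projected distance
        return (V[a_idx] * (L - u) + V[b_idx] * u) / L

    # diagonal T1-T3 and diagonal T2-T4, averaged
    return 0.5 * (diag_estimate(0, 2) + diag_estimate(1, 3))
```

At the cell centre this reduces to the mean of the four corner values, which matches the intuition of the diagonal construction in Figure 2.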
- Step 4 Perform fast Fourier transform on the image: apply the fast Fourier transform to the projection-transformed image to obtain a spectrogram in the complex domain.
- Step 5 Perform the convolution calculation on the image, and use the frequency domain differential filter created in Step 1 to perform the image convolution calculation on the spectrogram to enhance the features and obtain the spectral image of the target area where the emery is located.
- Step 6 Inverse Fourier transform of the image, inverse Fourier transform of the target area spectrum image to obtain the real number image of the target area.
- Step 7 Perform Gaussian filtering on the image, and process the real-number image of the target area by Gaussian filtering in the spatial domain to obtain a target area map with reduced noise.
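Steps 4 through 6 amount to multiplying the centred spectrum by the difference filter, since spatial convolution is frequency-domain multiplication. A sketch, assuming the small filter is zero-padded to the full image size and placed at the centre of the shifted spectrum (placement details are not given in the text):

```python
import numpy as np

def frequency_domain_enhance(img, filt):
    """Steps 4-6: FFT to the complex domain, multiply by the (padded)
    difference filter, inverse FFT back to a real image."""
    F = np.fft.fftshift(np.fft.fft2(img))        # Step 4: centred spectrum
    full = np.zeros(F.shape, dtype=np.float64)
    fh, fw = filt.shape
    h, w = img.shape
    r0, c0 = (h - fh) // 2, (w - fw) // 2
    full[r0:r0 + fh, c0:c0 + fw] = filt          # filter centred in the spectrum (assumption)
    G = F * full                                 # Step 5: convolution as a product
    out = np.fft.ifft2(np.fft.ifftshift(G))      # Step 6: back to the spatial domain
    return np.real(out)
```

Step 7's spatial-domain Gaussian smoothing would then be applied to the returned real image.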
- Step 8 Perform dynamic threshold processing on the image, adopt the improved dynamic threshold method, set the threshold d, obtain the bright channel of the target area image, and obtain the position map of the emery particles; the specific steps are:
- Step 801. Perform median filtering on the image.
- Step 802. Subtract the image obtained in step 801 from the image obtained in step 7 to obtain the deviation image g(x, y) of the two images.
- Step 803. Extract the bright channel of the image according to the local gray value deviation offset of the image before and after the filtering of step 801; the set of bright channel points is: B = {(x, y)|offset(x, y) ≥ d} (12), where (x, y) is the transformed coordinate point and d is the set threshold.
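Step 8's improved dynamic threshold can be sketched directly from steps 801-803: median-filter the image, subtract to get the deviation offset(x, y), and keep the points at or above d. The 3x3 median window is an assumption; the patent does not state a window size.

```python
import numpy as np

def bright_channel_mask(smoothed, d, k=3):
    """Steps 801-803: median filter, local deviation, bright-channel set
    B = {(x, y) | offset(x, y) >= d}."""
    pad = k // 2
    p = np.pad(smoothed.astype(np.float64), pad, mode='edge')
    # k x k median filter (Step 801) from stacked shifted views
    stacks = [p[dy:dy + smoothed.shape[0], dx:dx + smoothed.shape[1]]
              for dy in range(k) for dx in range(k)]
    med = np.median(np.stack(stacks), axis=0)
    offset = smoothed - med              # Step 802: deviation image g(x, y)
    return offset >= d                   # Step 803: the bright-channel set
```

Because the comparison is against the local median, slow brightness gradients across the wire cancel out, which is the point of the dynamic (rather than global) threshold.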
- Step 9 Perform inverse projection transformation on the image, perform inverse projection transformation on the position map of the emery particles, and restore the position of the target area in the image.
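Step 9 can be illustrated by mapping the detected particle coordinates through the inverse matrix. The helper below applies any 3x3 projective matrix to a list of (x, y) points; treating MatH as a standard 3x3 homography is an assumption.

```python
import numpy as np

def apply_homography(H, pts):
    """Map (x, y) points through a 3x3 projective matrix; Step 9 restores
    particle positions by applying the inverse matrix MatH^-1."""
    pts = np.asarray(pts, dtype=np.float64)
    hom = np.hstack([pts, np.ones((len(pts), 1))])  # homogeneous coordinates
    out = hom @ H.T
    return out[:, :2] / out[:, 2:3]                 # divide out the projective scale
```

Restoring positions would then be `apply_homography(np.linalg.inv(H), particle_xy)` for a forward matrix `H`.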
- the present invention transforms the target peripheral area obtained by mean filtering and maximum between-class variance thresholding to a fixed image position, which not only reduces the area required for subsequent image processing and improves the execution speed of the image processing algorithm, but also realizes unified positioning of a randomly placed target area, preventing single-image processing from taking too long due to changes in the image background and degrading system execution efficiency.
- the present invention performs fast Fourier transform on the projected image and converts it into a spectrogram in the complex domain for difference filtering and feature extraction, which effectively avoids the image noise interference problems common in the spatial domain.
- the present invention adopts an improved dynamic threshold method, sets a threshold d, considers the local characteristics of the image, obtains an image bright channel that meets the gray condition according to the relative gray difference of the local area, and extracts the target area where the emery particles are located, effectively avoiding the interference of uneven image brightness and image noise on feature extraction.
- FIG. 1 is a flowchart of a visual processing method for identifying emery particles of the present invention.
- Fig. 2 is a schematic diagram of projection transformation of an image.
- Figure 3 is a camera acquisition diagram of emery.
- Figure 4 is a picture of the results of emery identification.
- the image processed in this embodiment is acquired by a CMOS grayscale industrial camera.
- the visual image is a 640*480 grayscale image.
- the field of view of the camera observing the emery wire is about 4 mm, and the width of the emery wire is about 1 mm.
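Plugging this embodiment's numbers into the Step 1 parameters gives the concrete filter dimensions; a quick check of the arithmetic (variable names are illustrative):

```python
# Worked numbers for this embodiment, taken from the text above.
W, W1 = 4.0, 1.0                          # camera field of view and wire width, in mm
n = int(W / W1)                           # n = W/W1 = 4
img_w, img_h = 640, 480                   # acquired grayscale image size
filt_w, filt_h = img_w // n, img_h // n   # frequency-domain filter window: 160 x 120
c1, c2 = img_w, n                         # standard deviations: c1 = 640, c2 = 4
```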
- Step 2 Perform image mean filtering on the collected original image and binarize the maximum inter-class variance of the image to obtain an image of the target peripheral area where the emery line is located.
- Step 3 According to the target peripheral area image obtained in step 2, establish the projection matrix, calculate the image projection transformation matrix, and perform projection transformation on the collected original image to transform the target peripheral area to a fixed position of the image, obtaining a target peripheral area image of uniform size and position; the specific steps are as follows:
- Step 301 Calculation of the projection matrix:
- the row and column coordinate vectors of the top left, top right, bottom right and bottom left vertices of the image before projection transformation are: Px = (px 1 , px 2 , px 3 , px 4 ) T , Py = (py 1 , py 2 , py 3 , py 4 ) T
- px 1 ... px 4 , py 1 ... py 4 are the row and column coordinates of the above four vertices (hereinafter referred to as vertices); Px and Py are the row and column coordinate vectors of the vertices, respectively.
- px 1 and px 2 represent the row coordinates of the area where the first target is located; px 3 and px 4 represent the row coordinates of the area where the last target is located; py 1 and py 4 are 0, and py 2 and py 3 are the image width.
- Step 302 Projection transformation of the image, using pixel weighted interpolation to determine the pixel value f(x,y) of the image coordinate point (x,y) after projection transformation.
- the specific steps are:
- Step 3021 Traverse the coordinate points (x 0 , y 0 ) corresponding to each pixel in the original image before projection transformation in sequence, and determine the coordinate points (x, y) after projection transformation according to the projection transformation matrix;
- the coordinate point (x, y) after projection transformation is determined by (x, y, 1, 1) = MatH·(x 0 , y 0 , 1, 1).
- Step 3022 Calculate the weighted interpolation of pixels, the specific steps are:
- Step 30221 As shown in FIG. 2, the analytical formulas f 1 and f 2 of the two diagonal lines of the four pixels around the coordinate point (x 0 , y 0 ) before the projection transformation are calculated according to the two-point line formula:
- x 1 ...x 4 and y 1 ...y 4 are the row and column coordinates of the vertices around the coordinate point (x 0 , y 0 ) before projection transformation (as shown in Figure 2).
- Step 30222 As shown in FIG. 2, calculate, according to the point-slope line formula, the analytical expressions g 1 and g 2 of the two lines that pass through the coordinate point (x 0 , y 0 ) before projection transformation and are perpendicular to the two diagonal lines, respectively:
- x 1 ...x 4 and y 1 ...y 4 are the row and column coordinates of the vertices around the coordinate point (x 0 , y 0 ) before projection transformation (as shown in Figure 2).
- Step 30223 Solve the simultaneous equations of f 1 with g 1 and of f 2 with g 2 to obtain their intersections (projection points), shown as j 1 and j 2 in Figure 2;
- Step 30224 Project the Euclidean distance between the coordinate point (x 0 , y 0 ) before the projection transformation and the four surrounding pixels onto their corresponding diagonal lines (see Figure 2);
- Step 30225 Calculate the pixel value at the coordinate point (x, y) after projection transformation:
- the four pixel values around the coordinate point (x 0 , y 0 ) before projection transformation are, in the order of upper left, upper right, lower right, and lower left, T 1 (x 1 , y 1 ), T 2 (x 2 , y 2 ), T 3 (x 3 , y 3 ), T 4 (x 4 , y 4 ); u and v are the Euclidean distances between the coordinate point (x 0 , y 0 ) before projection transformation and the coordinate points T 1 and T 2 , respectively, obtained from the projection distances of step 30224.
- Step 4 Perform fast Fourier transform on the projected image, transform the image from the spatial domain to the frequency domain, and obtain the spectrogram in the complex domain.
- Step 5 Use the frequency domain differential filter created in Step 1 to convolve the image of the spectrogram, filter out the background and noise, and obtain the spectral image of the target area where the emery is located.
- Step 6 Use the inverse Fourier transform of the image to inverse transform the target area spectrum image to obtain the target area spatial image information.
- Step 7 Using the Gaussian filtering in the spatial domain, the real-number image of the target area is processed through the Gaussian filtering in the spatial domain to obtain a target area map with reduced noise.
- Step 801. Perform median filtering on the image.
- Step 802. Subtract the image obtained in step 801 from the image obtained in step 7 to obtain the deviation image g(x, y) of the two images.
- Step 803. Extract the bright channel of the image according to the local gray value deviation offset of the image before and after the filtering of step 801; the set of bright channel points is: B = {(x, y)|offset(x, y) ≥ d} (12), where (x, y) is the transformed coordinate point and d is the set threshold.
- Step 9 Using the inverse projection transformation module, perform inverse projection transformation on the position map of the emery particles, restore the position of the target area in the image and output position information.
- the extraction effect of emery particles under this embodiment is shown in FIG. 4.
Claims (5)
- A visual processing method for identifying emery particles, the steps of which are: Step 1. Design a frequency domain Gaussian filter: using the difference principle, create two frequency domain Gaussian filters with standard deviations c 1 and c 2 according to the size of the acquired image, and take their difference to obtain a difference filter image; Step 2. Perform mean filtering and binarization on the image: apply mean filtering and maximum between-class variance binarization to the acquired original image to obtain the image of the target peripheral area where the emery wire is located; Step 3. Perform projection processing on the image: according to the target peripheral area image obtained in step 2, calculate the image projection transformation matrix and perform projection transformation on the acquired original image, transforming the target peripheral area to a fixed position of the image to obtain a target peripheral area image of uniform size and position; Step 4. The image processing software calls the fast Fourier transform module to perform a fast Fourier transform on the projection-transformed image, obtaining a spectrogram in the complex domain; Step 5. The image processing software calls the image convolution module and uses the frequency domain difference filter created in step 1 to perform image convolution on the spectrogram, enhancing the features and obtaining the spectrum image of the target area where the emery is located; Step 6. Apply the inverse image Fourier transform to the target area spectrum image to obtain the real-number image of the target area; Step 7. The image processing software calls the spatial-domain Gaussian filtering module and processes the real-number image of the target area with spatial-domain Gaussian filtering to obtain a noise-reduced target area map; Step 8. The image processing software calls the dynamic threshold module, adopts the improved dynamic threshold method, sets a threshold d, and obtains the bright channel of the target area image, giving the position map of the emery particles; Step 9. The image processing software calls the inverse projection transformation module, performs inverse projection transformation on the position map of the emery particles, and restores the position of the target area in the image.
- The visual processing method for identifying emery particles according to claim 1, characterized in that: the values of the standard deviations c 1 and c 2 of the two frequency domain Gaussian filters are determined according to the frequency band on the frequency domain map where the target area is located; in the acquired image, the height of the target area accounts for 1/n of the height of the entire image; the width and height of the frequency domain Gaussian filter are set to 1/n of the width and height of the entire image, respectively; c 1 is equal to the width of the image, and c 2 is equal to n; n = W/W 1 , where W is the field of view of the camera and W 1 is the line width of the emery wire, both in millimeters.
- The visual processing method for identifying emery particles according to claim 1, characterized in that step 3 is as follows: Step 301: Calculation of the projection matrix: assume that the row and column coordinate vectors of the top left, top right, bottom right, and bottom left vertices of the image before projection transformation are: Px = (px 1 , px 2 , px 3 , px 4 ) T , Py = (py 1 , py 2 , py 3 , py 4 ) T , where px 1 ...px 4 , py 1 ...py 4 are the row and column coordinates of the above four vertices (hereinafter referred to as vertices), and Px and Py are the row coordinate vector and column coordinate vector of the vertices, respectively; assume that the corresponding coordinates after projection transformation are: Qx = (qx 1 , qx 2 , qx 3 , qx 4 ) T , Qy = (qy 1 , qy 2 , qy 3 , qy 4 ) T , where qx 1 ...qx 4 , qy 1 ...qy 4 are the row and column coordinates of the vertices after projection transformation, and Qx and Qy are the row coordinate vector and column coordinate vector of the vertices after projection transformation, respectively; then the projection transformation matrix is MatH = (Qx,Qy,1,1)·(Px,Py,1,1) -1 , and the inverse projection transformation matrix is MatH -1 = {(Qx,Qy,1,1)·(Px,Py,1,1) -1 } -1 ; Step 302: Projection transformation of the image: use the pixel weighted interpolation method to determine the pixel value f(x, y) of the image coordinate point (x, y) after projection transformation.
- The visual processing method for identifying emery particles according to claim 3, characterized in that the image projection transformation of step 302 proceeds as follows: Step 3021: Traverse in sequence the coordinate point (x 0 , y 0 ) corresponding to each pixel in the original image before projection transformation, and determine the coordinate point (x, y) after projection transformation according to the projection transformation matrix; the coordinate point (x, y) after projection transformation is determined by (x, y, 1, 1) = MatH·(x 0 , y 0 , 1, 1); Step 3022: Calculate the pixel value by weighted interpolation, the specific steps being: Step 30221: Calculate, according to the two-point line formula, the analytical expressions f 1 and f 2 of the two diagonals of the four pixels around the coordinate point (x 0 , y 0 ) before projection transformation, where x 1 ...x 4 and y 1 ...y 4 are the row and column coordinates of the vertices around the coordinate point (x 0 , y 0 ) before projection transformation; Step 30222: Calculate, according to the point-slope line formula, the analytical expressions g 1 and g 2 of the two lines that pass through the coordinate point (x 0 , y 0 ) before projection transformation and are perpendicular to the two diagonals, respectively; Step 30223: Solve the simultaneous equations of f 1 with g 1 and of f 2 with g 2 to obtain their intersections (projection points) j 1 and j 2 ; Step 30224: Project the Euclidean distances between the coordinate point (x 0 , y 0 ) before projection transformation and the four surrounding pixels onto their corresponding diagonals; Step 30225: Calculate the pixel value at the coordinate point (x, y) after projection transformation, where the four pixel values around the coordinate point (x 0 , y 0 ) before projection transformation are, in the order of upper left, upper right, lower right, and lower left, T 1 (x 1 , y 1 ), T 2 (x 2 , y 2 ), T 3 (x 3 , y 3 ), T 4 (x 4 , y 4 ); u and v are the Euclidean distances between the coordinate point (x 0 , y 0 ) before projection transformation and the coordinate points T 1 and T 2 , respectively, obtained from the projection distances of step 30224; L is the diagonal length, in which i takes a value of 3 or 4.
- The visual processing method for identifying emery particles according to claim 1, characterized in that step 8 specifically comprises: Step 801. Perform median filtering on the image; Step 802. Subtract the image obtained in step 801 from the image obtained in step 7 to obtain the deviation image g(x, y) of the two images; Step 803. Extract the bright channel of the image according to the local gray value deviation offset of the image before and after the filtering of step 801; the set of bright channel points is: B = {(x, y)|offset(x, y) ≥ d}, where (x, y) is the transformed coordinate point and d is the set threshold.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201811494388.X | 2018-12-07 | ||
CN201811494388.XA CN109636785A (zh) | 2018-12-07 | 2018-12-07 | A visual processing method for identifying emery particles
Publications (1)
Publication Number | Publication Date |
---|---|
WO2020114134A1 true WO2020114134A1 (zh) | 2020-06-11 |
Family
ID=66071960
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2019/112854 WO2020114134A1 (zh) | 2019-10-23 | A visual processing method for identifying emery particles |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN109636785A (zh) |
WO (1) | WO2020114134A1 (zh) |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109636785A (zh) * | 2018-12-07 | 2019-04-16 | Nanjing Estun Robot Engineering Co., Ltd. | A visual processing method for identifying emery particles |
CN113063705B (zh) * | 2021-03-22 | 2022-09-27 | Shaanxi University of Science and Technology | Machine-vision-based quality detection method for emery particles on the surface of diamond wire |
CN113409266A (zh) * | 2021-06-17 | 2021-09-17 | Shaanxi University of Science and Technology | Emery particle detection and counting method and system |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103177458A (zh) * | 2013-04-17 | 2013-06-26 | Beijing Normal University | Region-of-interest detection method for visible-light remote sensing images based on frequency domain analysis |
CN107767385A (zh) * | 2017-08-28 | 2018-03-06 | Jiangsu University of Technology | Machine-vision-based method and device for counting particles on emery wire |
CN109636785A (zh) * | 2018-12-07 | 2019-04-16 | Nanjing Estun Robot Engineering Co., Ltd. | A visual processing method for identifying emery particles |
Family Cites Families (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
- CN1202490C (zh) * | 2003-03-19 | 2005-05-18 | Shanghai Jiao Tong University | Iris texture normalization processing method |
- CN101093538B (zh) * | 2006-06-19 | 2011-03-30 | University of Electronic Science and Technology of China | Iris recognition method based on zero-crossing representation of the wavelet transform |
- DE102009014080B4 (de) * | 2009-03-23 | 2011-12-15 | Baumer Innotec Ag | Device for determining particle sizes |
- CN103020920B (zh) * | 2013-01-10 | 2015-03-25 | Xiamen University | A low-illumination image enhancement method |
- CN103077504B (zh) * | 2013-01-10 | 2015-08-05 | Xiamen University | An image dehazing method based on adaptive illumination calculation |
- CN105046681A (zh) * | 2015-05-14 | 2015-11-11 | Jiangnan University | SoC-based image saliency region detection method |
- KR101767564B1 (ko) * | 2015-11-12 | 2017-08-11 | Sungkyunkwan University Industry-Academic Cooperation Foundation | Image analysis method for rod-shaped particle images |
- CN108171244A (zh) * | 2016-12-07 | 2018-06-15 | Beijing DeePhi Technology Co., Ltd. | Object recognition method and system |
- CN106846263B (zh) * | 2016-12-28 | 2019-11-29 | Changchun Institute of Optics, Fine Mechanics and Physics, Chinese Academy of Sciences | Image dehazing method based on fused channels and immune to the sky |
- CN107478657A (zh) * | 2017-06-20 | 2017-12-15 | Guangdong University of Technology | Machine-vision-based stainless steel surface defect detection method |
- CN108875731B (zh) * | 2017-12-28 | 2022-12-09 | Beijing Megvii Technology Co., Ltd. | Target recognition method, device, system and storage medium |
- CN108226159B (zh) * | 2017-12-29 | 2019-11-22 | Central Iron and Steel Research Institute | Full-field quantitative statistical distribution characterization method for precipitated phase particles in metallic materials |
- 2018-12-07: CN CN201811494388.XA patent/CN109636785A/zh active Pending
- 2019-10-23: WO PCT/CN2019/112854 patent/WO2020114134A1/zh active Application Filing
Also Published As
Publication number | Publication date |
---|---|
CN109636785A (zh) | 2019-04-16 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 19893814 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 19893814 Country of ref document: EP Kind code of ref document: A1 |
|
32PN | Ep: public notification in the ep bulletin as address of the adressee cannot be established |
Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 14.01.2022) |
|