CN107220988B - Part image edge extraction method based on improved canny operator - Google Patents

Part image edge extraction method based on improved canny operator

Info

Publication number
CN107220988B
CN107220988B (application CN201710300398.4A)
Authority
CN
China
Prior art keywords
image
threshold value
gray level
edge
gray
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201710300398.4A
Other languages
Chinese (zh)
Other versions
CN107220988A (en)
Inventor
黄成
金威
宋跃磊
陈嘉
方杰
徐志良
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nanjing University of Science and Technology
Original Assignee
Nanjing University of Science and Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nanjing University of Science and Technology filed Critical Nanjing University of Science and Technology
Priority to CN201710300398.4A priority Critical patent/CN107220988B/en
Publication of CN107220988A publication Critical patent/CN107220988A/en
Application granted granted Critical
Publication of CN107220988B publication Critical patent/CN107220988B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/13Edge detection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20024Filtering details

Abstract

The invention discloses a part image edge extraction method based on an improved Canny operator. A color image of the part is first acquired and converted into a gray-scale image; the gray-scale image is then filtered in the frequency domain using the Fourier transform; the filtered gray-scale image is convolved and sharpened with Prewitt operator templates in 4 directions to obtain the maximum gradient map of the image; the gray-level histogram of the maximum gradient map is then calculated, a threshold T is obtained with the valley-bottom method, the threshold T divides the gray levels into 2 regions, and a high threshold and a low threshold are calculated for the 2 regions with the maximum between-class variance method; finally, the gray-scale image is binarized according to the high and low thresholds, the binary image generated by the high threshold is taken as the edge basis, and the binary image generated by the low threshold is used to connect it, giving a complete edge binary map. The method improves edge extraction accuracy and generates the high and low thresholds adaptively, improving efficiency.

Description

Part image edge extraction method based on improved canny operator
Technical Field
The invention belongs to the field of image processing and machine vision, and particularly relates to a part image edge extraction method based on an improved canny operator.
Background
An image edge is one of the most basic features of an image and contains most of the information of the image. In the field of image processing and computer vision, image edge detection is the most basic technique, and plays an important role in subsequent processing. In the image of the part, the accurate detection of the edge of the part plays an important role in checking the qualification degree of the part.
The image edge detection can be divided into the following 3 steps:
1. Image denoising: when the first- and second-order derivatives of the image gray level are used for edge detection, the derivative calculation is strongly affected by noise, so the image is denoised first to improve edge-detection performance.
2. Edge enhancement: calculating the magnitude of the image gradient highlights the points whose neighborhood or local intensity changes significantly.
3. Detection: in general the gradient magnitude at edge points is relatively large, but in practice many points with a large gradient magnitude are not edges, so a suitable method is needed to determine which points are edge points. Edge detection methods are generally divided into first-order-derivative and second-order-derivative operator methods.
An edge is where the gray level of the image changes significantly, and the derivative is large at such abrupt gray-level changes, so the gray-level change is generally expressed by the magnitude of the gray-level derivative. For an image u(x, y), the gradient at a point (x, y) is defined from the partial derivatives u_x(x, y) and u_y(x, y) as the vector:
∇u(x, y) = [u_x(x, y), u_y(x, y)]^T = [∂u/∂x, ∂u/∂y]^T
The magnitude of the gradient is given by:
|∇u(x, y)| = √(u_x(x, y)² + u_y(x, y)²)
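As a concrete illustration of the gradient magnitude above, the following sketch approximates u_x and u_y with simple first-order finite differences in Python/NumPy; the function name and the difference scheme are illustrative only and are not part of the patented method, which uses the operator templates described later.

```python
import numpy as np

def gradient_magnitude(u):
    """Approximate |grad u(x, y)| with first-order finite differences."""
    u = u.astype(float)
    ux = np.zeros_like(u)
    uy = np.zeros_like(u)
    ux[:, :-1] = np.diff(u, axis=1)   # u_x: difference along the horizontal direction
    uy[:-1, :] = np.diff(u, axis=0)   # u_y: difference along the vertical direction
    return np.sqrt(ux ** 2 + uy ** 2)
```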
Depending on the size and coefficients of the convolution template, many different gradient operators exist.
However, it is always difficult to guarantee both denoising and accurate localization of image edges at the same time: when noise is smoothed, the edge information should not be smoothed away, and when the image is sharpened, only the edges should be sharpened without amplifying the noise. How to balance these two requirements is the core problem of image edge extraction. Meanwhile, most edge-operator templates respond well only to horizontal and vertical edges and extract edges at other angles poorly.
The Canny operator is an optimized operator that combines image smoothing, edge enhancement and detection. During edge detection, the image is first convolved with a Gaussian function to remove noise; a first-order difference template is then used to calculate the direction and gradient magnitude at each pixel; non-maximum suppression highlights the points whose neighborhood intensity changes significantly, achieving edge enhancement; finally, two thresholds are used to detect and connect the edges, yielding the edge image.
However, the conventional first-order difference template responds well only to vertical and horizontal edges and detects edges in other directions poorly. Moreover, when the two thresholds are finally selected, a manual selection method makes it difficult to find the optimal thresholds, which affects the final edge selection and edge connection. Improper threshold selection causes many false edges and noise in the image, interfering with further processing and later use of the image.
Disclosure of Invention
The invention aims to provide a part image edge extraction method based on an improved Canny operator, which is used for automatically generating a single-pixel binary image containing clear target part edges and facilitating the subsequent analysis of each parameter of a part.
The technical solution for realizing the purpose of the invention is as follows: a part image edge extraction method based on an improved canny operator comprises the following steps:
step 1, acquiring a color image of a part, and converting the color image into a gray-scale image;
step 2, carrying out frequency domain filtering on the gray level image;
step 3, carrying out convolution sharpening on the filtered gray level image by using prewitt operator templates in 4 directions to obtain a maximum gradient image of the image;
step 4, calculating a gray level histogram of the maximum gradient map, obtaining a threshold value T by using a valley bottom method, dividing the gray level map into 2 regions by using the threshold value T, and calculating a high threshold value and a low threshold value by using a maximum inter-class variance method for the 2 regions respectively;
step 5, binarizing the gray level image according to the high and low threshold values, taking the binary image generated by the high threshold value as an edge basis, and connecting the edge basis by using the binary image generated by the low threshold value to obtain a complete edge binary image.
The step 2 of performing frequency domain filtering specifically comprises the following steps:
step 2.1, applying a fast Fourier transform to the gray-scale image to obtain F(u, v), and moving the zero-frequency point of F(u, v) to the center of the spectrogram;
step 2.2, calculating the product G(u, v) of the Gaussian filter function H(u, v) and F(u, v), and shifting the zero-frequency point of the spectrum G(u, v) back to the upper-left corner of the spectrogram, wherein
The gaussian filter function is expressed as:
H(u, v) = exp(-D²(u, v) / (2σ²)),  D(u, v) = √((u - M/2)² + (v - N/2)²)
where u and v are frequency-domain variables, M and N are the number of horizontal and vertical pixels of the image respectively, and σ is the standard deviation of the Gaussian function;
the spectrum is represented as:
G(u,v)=H(u,v)×F(u,v)
and 2.3, performing inverse discrete Fourier transform on G (u, v) to obtain G (x, y), and taking a real part of G (x, y) as a filtered result image.
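A minimal sketch of the frequency-domain filtering of step 2, written with NumPy's FFT routines, is given below; the function name and the value of σ are illustrative assumptions, since the patent does not fix a particular σ.

```python
import numpy as np

def gaussian_lowpass_freq(gray, sigma=30.0):
    """Step 2 sketch: FFT, Gaussian low-pass in the frequency domain, inverse FFT."""
    M, N = gray.shape
    F = np.fft.fftshift(np.fft.fft2(gray.astype(float)))  # step 2.1: zero frequency to the center
    u = np.arange(M).reshape(-1, 1) - M / 2.0
    v = np.arange(N).reshape(1, -1) - N / 2.0
    H = np.exp(-(u ** 2 + v ** 2) / (2.0 * sigma ** 2))   # Gaussian transfer function H(u, v)
    G = H * F                                             # step 2.2: G(u, v) = H(u, v) x F(u, v)
    g = np.fft.ifft2(np.fft.ifftshift(G))                 # shift zero frequency back, inverse DFT (step 2.3)
    return np.real(g)                                     # real part is the filtered image
```

A larger σ keeps more high-frequency detail, while a smaller σ smooths more aggressively.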
The specific steps for obtaining the maximum gradient map in step 3 are as follows:
step 3.1, convolving the filtered gray-scale image with the 45°, 135°, horizontal and vertical templates of the Prewitt operator respectively;
step 3.2, taking the maximum of the 4 convolution results at each pixel as its gradient value, the direction corresponding to that maximum being the maximum gradient direction, which yields the maximum gradient image.
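The 4-direction sharpening of step 3 can be sketched as below; the exact template coefficients of the patent are those of FIG. 7, so the kernels here are only a commonly used 4-direction Prewitt set, and the function name is illustrative.

```python
import numpy as np
from scipy.ndimage import convolve

# A commonly used 4-direction Prewitt set (horizontal gradient, vertical gradient, 45°, 135°);
# the coefficients actually used by the patent are those shown in FIG. 7.
PREWITT_TEMPLATES = [
    np.array([[-1, 0, 1], [-1, 0, 1], [-1, 0, 1]], float),
    np.array([[-1, -1, -1], [0, 0, 0], [1, 1, 1]], float),
    np.array([[0, 1, 1], [-1, 0, 1], [-1, -1, 0]], float),
    np.array([[1, 1, 0], [1, 0, -1], [0, -1, -1]], float),
]

def max_gradient(gray):
    """Step 3 sketch: convolve with the 4 templates and keep, per pixel, the
    largest absolute response as the gradient value (step 3.2)."""
    responses = np.stack([np.abs(convolve(gray.astype(float), k, mode='nearest'))
                          for k in PREWITT_TEMPLATES])
    q = responses.max(axis=0)              # maximum gradient map Q(x, y)
    direction = responses.argmax(axis=0)   # index of the maximum gradient direction
    return q, direction
```

Taking the per-pixel maximum over the four responses is what gives the operator a response in 8 edge directions, as noted in the advantages below.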
The specific method for calculating the low threshold in step 4 is as follows:
Let Q(x, y) be the maximum gradient map with gray-level range [0, L-1]. If the valley-bottom method gives the threshold T, then T divides Q(x, y) into the 2 ranges [0, T] and [T, L-1]. Let n_i be the number of pixels in the image with gray level i; the total number of pixels in the gray-level range [0, T] is:
N = Σ_{i=0}^{T} n_i
The probability of each gray level occurring is:
p_i = n_i / N
Within [0, T], a threshold T_1 divides the range into 2 classes C_0 and C_1, where C_0 consists of the gray levels [0, T_1-1] and C_1 consists of [T_1, T]. The probabilities of regions C_0 and C_1 are respectively:
P_0 = Σ_{i=0}^{T_1-1} p_i
P_1 = 1 - P_0
The average gray levels of C_0 and C_1 are:
μ_0 = (1/P_0) Σ_{i=0}^{T_1-1} i·p_i
μ_1 = (1/P_1) Σ_{i=T_1}^{T} i·p_i
and μ is the average gray level over [0, T]:
μ = P_0·μ_0 + P_1·μ_1
The between-class variance of the two regions is:
σ_B² = P_0·(μ_0 - μ)² + P_1·(μ_1 - μ)² = P_0·P_1·(μ_0 - μ_1)²
Letting T_1 take each value in [1, T] in turn, the T_1 that maximizes σ_B² is the optimal choice of the low threshold.
Similarly, repeating the above steps on [T, L-1] gives the optimal choice of the high threshold.
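A sketch of this adaptive threshold selection follows; the patent does not spell out how the valley bottom is located, so the peak-and-valley search below is one simple reading of the valley-bottom method, the helper names are illustrative, and the gradient map is assumed to have been scaled to integer gray levels in [0, L-1].

```python
import numpy as np

def valley_threshold(hist, smooth=5):
    """One simple reading of the valley-bottom method: smooth the histogram,
    take its two highest local peaks, and return the lowest bin between them."""
    h = np.convolve(hist.astype(float), np.ones(smooth) / smooth, mode='same')
    peaks = [i for i in range(1, len(h) - 1) if h[i] >= h[i - 1] and h[i] >= h[i + 1]]
    if len(peaks) < 2:
        return len(h) // 2
    p1, p2 = sorted(sorted(peaks, key=lambda i: h[i])[-2:])  # two highest peaks, left to right
    return p1 + int(np.argmin(h[p1:p2 + 1]))

def otsu_in_range(hist, lo, hi):
    """Maximum between-class variance restricted to gray levels [lo, hi]."""
    levels = np.arange(lo, hi + 1)
    p = hist[lo:hi + 1].astype(float)
    if p.sum() == 0:
        return lo
    p /= p.sum()                                   # p_i within the sub-range
    best_t, best_var = lo, -1.0
    for t in levels[1:]:                           # candidate split T_1
        P0 = p[:t - lo].sum()
        P1 = 1.0 - P0
        if P0 == 0.0 or P1 == 0.0:
            continue
        mu0 = (levels[:t - lo] * p[:t - lo]).sum() / P0
        mu1 = (levels[t - lo:] * p[t - lo:]).sum() / P1
        var = P0 * P1 * (mu0 - mu1) ** 2           # sigma_B^2 = P0*P1*(mu0 - mu1)^2
        if var > best_var:
            best_var, best_t = var, int(t)
    return best_t

def adaptive_thresholds(q, L=256):
    """Step 4 sketch: valley-bottom split of the gradient histogram, then the
    between-class-variance criterion in [0, T] and [T, L-1]."""
    hist, _ = np.histogram(q, bins=L, range=(0, L))
    T = valley_threshold(hist)
    low = otsu_in_range(hist, 0, T)
    high = otsu_in_range(hist, T, L - 1)
    return low, high
```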
Compared with the prior art, the invention has the following advantages: 1) the image is pre-processed by Gaussian filtering, which filters out the noise well; 2) in the sharpening operator, Prewitt operators in four directions are used as gradient templates, adding 45° and 135° templates to the usual vertical and horizontal ones, so that edges in 8 directions all obtain a good response and the edge extraction accuracy is improved; 3) for the selection of the high and low binarization thresholds, the histogram valley-bottom method is combined with the maximum between-class variance method, so that the high and low thresholds are generated adaptively, improving both efficiency and accuracy.
Drawings
Fig. 1 is an overall schematic diagram of an image edge extraction process.
Fig. 2 is a schematic diagram of a frequency domain filtering process.
Fig. 3 is a schematic diagram of an edge sharpening process.
FIG. 4 is a diagram illustrating adaptive threshold selection and final edge binarization.
FIG. 5 is a graph of edge binary values generated by the method of the present invention.
FIG. 6 is a graph of edge binary values generated using a conventional method.
FIG. 7 is a diagram of 4 templates of the Prewitt operator.
Detailed Description
The invention will be further explained with reference to the drawings.
In the part image edge extraction method based on the improved Canny operator, an industrial camera is first switched on and connected to a PC, the part to be photographed is fixed, and the focal length is adjusted until a clear picture can be captured. Once set up, a clear image of the part is taken and transmitted to the computer processing system over Ethernet; the processing platform is visual studio 2010 + opencv2.4.10. The image processing steps are then carried out:
Step 1, converting the captured RGB color image into a gray-scale image, and treating the gray values of the gray-scale image as a function f(x, y).
Step 2, carrying out frequency domain filtering on the gray level image, and specifically comprising the following steps:
Step 2.1, calculating the discrete Fourier transform of f(x, y); since direct implementation of the DFT is inefficient, a decimation-in-time fast Fourier transform (DIT-FFT) is adopted to obtain F(u, v), and the zero-frequency point of F(u, v) is moved to the center of the spectrogram;
step 2.2, calculating a product G (u, v) of the filter functions H (u, v) and F (u, v), namely a frequency spectrum, wherein a Gaussian function is adopted in the method:
H(u, v) = exp(-D²(u, v) / (2σ²)),  D(u, v) = √((u - M/2)² + (v - N/2)²)
where u and v are frequency-domain variables, M and N are the number of horizontal and vertical pixels of the image respectively, and σ is the standard deviation of the Gaussian function.
The spectrum is the product of the filter functions H (u, v) and F (u, v):
G(u,v)=H(u,v)×F(u,v)
then, the zero frequency point of the frequency spectrum G (u, v) is shifted back to the position of the upper left corner of the frequency spectrum graph;
and 2.3, calculating Inverse Discrete Fourier Transform (IDFT) on G (u, v) in the step 2.2 to obtain G (x, y), and taking a real part of the G (x, y) as a filtered result image.
Step 3, performing convolution sharpening on the filtered gray-scale image with Prewitt operator templates in 4 directions to obtain the maximum gradient map of the image, with the following specific steps:
Step 3.1, convolving the filtered gray-scale image g(x, y) with the 45°, 135°, horizontal and vertical templates of the Prewitt operator respectively; the 4 Prewitt templates are shown in FIG. 7;
Step 3.2, taking the maximum of the 4 convolution results at each pixel as its gradient value, the direction corresponding to that maximum being the maximum gradient direction, which after processing yields the maximum gradient image Q(x, y).
Step 4, binarizing the maximum gradient image according to high and low thresholds. Traditional threshold selection is done manually, which requires many trials and can generate false edges. The method selects the high and low thresholds adaptively by combining the histogram valley-bottom method with the maximum between-class variance method, with the following specific steps:
Step 4.1, calculating the gray-level histogram of the maximum gradient image Q(x, y) and obtaining a threshold T with the valley-bottom method; if the gray-level range of Q(x, y) is [0, L-1], the threshold T divides Q(x, y) into the 2 ranges [0, T] and [T, L-1];
Step 4.2, applying the maximum between-class variance method to each of the 2 regions to calculate the high and low thresholds, the specific method being as follows:
Let n_i be the number of pixels in the image with gray level i; the total number of pixels in the gray-level range [0, T] is:
N = Σ_{i=0}^{T} n_i
The probability of each gray level occurring is:
p_i = n_i / N
Within [0, T], a threshold T_1 divides the range into 2 classes C_0 and C_1, where C_0 consists of the gray levels [0, T_1-1] and C_1 consists of [T_1, T]. The probabilities of regions C_0 and C_1 are respectively:
P_0 = Σ_{i=0}^{T_1-1} p_i
P_1 = 1 - P_0
The average gray levels of C_0 and C_1 are:
μ_0 = (1/P_0) Σ_{i=0}^{T_1-1} i·p_i
μ_1 = (1/P_1) Σ_{i=T_1}^{T} i·p_i
and μ is the average gray level over [0, T]:
μ = P_0·μ_0 + P_1·μ_1
The between-class variance of the two regions is:
σ_B² = P_0·(μ_0 - μ)² + P_1·(μ_1 - μ)² = P_0·P_1·(μ_0 - μ_1)²
Letting T_1 take each value in [1, T] in turn, the T_1 that maximizes σ_B² is the optimal choice of the low threshold;
Similarly, repeating the above steps on [T, L-1] gives the optimal choice of the high threshold; the specific method is not repeated.
Step 5, obtaining two threshold edge images N_1[i, j] and N_2[i, j] from the high and low thresholds of step 4, where N_1[i, j] is produced by the low threshold and N_2[i, j] by the high threshold. Because N_2[i, j] comes from the high threshold, its edges are essentially genuine but contain breaks, so edge points are searched for in the 8-neighborhood of each break within N_1[i, j] until the edges of N_2[i, j] are connected, giving a complete edge binary map.
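The edge linking of step 5 can be sketched as a breadth-first growth from the high-threshold pixels into 8-connected low-threshold pixels; the function name is illustrative, and the thresholds are assumed to come from a routine such as the step 4 sketch above.

```python
import numpy as np
from collections import deque

def link_edges(q, low, high):
    """Step 5 sketch: pixels above `high` seed the edge map (N2); pixels above
    `low` (N1) are accepted only when they 8-connect to an accepted edge pixel."""
    strong = q >= high
    weak = q >= low
    edges = strong.copy()
    queue = deque(zip(*np.nonzero(strong)))        # start from every strong edge pixel
    H, W = q.shape
    while queue:
        y, x = queue.popleft()
        for dy in (-1, 0, 1):                      # scan the 8-neighborhood
            for dx in (-1, 0, 1):
                ny, nx = y + dy, x + dx
                if 0 <= ny < H and 0 <= nx < W and weak[ny, nx] and not edges[ny, nx]:
                    edges[ny, nx] = True
                    queue.append((ny, nx))
    return (edges * 255).astype(np.uint8)          # complete edge binary map
```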
As can be seen from fig. 5 and 6, the edge binary image generated by the conventional prewitt method has a lot of noise and many false edges, and an accurate single-pixel edge cannot be obtained.
The edge binary image generated by the method of the invention effectively filters out most of the noise, reduces the generation of false edges, and produces accurate single-pixel edges.

Claims (3)

1. The part image edge extraction method based on the improved canny operator is characterized by comprising the following steps of:
step 1, acquiring a color image of a part, and converting the color image into a gray-scale image;
step 2, carrying out frequency domain filtering on the gray level image;
step 3, carrying out convolution sharpening on the filtered gray level image by using prewitt operator templates in 4 directions to obtain a maximum gradient image of the image;
step 4, calculating a gray level histogram of the maximum gradient map, obtaining a threshold value T by using a valley bottom method, dividing the gray level map into 2 regions by using the threshold value T, and calculating a high threshold value and a low threshold value by using a maximum inter-class variance method for the 2 regions respectively;
step 5, binarizing the gray level image according to the high and low threshold values, taking the binary image generated by the high threshold value as an edge basis, and connecting the edge basis by using the binary image generated by the low threshold value to obtain a complete edge binary image;
the specific method for calculating the low threshold in the step 4 is as follows:
let Q (x, y) be the maximum gradient map, and the gray scale range of Q (x, y) be [0, L-1%]If the threshold value is T by using the valley bottom method, the threshold value T divides Q (x, y) into 2 ranges [0, T ]]And [ T, L-1]]Let n be the number of pixels in the image with i gray scaleiIn the gray scale range of [0, T]The total number of pixels in the interior is:
Figure FDA0002449158550000011
the probability of each gray level occurrence is:
Figure FDA0002449158550000012
at [0, T]In the threshold value T1Divide it into class 2C0And C1,C0From [0, T1-1]Composition C of1From [ T ]1,T]Composition, then region C0And C1The probabilities of (c) are respectively:
Figure FDA0002449158550000013
P1=1-P0
C0and C1The average gray levels of (a) are:
Figure FDA0002449158550000014
Figure FDA0002449158550000021
where μ is the average gray scale of [0, T ]:
μ=P0μ0+P1μ1
the total variance of the two regions is:
σB 2=P00-μ)2+P11-μ)2=P0P101)2
let T be [0, T ]1-1]Take values in turn, so that sigmaB 2The maximum value of T is the best choice for the low threshold.
2. The method for extracting the edge of the image based on the modified Canny operator as claimed in claim 1, wherein the step 2 of performing the frequency domain filtering specifically comprises the following steps:
step 2.1, obtaining F (u, v) by adopting a fast Fourier algorithm to the gray-scale image, and moving a zero-frequency point of the F (u, v) to the central position of the spectrogram;
step 2.2, calculate the product G (u, v) of the Gaussian function filter function H (u, v) and F (u, v), shift the zero frequency point of the spectrum G (u, v) back to the upper left corner of the spectrogram, wherein
The gaussian filter function is expressed as:
H(u, v) = exp(-D²(u, v) / (2σ²)),  D(u, v) = √((u - M/2)² + (v - N/2)²)
wherein u and v are frequency-domain variables, M and N are the number of horizontal and vertical pixels of the image respectively, and σ is the standard deviation of the Gaussian function;
the spectrum is represented as:
G(u,v)=H(u,v)×F(u,v)
and 2.3, performing inverse discrete Fourier transform on G (u, v) to obtain G (x, y), and taking a real part of G (x, y) as a filtered result image.
3. The method for extracting the image edge based on the modified Canny operator as claimed in claim 1, wherein the step 3 of obtaining the maximum gradient map comprises the following specific steps:
step 3.1, respectively convolving the filtered gray level images by utilizing templates in 45 degrees, 135 degrees, horizontal and vertical directions of prewitt;
and 3.2, taking the maximum value of the 4 convolution results of each pixel as a gradient value, wherein the direction corresponding to the maximum value is the maximum gradient direction, and obtaining the maximum gradient image.
CN201710300398.4A 2017-04-30 2017-04-30 Part image edge extraction method based on improved canny operator Active CN107220988B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710300398.4A CN107220988B (en) 2017-04-30 2017-04-30 Part image edge extraction method based on improved canny operator

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710300398.4A CN107220988B (en) 2017-04-30 2017-04-30 Part image edge extraction method based on improved canny operator

Publications (2)

Publication Number Publication Date
CN107220988A CN107220988A (en) 2017-09-29
CN107220988B true CN107220988B (en) 2020-09-18

Family

ID=59943716

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710300398.4A Active CN107220988B (en) 2017-04-30 2017-04-30 Part image edge extraction method based on improved canny operator

Country Status (1)

Country Link
CN (1) CN107220988B (en)

Families Citing this family (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108109155A (en) * 2017-11-28 2018-06-01 东北林业大学 A kind of automatic threshold edge detection method based on improvement Canny
CN109101976B (en) * 2018-07-10 2021-11-30 温州大学 Method for detecting surface defects of arc-extinguishing grid plate
CN110378866B (en) * 2019-05-22 2021-04-06 中国水利水电科学研究院 Channel lining damage image identification method based on unmanned aerial vehicle inspection
CN111027567A (en) * 2019-10-30 2020-04-17 四川轻化工大学 Edge extraction method based on algorithm learning
CN111127498B (en) * 2019-12-12 2023-07-25 重庆邮电大学 Canny edge detection method based on edge self-growth
CN111062317A (en) * 2019-12-16 2020-04-24 中国计量大学上虞高等研究院有限公司 Method and system for cutting edges of scanned document
CN111681256B (en) * 2020-05-07 2023-08-18 浙江大华技术股份有限公司 Image edge detection method, image edge detection device, computer equipment and readable storage medium
CN112508795A (en) * 2020-11-26 2021-03-16 珠海格力电器股份有限公司 Image contour detection method and device
CN112488940A (en) * 2020-11-30 2021-03-12 哈尔滨市科佳通用机电股份有限公司 Method for enhancing image edge of railway locomotive component
CN112381850A (en) * 2020-12-04 2021-02-19 亿嘉和科技股份有限公司 Cabinet surface circular target automatic segmentation method, system, device and storage medium
CN112669265B (en) * 2020-12-17 2022-06-21 华中科技大学 Method for realizing surface defect detection based on Fourier transform and image gradient characteristics
CN113850809A (en) * 2021-12-01 2021-12-28 武汉飞恩微电子有限公司 Method for detecting appearance defects of protective film of chip resistor
CN114331923B (en) * 2022-03-11 2022-05-13 中国空气动力研究与发展中心低速空气动力研究所 Improved Canny algorithm-based bubble profile extraction method in ice structure
CN115131387B (en) * 2022-08-25 2023-01-24 山东鼎泰新能源有限公司 Gasoline engine spray wall collision parameter automatic extraction method and system based on image processing

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102521836A (en) * 2011-12-15 2012-06-27 江苏大学 Edge detection method based on gray-scale image of specific class
CN104700421A (en) * 2015-03-27 2015-06-10 中国科学院光电技术研究所 Adaptive threshold edge detection algorithm based on canny
CN105354815A (en) * 2015-09-12 2016-02-24 沈阳理工大学 Flat micro-part based accurate identification and positioning method

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8306355B2 (en) * 2009-07-13 2012-11-06 Sharp Laboratories Of America, Inc. Methods and systems for reducing compression artifacts

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102521836A (en) * 2011-12-15 2012-06-27 江苏大学 Edge detection method based on gray-scale image of specific class
CN104700421A (en) * 2015-03-27 2015-06-10 中国科学院光电技术研究所 Adaptive threshold edge detection algorithm based on canny
CN105354815A (en) * 2015-09-12 2016-02-24 沈阳理工大学 Flat micro-part based accurate identification and positioning method

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
"基于Canny算子的机械零件图像分块自适应边缘检测";朱晓林 等;《江南大学学报(自然科学版)》;20100630;第9卷(第3期);第304-307页 *

Also Published As

Publication number Publication date
CN107220988A (en) 2017-09-29

Similar Documents

Publication Publication Date Title
CN107220988B (en) Part image edge extraction method based on improved canny operator
CN107808378B (en) Method for detecting potential defects of complex-structure casting based on vertical longitudinal and transverse line profile features
CN110349207B (en) Visual positioning method in complex environment
CN111080661B (en) Image-based straight line detection method and device and electronic equipment
CN100474337C (en) Noise-possessing movement fuzzy image restoration method based on radial basis nerve network
CN109741356B (en) Sub-pixel edge detection method and system
WO2021109697A1 (en) Character segmentation method and apparatus, and computer-readable storage medium
CN113034452B (en) Weldment contour detection method
CN105139391B (en) A kind of haze weather traffic image edge detection method
CN107403435B (en) Color image edge extraction method based on RGB color space
CN110414308B (en) Target identification method for dynamic foreign matters on power transmission line
CN109584198B (en) Method and device for evaluating quality of face image and computer readable storage medium
CN114399522A (en) High-low threshold-based Canny operator edge detection method
CN116542982A (en) Departure judgment device defect detection method and device based on machine vision
CN113780110A (en) Method and device for detecting weak and small targets in image sequence in real time
CN111179186A (en) Image denoising system for protecting image details
CN106599891A (en) Remote sensing image region-of-interest rapid extraction method based on scale phase spectrum saliency
CN107545549B (en) Method for estimating scattered focus point spread function based on one-dimensional spectrum curve
CN117094975A (en) Method and device for detecting surface defects of steel and electronic equipment
CN104036461A (en) Infrared complicated background inhibiting method based on combined filtering
CN114943744A (en) Edge detection method based on local Otsu thresholding
CN106920266B (en) The Background Generation Method and device of identifying code
CN113781413A (en) Electrolytic capacitor positioning method based on Hough gradient method
CN108269264B (en) Denoising and fractal method of bean kernel image
CN111311610A (en) Image segmentation method and terminal equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant