CN114943744A - Edge detection method based on local Otsu thresholding - Google Patents

Edge detection method based on local Otsu thresholding

Info

Publication number
CN114943744A
CN114943744A
Authority
CN
China
Prior art keywords
image
edge
edge detection
threshold
gradient
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210457562.3A
Other languages
Chinese (zh)
Inventor
李昌利
潘志庚
王超
周先春
蔡创新
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nanjing University of Information Science and Technology
Original Assignee
Nanjing University of Information Science and Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nanjing University of Information Science and Technology filed Critical Nanjing University of Information Science and Technology
Priority to CN202210457562.3A priority Critical patent/CN114943744A/en
Publication of CN114943744A publication Critical patent/CN114943744A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/13Edge detection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F17/00Digital computing or data processing equipment or methods, specially adapted for specific functions
    • G06F17/10Complex mathematical operations
    • G06F17/18Complex mathematical operations for evaluating statistical data, e.g. average values, frequency distributions, probability functions, regression analysis
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/70Denoising; Smoothing
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/11Region-based segmentation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/136Segmentation; Edge detection involving thresholding
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/60Analysis of geometric attributes
    • G06T7/62Analysis of geometric attributes of area, perimeter, diameter or volume

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Data Mining & Analysis (AREA)
  • Mathematical Analysis (AREA)
  • Mathematical Physics (AREA)
  • Pure & Applied Mathematics (AREA)
  • Mathematical Optimization (AREA)
  • Computational Mathematics (AREA)
  • Geometry (AREA)
  • Evolutionary Biology (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Operations Research (AREA)
  • Probability & Statistics with Applications (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Algebra (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Databases & Information Systems (AREA)
  • Software Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses an edge detection method based on local Otsu thresholding. The original image is first uniformly partitioned into four regions of equal area; the Otsu algorithm is then applied to each region to obtain four thresholds; each threshold is then used to threshold its own region, yielding a binary image; finally, an edge detection operator is applied to the binary image to obtain an edge image. The method improves edge detection quality, effectively filters noise, retains key edges, and produces single-pixel, closed edge lines with good continuity, thereby providing a good foundation for feature extraction, target recognition, and the like.

Description

Edge detection method based on local Otsu thresholding
Technical Field
The invention relates to an edge detection method based on local Otsu thresholding, and belongs to the technical field of edge detection.
Background
Broadly speaking, digital image processing is divided into three levels, from low to high. The first is image processing in the narrow sense, where both the input and the output are complete images; specific details are enhanced or the image quality is improved on the basis of the original image, providing a foundation for subsequent processing. The second is image analysis, which analyzes and processes only part of the image; the important information of the image is usually contained in that part, so the goal is both to extract as much of the key information as possible and to reduce the influence of the surrounding area on it. The third is image recognition and understanding, which builds on image analysis.
Image edge detection belongs to the level of image analysis. In practice, when processing most images we do not need to attend to the whole image; we usually use certain features to ignore the parts we are not interested in, i.e. the background region, and attend only to the feature region, i.e. the foreground region. Edge detection is one of the key steps in extracting image features in digital image processing. This is consistent with human vision: when an unknown object appears in a person's field of view, a judgment is usually made from its outline. Since a contour can be formed by connecting several edge segments, successfully detecting the edges of an image in digital image processing makes image recognition easier and simplifies the information that must be handled in subsequent processing. The results of edge detection are typically fed into higher-level image processing techniques such as feature recognition, image recognition, and image compression for further analysis and understanding.
Looking at how edge detection algorithms have evolved, the difficulty of edge detection has always been the same: valid edges must be detected correctly, guaranteeing accurate results and complete detail, while the detector response should preferably be a single pixel wide and the influence of noise should be minimized. However, if missed edges are to be reduced as much as possible, the influence of noise is unavoidable: edge lines become too wide, false edges may even be produced, and true and false edges become hard to distinguish. Conversely, if the influence of noise is minimized, it is difficult to guarantee that image detail is not lost. In practice these two requirements are often contradictory, and especially for complex images it is difficult to obtain good edges with a simple edge detection algorithm alone.
Because of this difficulty, edge detection operators usually require a preprocessing step. In this preprocessing step, we strive to reduce the noise in the image as much as possible without losing its edge information. A study of edge detection operators shows that the concept of a threshold appears throughout: the selected threshold usually serves as the criterion for deciding whether a point is an edge point, and it can balance the conflict between valid edges and noise. If the threshold is chosen properly, noise in the image can be reduced effectively; if it is chosen poorly, large areas of image detail are easily lost. How to choose an appropriate threshold is itself a difficulty.
Using a threshold computed automatically by an algorithm in advance, image thresholding can serve as the preprocessing step for edge detection and effectively avoids the inefficiency of selecting a threshold manually. The maximum between-class variance (Otsu) method is one of the most classical thresholding algorithms. It computes the optimal threshold by itself and, compared with other thresholding algorithms, is highly practical and simple to compute, so it is widely used.
However, when the image contains multiple targets, uneven illumination and the like, a single threshold obviously cannot satisfy the requirements of subsequent edge detection. The maximum between-class variance method has the following shortcoming: it traverses the image and thresholds it according to between-class and within-class variance, and usually yields only one optimal threshold.
Disclosure of Invention
In order to solve the above problems, the invention provides an edge detection method based on local Otsu thresholding.
The edge detection method based on local Otsu thresholding comprises the following specific steps:
S1, inputting an image, which may be a grayscale image or a color image, and equally dividing it into four regions of equal area;
and S2, calculating the threshold value of each block of region by adopting an Otsu (maximum inter-class variance) algorithm to obtain four threshold values.
S3, performing thresholding processing on the respective region by using each threshold value to obtain a binary image;
and S4, finally, carrying out edge detection on the binary image by using an edge detection operator to obtain an edge image.
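To make steps S1-S4 concrete, the sketch below strings them together in Python with OpenCV and NumPy. The function name local_otsu_edges, the use of OpenCV's built-in Otsu implementation, and the choice of the Canny operator with thresholds (50, 150) are assumptions of this illustration, not details fixed by the method.
```python
# Minimal sketch of steps S1-S4, assuming an 8-bit grayscale input and OpenCV/NumPy.
import cv2
import numpy as np

def local_otsu_edges(gray: np.ndarray) -> np.ndarray:
    """Partition into four equal quadrants, Otsu-threshold each, then run an edge detector."""
    h, w = gray.shape
    binary = np.zeros_like(gray)
    # S1: split the image into four regions of equal area.
    for ys, ye in ((0, h // 2), (h // 2, h)):
        for xs, xe in ((0, w // 2), (w // 2, w)):
            block = gray[ys:ye, xs:xe]
            # S2 + S3: Otsu threshold for this block, then binarize the block with it.
            _, binary[ys:ye, xs:xe] = cv2.threshold(
                block, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    # S4: edge detection on the binary image (Canny chosen here for illustration).
    return cv2.Canny(binary, 50, 150)
```
For a color input, the image would first be converted to grayscale (e.g. gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)), matching the option allowed in S1.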
Further, the Otsu algorithm in S2 consists of the following steps:
(1) Let the gray levels of the image be G = {0, 1, …, 255}, let f_i denote the number of pixels with gray value i, and let N denote the total number of pixels; then
N = Σ_{i=0}^{255} f_i
(2) Let P_i denote the probability that a pixel in the image has gray value i, so that
P_i = f_i / N
(3) Select a threshold t and divide the pixels of the image into two classes C_0 and C_1, where C_0 contains all pixels whose gray values lie in [0, t] and C_1 contains all pixels whose gray values lie in [t+1, 255];
(4) Calculate the probabilities P_0 and P_1 of the two classes of pixels:
P_0 = Σ_{i=0}^{t} P_i,  P_1 = 1 − P_0
(5) Calculate the average gray values μ_0 and μ_1 of the two classes and the average gray value μ of the whole image:
μ_0 = (1/P_0) Σ_{i=0}^{t} i·P_i,  μ_1 = (1/P_1) Σ_{i=t+1}^{255} i·P_i,  μ = Σ_{i=0}^{255} i·P_i = P_0·μ_0 + P_1·μ_1
(6) Calculate the between-class variance of C_0 and C_1:
σ² = P_0(μ − μ_0)² + P_1(μ − μ_1)² = P_0·P_1·(μ_0 − μ_1)²  (14)
(7) Traverse the gray values of the image, take each gray value in turn as the threshold and compute the corresponding between-class variance; the threshold that maximizes the between-class variance is the optimal threshold.
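The steps above translate almost line for line into the following histogram-based sketch (pure NumPy; the function name otsu_threshold and the float64 histogram are assumptions made only for this example):
```python
import numpy as np

def otsu_threshold(gray: np.ndarray) -> int:
    """Return the threshold t that maximizes the between-class variance, per steps (1)-(7)."""
    # Steps (1)-(2): histogram f_i and probabilities P_i.
    f = np.bincount(gray.ravel(), minlength=256).astype(np.float64)
    P = f / f.sum()
    i = np.arange(256)
    best_t, best_var = 0, -1.0
    for t in range(256):                           # step (7): try every gray value as threshold
        P0 = P[:t + 1].sum()                       # step (4)
        P1 = 1.0 - P0
        if P0 == 0.0 or P1 == 0.0:
            continue
        mu0 = (i[:t + 1] * P[:t + 1]).sum() / P0   # step (5): class means
        mu1 = (i[t + 1:] * P[t + 1:]).sum() / P1
        var = P0 * P1 * (mu0 - mu1) ** 2           # step (6): between-class variance
        if var > best_var:
            best_var, best_t = var, t
    return best_t
```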
Further, the thresholding in S3 is specifically as follows:
Let the original image be f(x, y) and the thresholded image be g(x, y); then
g(x, y) = 1 if f(x, y) > T, and g(x, y) = 0 if f(x, y) ≤ T,
where T is the threshold computed for the region containing (x, y).
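As a minimal illustration, the definition above is a single comparison per pixel; the sketch below assumes f is one block of the image and T the Otsu threshold computed for that block, and produces a 0/1 image (scaling to 0/255 for display would be equally valid):
```python
import numpy as np

def threshold_block(f: np.ndarray, T: int) -> np.ndarray:
    """g(x, y) = 1 where f(x, y) > T, else 0."""
    return (f > T).astype(np.uint8)
```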
Further, the edge detection operators in S4 are specifically the Sobel operator and the Canny operator.
Further, the Sobel operator is embodied as follows:
The Sobel operator increases the weight given to the pixels in the four neighborhoods of the central pixel. The weighted differences of the gray values in the neighborhoods above, below, to the left of and to the right of the pixel being processed are combined; this reaches an extreme value at an edge and highlights the edge information of the image, thereby detecting the edge. Its expression is:
G = √(G_x² + G_y²)
where
G_x = [f(x+1, y−1) + 2f(x+1, y) + f(x+1, y+1)] − [f(x−1, y−1) + 2f(x−1, y) + f(x−1, y+1)]  (16)
G_y = [f(x−1, y+1) + 2f(x, y+1) + f(x+1, y+1)] − [f(x−1, y−1) + 2f(x, y−1) + f(x+1, y−1)]  (17)
The two templates are respectively:
S_x =
  −1  0  +1
  −2  0  +2
  −1  0  +1
S_y =
  −1 −2 −1
   0  0  0
  +1 +2 +1
The two templates represent the horizontal gradient and the vertical gradient of the image, respectively.
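A short sketch of this computation is given below, using cv2.filter2D (which performs correlation, so the two templates are applied exactly as written); the kernel names SX and SY and the float64 output are choices made for this example only:
```python
import cv2
import numpy as np

# The two Sobel templates: horizontal gradient (SX) and vertical gradient (SY).
SX = np.array([[-1, 0, 1],
               [-2, 0, 2],
               [-1, 0, 1]], dtype=np.float64)
SY = np.array([[-1, -2, -1],
               [ 0,  0,  0],
               [ 1,  2,  1]], dtype=np.float64)

def sobel_magnitude(img: np.ndarray) -> np.ndarray:
    """G = sqrt(Gx^2 + Gy^2), computed by correlating the image with the two templates."""
    gx = cv2.filter2D(img.astype(np.float64), cv2.CV_64F, SX)
    gy = cv2.filter2D(img.astype(np.float64), cv2.CV_64F, SY)
    return np.sqrt(gx ** 2 + gy ** 2)
```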
Further, the Canny operator is divided into the following 4 steps:
(1) First the image is smoothed with the first derivative of a two-dimensional Gaussian function.
Let the two-dimensional Gaussian function be:
G(x, y) = (1 / (2πσ²)) · exp(−(x² + y²) / (2σ²))
where σ is the standard deviation (scale) parameter of the Gaussian function. As when the LoG operator is applied to edge detection, the size of this parameter directly affects how strongly the image is smoothed, so it must be chosen appropriately for the actual situation.
(2) Then the gradient magnitude and gradient direction of every pixel in the image are calculated. The first directional derivative of the Gaussian function in a direction n is
G_n = ∂G/∂n = n · ∇G
where
n = [cos θ, sin θ]^T is the direction vector and
∇G = [∂G/∂x, ∂G/∂y]^T is the gradient vector.
The Canny operator is built on the two-dimensional convolution
∇G ∗ f(x, y).
On this basis, the edge strength of the image can be represented by the magnitude of the gradient of the smoothed image at the pixel under examination:
M(x, y) = √[ (∂(G ∗ f)/∂x)² + (∂(G ∗ f)/∂y)² ]
and the direction of the edge is expressed as
θ(x, y) = arctan[ (∂(G ∗ f)/∂y) / (∂(G ∗ f)/∂x) ]
(3) "non-maxima suppression" is applied to the magnitude of the gradient. Because the global gradient in the image can not determine the edge, the whole image needs to be traversed, if the gray value of a certain pixel is not the maximum compared with the gray values of two pixels in front and at back in the gradient direction, the pixel value is set to be 0, namely, the local maximum value point in the image gradient is reserved, and other non-local maximum values are set to be zero to obtain the refined edge. And the gradient direction of the pixel point should be quantized according to 8-connectivity, if the gradient direction of the central pixel points to the direction of the C area, then the adjacent pixels needing to be compared when local non-maximum suppression is carried out are two pixel points at the upper left and the lower right.
(4) The gradient is "lag thresholded" and edge connected. The so-called "hysteresis thresholding" is the double thresholding of the gradient, a strong threshold Th 1 A weak threshold Th 2 And two thresholds satisfy Th 1 =0.4·Th 2 . All the pixel points with the gradient amplitude smaller than the weak threshold are determined not to be edge points, and more edge information can be reserved due to the fact that the value of the weak threshold is low; and all the pixel points with the gradient amplitude larger than the strong threshold are determined as edge points. Since the value of the strong threshold is high, some edge information is lost although most of the noise is removed, so that the missing image of the edge point cannot form a complete contour. Then, the pixel points with gradient amplitude values between the two can judge whether the pixel points are edge points or not by observing 8-connectivity of the pixel points and other edge points. If the gradient amplitude of a pixel point is greater than the weak threshold and less than the strong threshold and has a connection relation with another edge point, the pixel may beTo be considered as edge points; but pixels that are not connected to any edge point cannot be considered as pixel points. This allows the missing edge points to be connected to form the contour.
Edge detection with the Canny operator reduces the edge-interruption phenomenon seen in template-based detection, helps obtain more complete edges, has high localization accuracy and a large signal-to-noise ratio, and can also detect some weaker edges in the image background.
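The following sketch makes the four steps concrete: steps (1) and (2) are written out explicitly, while steps (3) and (4) are delegated to OpenCV's Canny, which performs non-maximum suppression and hysteresis internally. The values sigma = 1.4 and high = 100 are illustrative assumptions, and the threshold pairing follows the weak = 0.4 × strong relation described above.
```python
import cv2
import numpy as np

def canny_sketch(gray: np.ndarray, sigma: float = 1.4, high: float = 100.0) -> np.ndarray:
    """Illustrative Canny pipeline: steps (1)-(2) explicitly, steps (3)-(4) via cv2.Canny."""
    # Step (1): smooth with a Gaussian of scale sigma (kernel size derived from sigma).
    smoothed = cv2.GaussianBlur(gray, (0, 0), sigmaX=sigma)
    # Step (2): gradient magnitude and direction of the smoothed image
    # (computed here only to illustrate the step; cv2.Canny recomputes them internally).
    gx = cv2.Sobel(smoothed, cv2.CV_64F, 1, 0)
    gy = cv2.Sobel(smoothed, cv2.CV_64F, 0, 1)
    magnitude = np.sqrt(gx ** 2 + gy ** 2)
    direction = np.arctan2(gy, gx)
    # Steps (3)-(4): non-maximum suppression and hysteresis, with weak = 0.4 * strong.
    low = 0.4 * high
    return cv2.Canny(smoothed, low, high)
```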
Compared with the prior art, the technical scheme adopted by the invention has the following technical effects: the original image is first uniformly partitioned; the Otsu algorithm is then used to compute a threshold for each region; each threshold is then used to threshold its own region; finally, an edge detection operator is applied to the binary image. The method improves edge detection quality, effectively filters noise, retains key edges, and obtains single-pixel, closed edge lines with good continuity, thereby providing a good foundation for feature extraction, target recognition, and the like.
Drawings
FIG. 1 is a flow chart of a local Otsu thresholding based edge detection method of the present invention;
FIG. 2 shows the computed global threshold (left) and the computed local thresholds (right);
FIG. 3 shows the result of global thresholding (left) and of local thresholding (right);
FIG. 4 shows the detection results on the original image using the Sobel operator (left) and the Canny operator (right);
FIG. 5 shows the detection results on the globally thresholded image using the Sobel operator (left) and the Canny operator (right);
FIG. 6 shows the detection results on the locally thresholded image using the Sobel operator (left) and the Canny operator (right);
FIG. 7 shows the detection result on the original image using the Canny operator (left) and a local close-up (right);
FIG. 8 shows the detection result on the locally thresholded image using the Canny operator (left) and a local close-up (right).
Detailed Description
The invention and its advantageous effects are described in detail below with reference to the accompanying drawings and various embodiments.
As shown in fig. 1, the edge detection method based on local Otsu thresholding of the present invention includes:
First, an image is obtained and equally divided into four regions of equal area; the image may be a color image or a grayscale image. Graying is essentially the process of making the R, G and B components of a color image equal. A color image can be converted, for example, with MATLAB's built-in rgb2gray function to obtain a grayscale image; other known methods of graying an image may also be used and are not enumerated here. As shown in fig. 2, the image is then equally divided into four regions according to area.
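A minimal sketch of this graying-and-partitioning step is shown below, using OpenCV's cvtColor in place of MATLAB's rgb2gray; the helper name quadrants is introduced only for this example.
```python
import cv2
import numpy as np

def quadrants(img_bgr: np.ndarray):
    """Gray the color image and return its four equal-area quadrants (first step)."""
    gray = cv2.cvtColor(img_bgr, cv2.COLOR_BGR2GRAY)   # counterpart of MATLAB's rgb2gray
    h, w = gray.shape
    return (gray[:h // 2, :w // 2], gray[:h // 2, w // 2:],
            gray[h // 2:, :w // 2], gray[h // 2:, w // 2:])
```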
Second, the local thresholds are calculated with the Otsu algorithm. To allow a comparison with the locally computed Otsu thresholds, the left image in fig. 2 shows the global threshold calculated with the global Otsu algorithm. The global threshold is 170, while the local thresholds computed for the four blocks are 177, 166, 172 and 146, respectively. Because the illumination of the image is uneven, the local threshold of the lower-right region is significantly lower than the global threshold.
The specific steps by which Otsu calculates the threshold are as follows:
(1) Let the gray levels of the image be G = {0, 1, …, 255}, let f_i denote the number of pixels with gray value i, and let N denote the total number of pixels; then
N = Σ_{i=0}^{255} f_i
(2) Let P_i denote the probability that a pixel in the image has gray value i, so that
P_i = f_i / N
(3) Select a threshold t and divide the pixels of the image into two classes C_0 and C_1, where C_0 consists of all pixels whose gray values lie in [0, t] and C_1 consists of all pixels whose gray values lie in [t+1, 255];
(4) Calculate the probabilities P_0 and P_1 of the two classes of pixels:
P_0 = Σ_{i=0}^{t} P_i,  P_1 = 1 − P_0
(5) Calculate the average gray values μ_0 and μ_1 of the two classes and the average gray value μ of the whole image:
μ_0 = (1/P_0) Σ_{i=0}^{t} i·P_i,  μ_1 = (1/P_1) Σ_{i=t+1}^{255} i·P_i,  μ = Σ_{i=0}^{255} i·P_i = P_0·μ_0 + P_1·μ_1
(6) Calculate the between-class variance of C_0 and C_1:
σ² = P_0(μ − μ_0)² + P_1(μ − μ_1)² = P_0·P_1·(μ_0 − μ_1)²  (26)
(7) Traverse the gray values of the image, take each gray value in turn as the threshold and compute the corresponding between-class variance; the threshold that maximizes the between-class variance is the optimal threshold.
Third, each threshold is used to threshold its own region, yielding a binary image. In fig. 3 the left image is the output of global Otsu thresholding and the right image the output of local Otsu thresholding; because the illumination of the image is uneven, global Otsu thresholding misjudges shadow regions as foreground, whereas local Otsu thresholding handles them well.
The thresholding is performed as follows.
Thresholding is in fact an extreme but very useful gray-scale transformation. It compares the gray value of every pixel in the image with a preset value, i.e. a threshold, and according to the comparison maps the pixel to one of two possible output gray levels, usually represented as black and white; the thresholded image is therefore a binary image. Let the original image be f(x, y) and the thresholded image be g(x, y); then
g(x, y) = 1 if f(x, y) > T, and g(x, y) = 0 if f(x, y) ≤ T,
where T is the threshold of the region containing (x, y).
Fourth, edge detection is performed on the binary image with an edge detection operator to obtain the edge image. As shown in fig. 4, detecting edges directly on the original image with the Sobel operator exhibits three problems: (1) isolated edge points; (2) false edges; (3) discontinuous edge points. The edges detected directly with the Canny operator have good continuity but too many false edges. As shown in fig. 5, when edges are detected after global Otsu thresholding, the target object in the image is misjudged, which directly causes errors in the subsequent edge detection. As shown in fig. 6, the edge detection result obtained with local Otsu thresholding greatly remedies the shortcomings of the Sobel and Canny operators: the edge lines are extracted more accurately, edge continuity is good, and noise resistance is strong.
The edge detection operators are specifically Sobel operators and Canny operators.
Specifically, the Sobel operator is as follows.
The Sobel operator uses a 3 × 3 template; because the template size is odd, the pixel being processed can be placed at the center of the template and its gradient value calculated there. The Sobel operator increases the weight given to the pixels in the four neighborhoods of the central pixel. The weighted differences of the gray values in the neighborhoods above, below, to the left of and to the right of the pixel being processed are combined; this reaches an extreme value at an edge and highlights the edge information of the image, thereby detecting the edge. Its expression is:
G = √(G_x² + G_y²)
where
G_x = [f(x+1, y−1) + 2f(x+1, y) + f(x+1, y+1)] − [f(x−1, y−1) + 2f(x−1, y) + f(x−1, y+1)]  (29)
G_y = [f(x−1, y+1) + 2f(x, y+1) + f(x+1, y+1)] − [f(x−1, y−1) + 2f(x, y−1) + f(x+1, y−1)]  (30)
The two templates are respectively:
S_x =
  −1  0  +1
  −2  0  +2
  −1  0  +1
S_y =
  −1 −2 −1
   0  0  0
  +1 +2 +1
The two templates represent the horizontal gradient and the vertical gradient of the image, respectively.
The Sobel operator gives extra weight to the neighborhood of the central pixel. Consequently, if an image containing uniformly distributed noise is processed, the Sobel operator easily produces false edges; however, if the noise of the image is concentrated near the edges, the Sobel operator performs well.
Specifically, the Canny operator is divided into the following 4 steps:
(1) First the image is smoothed with the first derivative of a two-dimensional Gaussian function. Let the two-dimensional Gaussian function be:
G(x, y) = (1 / (2πσ²)) · exp(−(x² + y²) / (2σ²))
where σ is the standard deviation (scale) parameter of the Gaussian function. As when the LoG operator is applied to edge detection, the size of this parameter directly affects how strongly the image is smoothed, so it must be chosen appropriately for the actual situation.
(2) Then the gradient magnitude and gradient direction of every pixel in the image are calculated. The first directional derivative of the Gaussian function in a direction n is
G_n = ∂G/∂n = n · ∇G
where
n = [cos θ, sin θ]^T is the direction vector and
∇G = [∂G/∂x, ∂G/∂y]^T is the gradient vector.
The Canny operator is built on the two-dimensional convolution
∇G ∗ f(x, y).
On this basis, the edge strength of the image can be represented by the magnitude of the gradient of the smoothed image at the pixel under examination:
M(x, y) = √[ (∂(G ∗ f)/∂x)² + (∂(G ∗ f)/∂y)² ]
and the direction of the edge is expressed as
θ(x, y) = arctan[ (∂(G ∗ f)/∂y) / (∂(G ∗ f)/∂x) ]
(3) "non-maxima suppression" is applied to the magnitude of the gradient. Because the global gradient in the image can not determine the edge, the whole image needs to be traversed, if the gray value of a certain pixel is not the maximum compared with the gray values of the front and the back pixels in the gradient direction, the pixel value is set to be 0, namely the local maximum value point in the image gradient is reserved, and other non-local maximum values are set to be zero to obtain the refined edge. And the gradient direction of the pixel point should be quantized according to 8-connectivity, as shown in the following figure, if the gradient direction of the central pixel points to the direction of the C region, then the adjacent pixels to be compared when performing local non-maximum suppression are two pixel points, namely, the upper left pixel point and the lower right pixel point.
(4) The gradient is "lag thresholded" and edge connected. The so-called "hysteresis thresholding" is the double thresholding of the gradient, a strong threshold Th 1 A weak threshold Th 2 And two thresholds satisfy Th 1 =0.4·Th 2 . Whenever the gradient magnitude is less than the weak thresholdThe pixel points are determined not to be edge points, and more edge information can be reserved due to the lower value of the weak threshold; all pixels with gradient magnitude greater than the strong threshold are certainly considered as edge points. Since the value of the strong threshold is high, some edge information is lost although most of the noise is removed, so that the missing image of the edge point cannot form a complete contour. Then, the pixel points with gradient amplitude values between the two can judge whether the pixel points are edge points or not by observing 8-connectivity of the pixel points and other edge points. If the gradient amplitude of one pixel point is larger than the weak threshold and smaller than the strong threshold and has a communication relation with another edge point, the pixel can be regarded as the edge point; but pixels that are not connected to any edge point cannot be considered as pixel points. This allows the missing edge points to be connected to form the contour.
Edge detection with the Canny operator reduces the edge-interruption phenomenon seen in template-based detection, helps obtain more complete edges, has high localization accuracy and a large signal-to-noise ratio, and can also detect some weaker edges in the image background.
In conclusion, the algorithm provided by the invention improves edge detection quality, effectively filters noise, retains key edges, and obtains single-pixel, closed edge lines with good continuity, thereby providing a good foundation for feature extraction, target recognition and the like.
Finally, it should be pointed out that the above embodiments are intended only to illustrate the technical solutions of the present invention and not to limit them. Although the present invention has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art will understand that the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be replaced by equivalents, and such modifications or substitutions do not cause the essence of the corresponding technical solutions to depart from the spirit and scope of the technical solutions of the embodiments of the present invention. Although the embodiments of the present invention have been shown and described, it will be appreciated by those skilled in the art that various changes, modifications, substitutions and alterations may be made thereto without departing from the principle and spirit of the present invention, the scope of which is defined by the appended claims and their equivalents.

Claims (7)

1. The edge detection method based on local Otsu thresholding is characterized by comprising the following steps of:
s1, inputting an image, equally dividing the input image into four regions according to the area size;
s2, calculating the threshold value of each block of area by adopting an Otsu (maximum inter-class variance) algorithm to obtain four threshold values;
s3, performing thresholding processing on the respective region by using each threshold value to obtain a binary image;
and S4, finally, carrying out edge detection on the binary image by using an edge detection operator to obtain an edge image.
2. The edge detection method based on local Otsu thresholding as claimed in claim 1, wherein the step of Otsu algorithm in S2 is:
(1) Let the gray levels of the image be G = {0, 1, …, 255}, let f_i denote the number of pixels with gray value i, and let N denote the total number of pixels; then
N = Σ_{i=0}^{255} f_i
(2) Let P_i denote the probability that a pixel in the image has gray value i, so that
P_i = f_i / N
(3) Select a threshold t and divide the pixels of the image into two classes C_0 and C_1, where C_0 consists of all pixels whose gray values lie in [0, t] and C_1 consists of all pixels whose gray values lie in [t+1, 255];
(4) Calculate the probabilities P_0 and P_1 of the two classes of pixels:
P_0 = Σ_{i=0}^{t} P_i,  P_1 = 1 − P_0
(5) Calculate the average gray values μ_0 and μ_1 of the two classes and the average gray value μ of the whole image:
μ_0 = (1/P_0) Σ_{i=0}^{t} i·P_i,  μ_1 = (1/P_1) Σ_{i=t+1}^{255} i·P_i,  μ = Σ_{i=0}^{255} i·P_i = P_0·μ_0 + P_1·μ_1
(6) Calculate the between-class variance of C_0 and C_1:
σ² = P_0(μ − μ_0)² + P_1(μ − μ_1)² = P_0·P_1·(μ_0 − μ_1)²  (4);
(7) Traverse the gray values of the image, take each gray value in turn as the threshold and compute the corresponding between-class variance; the threshold that maximizes the between-class variance is the optimal threshold.
3. The edge detection method based on local Otsu thresholding as claimed in claim 1, wherein the thresholding process in S3 is specifically:
assuming that the original image is f(x, y) and the thresholded image is g(x, y), then
g(x, y) = 1 if f(x, y) > T, and g(x, y) = 0 if f(x, y) ≤ T,
where T is the threshold of the region containing (x, y).
4. The edge detection method based on local Otsu thresholding as claimed in claim 1, characterized in that the edge detection operators in S4 are specifically Sobel operators and Canny operators.
5. The edge detection method based on local Otsu thresholding as claimed in claim 4, wherein the Sobel operator is:
the Sobel operator increases the weight given to the pixels in the four neighborhoods of the central pixel; the gray values of the neighborhoods above, below, to the left of and to the right of the pixel being processed are weighted and differenced, reaching an extreme value at an edge and highlighting the edge information of the image so as to detect the edge.
6. The edge detection method based on local Otsu thresholding as claimed in claim 4, characterized in that said Canny operator is specifically divided into the following 4 steps:
(1) first smoothing the image with the first derivative of a two-dimensional Gaussian function, the two-dimensional Gaussian function being:
G(x, y) = (1 / (2πσ²)) · exp(−(x² + y²) / (2σ²))
where σ is the standard deviation (scale) parameter of the Gaussian function;
(2) then calculating the gradient magnitude and gradient direction of every pixel in the image, the first directional derivative of the Gaussian function in a direction n being
G_n = ∂G/∂n = n · ∇G
where n = [cos θ, sin θ]^T is the direction vector and ∇G = [∂G/∂x, ∂G/∂y]^T is the gradient vector;
the Canny operator is built on the two-dimensional convolution
∇G ∗ f(x, y),
on the basis of which the edge strength of the image can be represented by the magnitude of the gradient of the smoothed image at the pixel under examination:
M(x, y) = √[ (∂(G ∗ f)/∂x)² + (∂(G ∗ f)/∂y)² ]
and the direction of the edge is expressed as
θ(x, y) = arctan[ (∂(G ∗ f)/∂y) / (∂(G ∗ f)/∂x) ];
(3) for the gradient magnitude, if the gray value of a pixel is not the maximum compared with the values of the two neighboring pixels ahead of and behind it along the gradient direction, setting the pixel value to 0, retaining the local maxima of the image gradient and setting all other, non-maximal values to zero to obtain thinned edges, the gradient direction of each pixel being quantized according to 8-connectivity, and, if the gradient direction of the central pixel points toward the C region, the neighbors to compare during local non-maximum suppression being the two pixels at the upper left and the lower right;
(4) applying a double threshold to the gradient: a strong (high) threshold Th_1 and a weak (low) threshold Th_2, the two thresholds satisfying Th_2 = 0.4·Th_1.
7. The edge detection method based on local Otsu thresholding as claimed in claim 1, wherein in S1, the input image is any one of a grayscale image or a color image.
CN202210457562.3A 2022-04-27 2022-04-27 Edge detection method based on local Otsu thresholding Pending CN114943744A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210457562.3A CN114943744A (en) 2022-04-27 2022-04-27 Edge detection method based on local Otsu thresholding

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210457562.3A CN114943744A (en) 2022-04-27 2022-04-27 Edge detection method based on local Otsu thresholding

Publications (1)

Publication Number Publication Date
CN114943744A true CN114943744A (en) 2022-08-26

Family

ID=82907236

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210457562.3A Pending CN114943744A (en) 2022-04-27 2022-04-27 Edge detection method based on local Otsu thresholding

Country Status (1)

Country Link
CN (1) CN114943744A (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115861307A (en) * 2023-02-21 2023-03-28 深圳市百昌科技有限公司 Fascia gun power supply drive plate welding fault detection method based on artificial intelligence
CN115861307B (en) * 2023-02-21 2023-04-28 深圳市百昌科技有限公司 Fascia gun power supply driving plate welding fault detection method based on artificial intelligence
CN116416268B (en) * 2023-06-09 2023-08-18 浙江双元科技股份有限公司 Method and device for detecting edge position of lithium battery pole piece based on recursion dichotomy

Similar Documents

Publication Publication Date Title
CN108629343B (en) License plate positioning method and system based on edge detection and improved Harris corner detection
CN112819772B (en) High-precision rapid pattern detection and recognition method
Raju et al. Image segmentation by using histogram thresholding
CN108022233A (en) A kind of edge of work extracting method based on modified Canny operators
CN102156996B (en) Image edge detection method
CN108898132B (en) Terahertz image dangerous article identification method based on shape context description
CN114943744A (en) Edge detection method based on local Otsu thresholding
CN114399522A (en) High-low threshold-based Canny operator edge detection method
CN113837198B (en) Improved self-adaptive threshold Canny edge detection method based on three-dimensional block matching
CN111369570A (en) Multi-target detection tracking method for video image
CN108090492B (en) Contour detection method based on scale clue suppression
CN113296095A (en) Target hyperbolic edge extraction method for pulse ground penetrating radar
CN116524269A (en) Visual recognition detection system
CN113781413B (en) Electrolytic capacitor positioning method based on Hough gradient method
CN114494318A (en) Method for extracting cornea contour from cornea dynamic deformation video based on Otsu algorithm
CN109146905A (en) For the CANNY operator edge detection algorithm of low-light level environment
CN115984863B (en) Image processing method, device, equipment and storage medium
Liu et al. Bowstring-based dual-threshold computation method for adaptive Canny edge detector
CN113643290B (en) Straw counting method and device based on image processing and storage medium
CN112085683B (en) Depth map credibility detection method in saliency detection
CN114820718A (en) Visual dynamic positioning and tracking algorithm
Youssef et al. Color image edge detection method based on multiscale product using Gaussian function
CN112967304A (en) Edge detection algorithm for multi-edge window collaborative filtering
Sharma et al. Analysis of proposed hybrid approaches for laplacian edge based image segmentation using morphological image processing
Liao et al. Detection method of Si3N4 bearing rollers point microcrack defects based on adaptive region growing segmentation

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination