CN113989313B - Edge detection method and system based on image multidimensional analysis

Edge detection method and system based on image multidimensional analysis

Publication number: CN113989313B (application CN202111586499.5A; earlier publication CN113989313A)
Authority: CN (China)
Legal status: Active
Prior art keywords: image, pixel, spectrogram, edge, difference
Inventors: 马东风, 陈玲杰
Assignee (current and original): Wuhan Zhibotong Technology Co., Ltd.
Original language: Chinese (zh)

Classifications

    • G (Physics) › G06 (Computing; calculating or counting) › G06T (Image data processing or generation, in general)
    • G06T 7/13: Image analysis; segmentation; edge detection
    • G06T 5/10: Image enhancement or restoration by non-spatial domain filtering
    • G06T 5/40: Image enhancement or restoration by the use of histogram techniques
    • G06T 5/70; G06T 5/90
    • G06T 2207/20048: Transform domain processing
    • G06T 2207/20056: Discrete and fast Fourier transform [DFT, FFT]

Abstract

The invention relates to the technical field of image processing, in particular to an edge detection method and system based on image multidimensional analysis. The method detects the edge of a target image with the Canny edge detection algorithm to obtain a first edge image, performs frequency domain analysis on the target image to obtain an original spectrogram, and processes the original spectrogram to obtain a pixel number difference index, a gray histogram difference index, an adjacent pixel value difference index and a pixel value difference average value index. From these indexes an optimal high-frequency spectrogram selection index is calculated, the optimal high-frequency spectrogram is determined accordingly, and the optimal high-frequency spectrogram is integrated with the first edge image to obtain a final edge image.

Description

Edge detection method and system based on image multidimensional analysis
Technical Field
The invention relates to the technical field of image processing, in particular to an edge detection method and system based on image multi-dimensional analysis.
Background
When Canny edge detection is used, the first step removes noise with Gaussian blur, but the blur also smooths the edges and weakens edge information, so some required edges, especially weak edges and isolated edges, may be lost in the subsequent double-threshold and edge-linking calculations. Acquiring image edges by Canny edge detection alone therefore reduces the accuracy of the obtained edges. Although the prior art combines Canny edge detection with high-frequency edge detection to obtain an edge image, the combining process is simple and shallow, and the accuracy of the obtained edges remains low.
Disclosure of Invention
The invention aims to provide an edge detection method and system based on image multi-dimensional analysis, which are used for solving the technical problem that the accuracy of an edge obtained by the existing edge detection method is low.
The adopted technical scheme is as follows:
an edge detection method based on image multi-dimensional analysis comprises the following steps:
detecting the edge of a target image through a Canny edge detection algorithm to obtain a first edge image;
performing frequency domain analysis on the target image to obtain an original spectrogram;
taking the center of the original spectrogram as a circle center, and making a plurality of concentric circles with different radiuses to obtain a multilayer spectrogram;
obtaining a pixel number difference index between each initial high-frequency spectrogram and the first edge image according to the pixel number difference between the gray image converted back from each layer of spectrogram and the first edge image;
calculating a gray level histogram difference index of the gray level histograms of the initial high-frequency spectrograms and the first edge image;
selecting the initial high-frequency spectrograms whose gray histogram difference index is smaller than a preset gray histogram difference index threshold to obtain candidate high-frequency spectrograms; for any candidate high-frequency spectrogram, replacing the pixel point at the corresponding position in the first edge image with each pixel point in the gray image converted back from the candidate high-frequency spectrogram, and obtaining an adjacent pixel value difference index from the differences between the pixel point and its adjacent pixel points before and after the replacement;
acquiring an optimal high-frequency spectrogram selection index of each candidate high-frequency spectrogram according to the pixel number difference index, the gray histogram difference index, the adjacent pixel value difference index and the pixel value difference average value index of each candidate high-frequency spectrogram and the first edge image;
and determining the candidate high-frequency spectrogram corresponding to the minimum optimal high-frequency spectrogram selection index as the optimal high-frequency spectrogram, and integrating the optimal high-frequency spectrogram and the first edge image to obtain a final edge image.
Further, the obtaining of the pixel number difference indicator between each initial high-frequency spectrogram and the first edge image according to the pixel number difference between the gray-scale image converted back by each layer spectrogram and the first edge image includes:
for any layer of spectrogram, respectively binarizing the grayscale image converted from the layer of spectrogram and the first edge image to obtain a first binary image and a second binary image;
calculating a pixel number error value of the pixel numbers of the first binary image and the second binary image;
comparing the pixel number error value with a preset pixel number error value threshold; if the pixel number error value is less than or equal to the preset pixel number error value threshold, determining that this layer of spectrogram is an initial high-frequency spectrogram; if the pixel number error value is greater than the preset pixel number error value threshold, superimposing this layer of spectrogram with the adjacent previous layer of spectrogram to obtain a first superimposed spectrogram, recalculating the pixel number error value between the gray image converted back from the first superimposed spectrogram and the first edge image, and comparing it with the preset pixel number error value threshold; if the new pixel number error value is still greater than the threshold, superimposing the first superimposed spectrogram with the adjacent previous layer of spectrogram to obtain a second superimposed spectrogram and recalculating the pixel number error value between the gray image converted back from the second superimposed spectrogram and the first edge image; continuing this cycle until the obtained pixel number error value is less than or equal to the preset pixel number error value threshold, and determining that the currently obtained superimposed spectrogram is the initial high-frequency spectrogram;
and calculating the difference index of the pixel number of each initial high-frequency spectrogram and the first edge image.
Further, the calculating a difference indicator of the number of pixels between each initial high frequency spectrogram and the first edge image includes:
for any initial high-frequency spectrogram, converting the initial high-frequency spectrogram into a gray image for binarization to obtain a third binary image;
and calculating the pixel quantity error value of the pixel quantities of the third binary image and the second binary image to obtain the pixel quantity difference index of the initial high-frequency spectrogram.
Further, the calculating a gray level histogram difference indicator of the gray level histogram of each initial high frequency spectrogram and the gray level histogram of the first edge image includes:
converting each initial high-frequency spectrogram back into a gray image;
acquiring a gray level histogram of the gray level image converted from each initial high-frequency spectrogram and a gray level histogram of the first edge image;
calculating the difference value of the number of pixels of each layer of gray level of the gray level histogram of the gray level image converted back by each initial high-frequency spectrogram and the gray level histogram of the first edge image;
and calculating the sum of the difference values of the pixel number of each layer of gray level to obtain the gray level histogram difference index of each initial high-frequency spectrogram and the gray level histogram of the first edge image.
Further, the replacing a pixel point at a corresponding position in the first edge image with any pixel point in the gray image converted back from the candidate high-frequency spectrogram to obtain an adjacent pixel value difference index between the pixel point and its adjacent pixel points before and after the replacement includes:
for any pixel point in the gray image converted back from the candidate high-frequency spectrogram, calculating the sum of squared differences between the pixel value of the pixel point at the corresponding position in the first edge image and the pixel values of its four surrounding adjacent pixel points, to obtain the sum of squared pixel value differences before replacement;
after the pixel point in the gray image converted back from the candidate high-frequency spectrogram replaces the pixel point at the corresponding position in the first edge image, calculating the sum of squared differences between the pixel value of that pixel point and the pixel values of its four surrounding adjacent pixel points, to obtain the sum of squared pixel value differences after replacement;
calculating the difference between the sum of squared pixel value differences before replacement and the sum of squared pixel value differences after replacement, to obtain the adjacent pixel value difference of that pixel point in the gray image converted back from the candidate high-frequency spectrogram;
and calculating the average of the adjacent pixel value differences of all pixel points in the gray image converted back from the candidate high-frequency spectrogram, to obtain the adjacent pixel value difference index.
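The neighbour-difference steps above can be sketched as follows. This is an illustrative NumPy reading, not the patent's implementation: restricting the average to interior pixels (the text does not say how border pixels, which lack four neighbours, are handled) is an assumption.

```python
import numpy as np

def adjacent_diff_index(candidate_gray, edge_img):
    """For each (interior) pixel: sum of squared differences to the four
    neighbours in the first edge image, before and after replacing the
    edge-image pixel with the candidate pixel; return the mean of the
    (before - after) changes over all such pixels."""
    h, w = edge_img.shape
    e = edge_img.astype(float)
    c = candidate_gray.astype(float)
    diffs = []
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            nbrs = np.array([e[i - 1, j], e[i + 1, j], e[i, j - 1], e[i, j + 1]])
            before = np.sum((e[i, j] - nbrs) ** 2)   # before replacement
            after = np.sum((c[i, j] - nbrs) ** 2)    # after replacement
            diffs.append(before - after)
    return float(np.mean(diffs))
```

When the candidate gray image equals the edge image the index is zero, which matches the intuition that a replacement changing nothing should not be penalised.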
Further, the process of calculating the pixel value difference average indicator includes:
for any one candidate high-frequency spectrogram, calculating the difference value of the pixel values of each pixel point in the gray-scale image converted back by the candidate high-frequency spectrogram and the pixel points at the corresponding positions in the first edge image, and then calculating the average value of the difference values of all the pixel points to obtain the pixel value difference average value index;
correspondingly, the obtaining an optimal high-frequency spectrogram selection index of each candidate high-frequency spectrogram according to the pixel number difference index, the gray histogram difference index, the adjacent pixel value difference index and the pixel value difference average value index of each candidate high-frequency spectrogram and the first edge image includes:
and for any one candidate high-frequency spectrogram, calculating the product of the pixel number difference index, the gray level histogram difference index, the adjacent pixel value difference index and the pixel value difference average value index corresponding to the candidate high-frequency spectrogram to obtain the optimal high-frequency spectrogram selection index of the candidate high-frequency spectrogram.
Further, the integrating the optimal high frequency spectrogram and the first edge image to obtain a final edge image includes:
binarizing the gray image converted back from the optimal high-frequency spectrogram to obtain a first initial binary matrix, and binarizing the first edge image to obtain a second initial binary matrix;
performing an AND (intersection) operation and an exclusive-or (XOR) operation on the first initial binary matrix and the second initial binary matrix to obtain a first target binary matrix and a second target binary matrix respectively; setting the image corresponding to the first target binary matrix as an initial edge image, each pixel point in the initial edge image being an edge pixel point, and setting the image corresponding to the second target binary matrix as a blurred edge image;
and for any pixel point in the blurred edge image, calculating the average value of the pixel value difference values of the pixel point and each edge pixel point in the preset range within a preset range with the corresponding position of the pixel point in the initial edge image as the center, if the average value of the pixel value difference values is less than or equal to a preset difference value average value threshold value, judging that the pixel point in the blurred edge image is an edge pixel point, and adding the pixel point to the corresponding position of the pixel point in the initial edge image to obtain the final edge image.
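The integration step can be sketched as below. The window size and difference-average threshold are illustrative values, not from the patent, and comparing gray values from the optimal spectrogram's gray image (rather than the binary values, which would make the comparison trivial) is an assumed reading.

```python
import numpy as np

def integrate(optimal_gray, edge_gray, window=1, avg_threshold=30):
    """AND of the two binarized images gives the initial edge image, XOR
    gives the blurred edge image; a blurred pixel is promoted to an edge
    pixel when its mean gray-value difference to the edge pixels inside
    a local window is small enough."""
    a = np.where(optimal_gray < 127, 0, 255)
    b = np.where(edge_gray < 127, 0, 255)
    initial = (a == 255) & (b == 255)        # AND -> initial edge image
    blurred = (a == 255) ^ (b == 255)        # XOR -> blurred edge image
    final = np.where(initial, 255, 0)
    h, w = final.shape
    for y, x in zip(*np.nonzero(blurred)):
        y0, y1 = max(0, y - window), min(h, y + window + 1)
        x0, x1 = max(0, x - window), min(w, x + window + 1)
        mask = initial[y0:y1, x0:x1]
        if mask.any():
            vals = optimal_gray[y0:y1, x0:x1][mask].astype(float)
            if np.mean(np.abs(vals - float(optimal_gray[y, x]))) <= avg_threshold:
                final[y, x] = 255            # promote to edge pixel
    return final
```

For two overlapping bright squares, the intersection survives directly and the XOR fringe is kept only where its gray values agree with the nearby edge pixels.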
An edge detection system based on image multidimensional analysis comprises a processor and a memory, wherein the processor is used for processing instructions stored in the memory to realize the edge detection method based on image multidimensional analysis.
The embodiment of the invention has at least the following beneficial effects: after a first edge image of a target image is obtained through the Canny edge detection algorithm, frequency domain analysis is performed on the target image to obtain an original spectrogram; the original spectrogram is processed to obtain a pixel number difference index, a gray histogram difference index, an adjacent pixel value difference index and a pixel value difference average value index, all relative to the first edge image; an optimal high-frequency spectrogram selection index is calculated from these indexes; finally, the candidate high-frequency spectrogram corresponding to the minimum optimal high-frequency spectrogram selection index is determined as the optimal high-frequency spectrogram, which is integrated with the first edge image to obtain a final edge image. Based on multidimensional analysis of the image, the invention thus deeply combines the Canny operator with the edge image of the high-frequency component of the image through a series of data processing steps. The combined edge image adds detail to the image obtained by the Canny operator, improving the accuracy of the obtained edges, while the influence of noise in the high-frequency information is reduced and eliminated. Compared with the existing way of simply combining Canny edge detection with high-frequency edges to obtain an edge image, the combination here is deeper, and the accuracy of edge detection is greatly improved.
Drawings
FIG. 1 is a flow chart of an edge detection method based on image multidimensional analysis according to the present invention;
FIG. 2 is an image edge after binarization based on Canny operator;
FIG. 3 is a gray scale image of an original image;
FIG. 4 is a spectral diagram;
FIG. 5 is a spectral diagram of the high frequency portion of an image;
FIG. 6 is an original image;
FIG. 7 is a resulting image of the high frequency portion of the original image;
FIG. 8 is a schematic illustration of a spectrogram hierarchy;
FIG. 9 is a schematic structural diagram of an edge detection system based on image multidimensional analysis according to the present invention;
fig. 10 is a schematic structural diagram of an edge detection apparatus based on image multidimensional analysis according to the present invention.
Detailed Description
To further illustrate the technical means and effects adopted by the present invention to achieve the predetermined objects, a detailed description of the edge detection method based on image multidimensional analysis, its specific implementation, structure, features and effects is given below with reference to the accompanying drawings and preferred embodiments. In the following description, different references to "one embodiment" or "another embodiment" do not necessarily refer to the same embodiment. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more embodiments.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs.
The following describes a specific scheme of the edge detection method based on image multidimensional analysis in detail with reference to the accompanying drawings.
Referring to fig. 1, a flowchart illustrating steps of an edge detection method based on image multidimensional analysis according to an embodiment of the present invention is shown, where the method includes the following steps:
step S1: detecting the edge of the target image through a Canny edge detection algorithm to obtain a first edge image:
the target image is an image which needs to be subjected to edge detection, the Canny edge detection algorithm is a currently known and commonly used edge detection algorithm, the edge of the target image is detected through the Canny edge detection algorithm to obtain a first edge image, and the first edge image can be marked as a graph TB. The Canny edge detection algorithm is briefly introduced as follows: and performing Gaussian smoothing on the target image, reducing the error rate, calculating the gradient amplitude and direction to estimate the edge intensity and direction of each point, and performing non-maximum suppression on the gradient amplitude according to the gradient direction. The Canny edge detection algorithm is essentially a further refinement of the results of Sobel, Prewitt, etc. operators.
As shown in fig. 2, the image edge is obtained by applying the Canny edge detection algorithm to the real image shown in fig. 3 and binarizing the result. The details of the regions circled in fig. 3 are not visible in fig. 2, showing that an edge image detected with the Canny edge detection algorithm alone has significant deficiencies.
Step S2: performing frequency domain analysis on the target image to obtain an original spectrogram:
the frequency domain analysis of the target image is performed to obtain an original spectrogram, and the spectrogram obtained by the frequency domain analysis of the image belongs to the conventional technical means, and is described in brief below.
Firstly, the target image is grayed to obtain its grayscale image. Then, a Fourier transform is performed on the grayscale image to obtain a spectrogram, as shown in fig. 4; the obtained spectrogram is defined as the original spectrogram.
If a high-frequency spectrogram such as the one shown in fig. 5 is simply cut out of the original spectrogram, it can be seen from fig. 6 and fig. 7 that, although the inverse Fourier transform of the high-frequency image yields an image containing edge information, the result is not ideal: it still contains much image information that does not belong to the edges.
Therefore, a high-frequency spectrogram meeting the requirements, namely a high-frequency spectrogram containing image edge information and having as little other information as possible, needs to be obtained subsequently according to the original spectrogram.
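The frequency-domain step just described can be sketched with NumPy; the centred (fftshift-ed) spectrum is the "original spectrogram", and the inverse transform is what converts any masked spectrum back to a gray image in the later steps.

```python
import numpy as np

def original_spectrogram(gray):
    """Step S2: 2-D FFT of the gray image with the zero frequency shifted
    to the centre; the log-magnitude is what is usually displayed as the
    spectrogram (as in fig. 4)."""
    f = np.fft.fftshift(np.fft.fft2(np.asarray(gray, dtype=float)))
    magnitude = np.log1p(np.abs(f))
    return f, magnitude

def back_to_gray(f_shifted):
    """Convert a (possibly masked) centred spectrum back to a gray image
    via the inverse FFT, as the method does repeatedly."""
    return np.real(np.fft.ifft2(np.fft.ifftshift(f_shifted)))
```

With no mask applied, the round trip reproduces the input image up to floating-point error.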
Step S3: and taking the center of the original spectrogram as a circle center, and making a plurality of concentric circles with different radiuses to obtain a multilayer spectrogram:
the pixel values of the pixels in the image (in this embodiment, the pixel values are gray values) change fast to be the high frequency part, and change slowly to be the low frequency part, so that the image edge information is in the high frequency part of the image, that is, the image needs to be divided, so as to find out the high frequency image representing the edge.
As shown in fig. 8, a plurality of concentric circles with different radii are drawn around the center of the original spectrogram, yielding a multilayer spectrogram. Each radius, as well as the number of concentric circles, is set according to actual needs; to improve the accuracy of subsequent data processing, the number of concentric circles should not be too small. In this embodiment, n-1 circles are set and, in order of increasing radius, defined as: circle 1, circle 2, circle 3, …, circle n-1. This embodiment defines the layers as follows: the circular region enclosed by the 1st circle is the 1st layer spectrogram, the annular region between the 1st and 2nd circles is the 2nd layer spectrogram, the annular region between the 2nd and 3rd circles is the 3rd layer spectrogram, the annular region between the 3rd and 4th circles is the 4th layer spectrogram, …, the annular region between the (n-2)th and (n-1)th circles is the (n-1)th layer spectrogram, and the remaining region outside the (n-1)th circle is the nth layer spectrogram. That is, from the center to the edge, the original spectrogram is divided into the 1st, 2nd, 3rd, 4th, …, (n-1)th and nth layer spectrograms. For the (n-1)th layer spectrogram, the previous layer is the (n-2)th layer spectrogram and the next layer is the nth layer spectrogram.
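The layering just described amounts to partitioning the centred spectrogram into radial bands, which might be sketched as:

```python
import numpy as np

def layer_masks(shape, radii):
    """Step S3: partition the centred spectrogram into layers with n-1
    concentric circles of the given (ascending) radii. Layer 1 is the
    central disc, layers 2..n-1 are rings, and layer n is everything
    outside the last circle."""
    h, w = shape
    cy, cx = h / 2.0, w / 2.0
    yy, xx = np.ogrid[:h, :w]
    dist = np.sqrt((yy - cy) ** 2 + (xx - cx) ** 2)
    bounds = [0.0] + sorted(radii) + [np.inf]
    return [(dist >= bounds[i]) & (dist < bounds[i + 1])
            for i in range(len(bounds) - 1)]
```

Every spectrum pixel falls in exactly one layer, so superimposing all layers recovers the whole original spectrogram.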
Step S4: obtaining a pixel number difference index of each initial high-frequency spectrogram and the first edge image according to the pixel number difference between the gray level image converted back by each layer of spectrogram and the first edge image:
for any layer of spectrogram, the processing is performed according to the following processing procedure, and the processing sequence of each layer of spectrogram is set according to actual needs, which may be performed according to the sequence from the layer 1 spectrogram, the layer 2 spectrogram, the layer 3 spectrogram, the layer 4 spectrogram, … …, the layer n-1 spectrogram and the layer n spectrogram, or according to the sequence from the layer n spectrogram, the layer n-1 spectrogram, … …, the layer 4 spectrogram, the layer 3 spectrogram, the layer 2 spectrogram and the layer 1 spectrogram, or according to other arrangement sequences. In this embodiment, the sequence is from the nth layer spectrogram, the n-1 st layer spectrogram, … …, the 4 th layer spectrogram, the 3 rd layer spectrogram, the 2 nd layer spectrogram and the 1 st layer spectrogram.
The first object to be processed is the nth layer spectrogram, which is converted back into a gray image through the inverse Fourier transform, giving the gray image corresponding to the nth layer spectrogram. Then, the gray image converted back from the nth layer spectrogram and the first edge image are each binarized. In this embodiment a gray threshold of 127 is set: pixels of the gray image whose value is smaller than 127 are set to 0, and pixels whose value is greater than or equal to 127 are set to 255, yielding a binary image, the first binary image. Likewise, pixels of the first edge image whose value is smaller than 127 are set to 0 and pixels whose value is greater than or equal to 127 are set to 255, yielding a binary image, the second binary image.
The number of pixels in the first binary image, that is, the number of pixels having a pixel value of 255 is obtained, and similarly, the number of pixels in the second binary image is obtained. A pixel number error value is calculated for the number of pixels of the first binary image and the second binary image. The error value may be a difference value, or may be calculated by using the following calculation formula:
S = |S1 - S2| / S2
where S is the error value, S1 is the number of pixels in the first binary image, and S2 is the number of pixels in the second binary image.
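The binarization at threshold 127 and the pixel-count error might be sketched as follows; reading the error as the relative count difference |S1 - S2| / S2 is an assumption, since the original formula appears only as an image in this text.

```python
import numpy as np

def binarize(gray, threshold=127):
    """Binarize as in the embodiment: pixels below 127 become 0, all
    others become 255."""
    return np.where(np.asarray(gray) < threshold, 0, 255)

def pixel_count_error(bin_a, bin_b):
    """Relative error between the counts of 255-valued pixels in the two
    binary images (assumed reading of the error formula)."""
    s1 = np.count_nonzero(bin_a == 255)
    s2 = np.count_nonzero(bin_b == 255)
    return abs(s1 - s2) / s2
```

For example, a binary image with 2 white pixels against one with 4 gives an error of 0.5.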
A pixel number error value threshold is preset, its specific value being set according to actual needs. The pixel number error value S is compared with the preset pixel number error value threshold. If S is less than or equal to the threshold, the difference between the pixel numbers of the gray image converted back from the nth layer spectrogram and of the first edge image is small, and the nth layer spectrogram is determined to be an initial high-frequency spectrogram. If S is greater than the threshold, the difference is large; the nth layer spectrogram is then superimposed with the adjacent previous layer, i.e. the (n-1)th layer spectrogram, to obtain a superimposed spectrogram, defined as the first superimposed spectrogram. The pixel number error value between the gray image converted back from the first superimposed spectrogram and the first edge image is recalculated (the conversion back to a gray image and the error calculation both follow the manner given above) and compared with the threshold. If the new error value is less than or equal to the threshold, the difference is small and the first superimposed spectrogram is determined to be the initial high-frequency spectrogram corresponding to the nth layer spectrogram. If it is greater than the threshold, the first superimposed spectrogram is superimposed with the adjacent previous layer (i.e. the (n-2)th layer spectrogram) to obtain the second superimposed spectrogram, and the error value between the gray image converted back from the second superimposed spectrogram and the first edge image is recalculated and compared with the threshold; if it is still greater, the second superimposed spectrogram is superimposed with the adjacent previous layer (i.e. the (n-3)th layer spectrogram) to obtain the third superimposed spectrogram, and so on. The cycle continues until the obtained pixel number error value is less than or equal to the preset threshold, at which point the currently obtained superimposed spectrogram is determined to be the initial high-frequency spectrogram corresponding to the nth layer spectrogram.
Thus, comparison and spectrogram superposition are performed repeatedly until the pixel number difference between the gray image converted back from the superimposed spectrogram and the first edge image is small enough, giving the initial high-frequency spectrogram corresponding to the nth layer spectrogram.
And then processing the n-1 layer spectrogram according to the processing process to obtain an initial high-frequency spectrogram corresponding to the n-1 layer spectrogram. And finally obtaining initial high-frequency spectrograms corresponding to the spectrograms of all layers, and then obtaining n initial high-frequency spectrograms.
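The iterative superposition just described might be sketched as below. The layer masks and the |S1 - S2| / S2 error reading are assumptions carried over from the earlier steps, and the reconstruction from the masked spectrum is done inline so the sketch stands alone.

```python
import numpy as np

def initial_high_freq_mask(f_shifted, masks, k, edge_bin, err_thr):
    """Step S4 loop: start from layer k of the centred spectrum; while
    the pixel-count error of the reconstructed gray image against the
    first edge image exceeds err_thr, superimpose the adjacent inner
    layer; return the accepted (possibly superimposed) mask."""
    def err(mask):
        img = np.real(np.fft.ifft2(np.fft.ifftshift(f_shifted * mask)))
        bin_img = np.where(img < 127, 0, 255)     # threshold 127, as in the text
        s1 = np.count_nonzero(bin_img == 255)
        s2 = np.count_nonzero(edge_bin == 255)
        return abs(s1 - s2) / s2                  # assumed error formula
    mask = masks[k].astype(float)
    j = k
    while err(mask) > err_thr and j > 0:          # fold in layers toward the centre
        j -= 1
        mask = mask + masks[j]
    return mask
```

For a constant image, whose spectrum is concentrated at the centre, the loop keeps folding in inner layers until the DC layer is included and the whole spectrum is accepted.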
The pixel number difference index of each initial high-frequency spectrogram relative to the first edge image is then calculated. For any initial high-frequency spectrogram, it is converted back into a gray image and binarized to obtain a third binary image; the pixel number error value between the pixel numbers of the third binary image and the second binary image is then calculated, giving the pixel number difference index of that initial high-frequency spectrogram. In this way the pixel number difference index of every initial high-frequency spectrogram is obtained.
Step S5: calculating a gray level histogram difference index of the gray level histograms of the initial high-frequency spectrograms and the first edge image:
and converting each initial high-frequency spectrogram back to a gray image, wherein the specific conversion process is not repeated.
A gray level histogram of the gray-scale image converted back from each initial high-frequency spectrogram is acquired. It should be understood that the size of each gray-level interval, and the maximum and minimum pixel values of each interval, are set according to actual needs. Similarly, a gray level histogram of the first edge image is obtained. It should be understood that the gray-scale images converted back from the initial high-frequency spectrograms and the first edge image are divided using the same gray-level intervals.
The difference value of the number of pixels at each gray level is calculated between the gray level histogram of the gray-scale image converted back from each initial high-frequency spectrogram and the gray level histogram of the first edge image. That is, for any initial high-frequency spectrogram, the per-level pixel-count difference between the two histograms is computed. The difference value may be taken directly, or calculated according to the following formula:

D_a = |N_a^(1) - N_a^(2)| / (a_max - a_min)

where N_a^(1) and N_a^(2) are the numbers of pixels at gray level a in the gray level histogram of the gray-scale image converted back from the initial high-frequency spectrogram and in the gray level histogram of the first edge image, respectively, and a_max and a_min are the maximum gray value and the minimum gray value of gray level a.

The above formula gives the pixel-count difference value for gray level a; the difference value of each gray level is obtained by analogy. For any level in the gray histogram, the smaller the pixel-count difference value, the more similar the two images are at that level.
Then, the sum of the difference values over all gray levels is calculated to obtain the gray histogram difference index D between the gray level histogram of the initial high-frequency spectrogram and that of the first edge image. A gray histogram difference index is thus obtained for each initial high-frequency spectrogram.
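A minimal sketch of the gray histogram difference index D, assuming 8-bit gray values split into equal-width levels shared by both images; the normalization of each per-level count difference by the level's gray-value span follows one plausible reading of the formula above, and the level count of 16 is an illustrative default.

```python
import numpy as np

def histogram_difference_index(gray1, gray2, n_levels=16):
    """Gray histogram difference index D: per-level pixel-count
    differences, normalized by each level's gray-value span (an assumed
    reading of the formula), summed over all levels."""
    edges = np.linspace(0, 256, n_levels + 1)   # shared level boundaries
    h1, _ = np.histogram(gray1, bins=edges)
    h2, _ = np.histogram(gray2, bins=edges)
    spans = edges[1:] - edges[:-1]              # a_max - a_min per level
    return float(np.sum(np.abs(h1 - h2) / spans))
```

Identical images give D = 0; the larger the per-level count mismatch, the larger D becomes.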
Step S6: selecting a gray histogram difference index smaller than a preset gray histogram difference index threshold value to obtain a candidate high-frequency spectrogram, and for any candidate high-frequency spectrogram, replacing a pixel point at a corresponding position in the first edge image with any pixel point in a gray image converted back by the candidate high-frequency spectrogram to obtain an adjacent pixel value difference index between the pixel point before and after replacement and an adjacent pixel point:
a gray level histogram difference index threshold is preset according to actual needs. After the gray histogram difference index D between each initial high-frequency spectrogram and the first edge image is obtained, each D is compared with the preset threshold. The initial high-frequency spectrograms whose gray histogram difference index is smaller than the preset threshold are taken as candidate high-frequency spectrograms. Setting the number of candidate high-frequency spectrograms as m, then m ≤ n.
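The candidate selection step above is a simple threshold filter; a sketch under the assumption that the spectrograms and their histogram difference indices are held in parallel lists (names illustrative):

```python
def select_candidates(spectrograms, hist_indices, d_thresh):
    """Keep the initial high-frequency spectrograms whose gray histogram
    difference index is below the preset threshold (m <= n candidates)."""
    return [s for s, d in zip(spectrograms, hist_indices) if d < d_thresh]
```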
For any candidate high-frequency spectrogram, each pixel point in the gray-scale image converted back from it replaces the pixel point at the corresponding position in the first edge image, and an adjacent pixel value difference index X between the pixel point before and after replacement and its adjacent pixel points is obtained. As a specific embodiment, a specific implementation process is given as follows:
Since the gray-scale image converted back from each candidate high-frequency spectrogram and the first edge image are identical in size and related parameters, each pixel point in the converted gray-scale image corresponds one-to-one with a pixel point in the first edge image. For any pixel point in the gray-scale image converted back from the candidate high-frequency spectrogram, the pixel point at the same position in the first edge image can therefore be obtained, together with its four neighboring pixel points in the first edge image. If the pixel point at the corresponding position in the first edge image is p(i, j), its four neighboring pixel points are p(i-1, j), p(i+1, j), p(i, j-1) and p(i, j+1). The sum of squares S of the differences between the pixel value of p(i, j) and the pixel values of its four neighboring pixel points is then calculated as:

S = (v(i,j) - v(i-1,j))^2 + (v(i,j) - v(i+1,j))^2 + (v(i,j) - v(i,j-1))^2 + (v(i,j) - v(i,j+1))^2

where v(i, j) is the pixel value of pixel point p(i, j), and v(i-1, j), v(i+1, j), v(i, j-1) and v(i, j+1) are the pixel values of the four neighboring pixel points.
Then, the pixel point in the gray-scale image converted back from the candidate high-frequency spectrogram replaces the pixel point at the corresponding position in the first edge image; that is, the pixel point at the corresponding position in the first edge image is deleted and the pixel point from the converted gray-scale image is added at that position, or in other words, the pixel value at the corresponding position in the first edge image is replaced by the pixel value of the pixel point in the converted gray-scale image. After the replacement, the sum of squares S' of the differences between the pixel value of the replaced pixel point and the pixel values of its four neighboring pixel points in the first edge image is calculated in the same way as above, giving the post-replacement sum of squares of pixel value differences.
The difference between the pre-replacement sum of squares S and the post-replacement sum of squares S' is calculated to obtain the adjacent pixel value difference of that pixel point. The average of these differences over all pixel points in the gray-scale image converted back from the candidate high-frequency spectrogram then gives the adjacent pixel value difference index X:

X = (1/P) * Σ_{k=1}^{P} |S_k - S'_k|

where P is the number of pixel points and S_k and S'_k are the pre- and post-replacement sums of squares for the kth pixel point.
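A minimal sketch of the adjacent pixel value difference index X, assuming float gray arrays and restricting the loop to interior pixels so that all four neighbors exist (boundary handling is not specified in the text, and the absolute value of the per-pixel difference is likewise an assumption):

```python
import numpy as np

def neighbor_square_sum(img, i, j):
    """Sum of squared differences between pixel (i, j) and its four neighbors."""
    v = img[i, j]
    return float((v - img[i-1, j])**2 + (v - img[i+1, j])**2
                 + (v - img[i, j-1])**2 + (v - img[i, j+1])**2)

def adjacent_difference_index(cand_gray, edge_img):
    """Average, over interior pixels, of the change in the neighbor
    square-sum when the candidate pixel replaces the edge-image pixel."""
    h, w = edge_img.shape
    diffs = []
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            before = neighbor_square_sum(edge_img, i, j)
            patched = edge_img.copy()
            patched[i, j] = cand_gray[i, j]   # replace with candidate pixel
            after = neighbor_square_sum(patched, i, j)
            diffs.append(abs(before - after))
    return float(np.mean(diffs))
```

When the candidate image equals the first edge image, every replacement leaves the square-sum unchanged and X is 0.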
step S7: acquiring an optimal high-frequency spectrogram selection index of each candidate high-frequency spectrogram according to the pixel number difference index, the gray histogram difference index, the adjacent pixel value difference index and the pixel value difference average value index of each candidate high-frequency spectrogram and the first edge image:
for any candidate high-frequency spectrogram, the difference between the pixel value of each pixel point in the gray-scale image converted back from it and the pixel value of the pixel point at the corresponding position in the first edge image is calculated, and the average of these differences over all pixel points gives the pixel value difference average index G corresponding to the candidate high-frequency spectrogram:

G = (1/P) * Σ_{(i,j)} |v_c(i,j) - v_e(i,j)|

where v_c(i, j) is the pixel value of pixel point p(i, j) in the gray-scale image converted back from the candidate high-frequency spectrogram, v_e(i, j) is the pixel value of the pixel point at the corresponding position in the first edge image, and P is the number of pixel points in the gray-scale image converted back from the candidate high-frequency spectrogram and in the first edge image.
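The pixel value difference average index G reduces to a mean absolute difference over corresponding pixels; a minimal sketch assuming float gray arrays of equal shape:

```python
import numpy as np

def mean_pixel_difference(cand_gray, edge_img):
    """Pixel value difference average index G: mean absolute difference
    of corresponding pixel values in the two images."""
    return float(np.mean(np.abs(cand_gray.astype(float) - edge_img.astype(float))))
```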
According to the pixel number difference index, the gray histogram difference index, the adjacent pixel value difference index and the pixel value difference average index of each candidate high-frequency spectrogram relative to the first edge image, the optimal high-frequency spectrogram selection index of each candidate high-frequency spectrogram is obtained; as one embodiment, the selection index U is the product of the four indicators. It should be understood that each of these four indicators reflects the degree of image similarity to some extent, and the smaller its value, the greater the similarity; that is, the smaller the optimal high-frequency spectrogram selection index U, the more similar the images. As another embodiment, provided this logical relationship is preserved, U may also be calculated in another manner, for example by summing the four indicators, or by any other operation that reflects the same positive correlation.
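The product formulation of the selection index U given in claim 1 is a one-liner; a smaller U means a more similar image:

```python
def selection_index(d_count, d_hist, d_adjacent, d_mean):
    """Optimal high-frequency spectrogram selection index U: the product
    of the four difference indicators (per claim 1)."""
    return d_count * d_hist * d_adjacent * d_mean
```

Because U is a product, any one indicator being near zero (near-perfect agreement in that dimension) drives U toward zero; a summation variant, as the text notes, preserves the same monotonic relationship.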
Step S8: determining the candidate high-frequency spectrogram corresponding to the minimum optimal high-frequency spectrogram selection index as an optimal high-frequency spectrogram, and integrating the optimal high-frequency spectrogram and the first edge image to obtain a final edge image:
and each candidate high-frequency spectrogram obtains a corresponding optimal high-frequency spectrogram selection index U, the minimum optimal high-frequency spectrogram selection index U is selected from the optimal high-frequency spectrogram selection indexes U, and the minimum optimal high-frequency spectrogram selection index U represents the highest similarity.
And determining the candidate high-frequency spectrogram corresponding to the minimum optimal high-frequency spectrogram selection index as the optimal high-frequency spectrogram. And then, integrating the optimal high-frequency spectrogram and the first edge image to obtain a final edge image. As a specific embodiment, the following gives a specific implementation of the integration:
The gray-scale image converted back from the optimal high-frequency spectrogram is binarized to obtain a first initial binary matrix (a binary image being equivalent to a binary matrix), and the first edge image is binarized to obtain a second initial binary matrix.
An intersection operation is performed on the first initial binary matrix and the second initial binary matrix to obtain a first target binary matrix, and an exclusive-or operation is performed on them to obtain a second target binary matrix. The intersection operation computes the AND of corresponding elements of the two matrices: the result is 1 only when both elements are 1, and 0 when either element is 0. The image pixels corresponding to the first target binary matrix can confidently be regarded as pixels forming the edge of the image, so the image corresponding to the first target binary matrix is set as the initial edge image, and every pixel point in the initial edge image is an edge pixel point. The exclusive-or operation yields 1 when the two elements differ and 0 when they are the same; it picks out the pixels whose values differ greatly between the two images, for which it remains to be determined whether they are edge pixels. The image corresponding to the second target binary matrix is therefore set as the blurred edge image.
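The intersection and exclusive-or steps map directly onto element-wise boolean operations on the two binary matrices; a minimal sketch assuming numpy uint8 binary images (function name illustrative):

```python
import numpy as np

def split_edges(opt_binary, edge_binary):
    """Intersection -> pixels confidently on the edge (initial edge image);
    XOR -> pixels present in only one image (blurred edge image)."""
    initial = np.logical_and(opt_binary, edge_binary).astype(np.uint8)
    blurred = np.logical_xor(opt_binary, edge_binary).astype(np.uint8)
    return initial, blurred
```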
For any pixel point in the blurred edge image, the corresponding position of that pixel point in the initial edge image is found, and a preset range centered on that position is set. The shape of the preset range is chosen according to actual needs, such as a circle or a square; in this embodiment, the preset range is 5 × 5. The edge pixel points within the preset range are then found; it should be understood that their number is not fixed and may be 1, several, or even 0. The average of the pixel value differences between the pixel point and each edge pixel point in the preset range is calculated: the smaller this average, the higher the possibility that the pixel point is an edge pixel point. A difference average threshold is therefore preset, its size set according to actual needs. If the average of the pixel value differences is less than or equal to the preset difference average threshold, the pixel point in the blurred edge image is determined to be an edge pixel point and is added to the corresponding position in the initial edge image, yielding the final edge image. Adding the pixel point to the corresponding position in the initial edge image may be implemented by deleting the pixel point at that position and inserting the new pixel point, or by modifying the pixel value at that position to the pixel value of the new pixel point.
Of course, if the average of the pixel value differences is greater than the preset difference average threshold, the pixel point in the blurred edge image is determined not to be an edge pixel point. The detection is complete once all pixel points in the blurred edge image have been examined, resulting in the final edge image.
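The refinement loop above can be sketched as follows, assuming numpy binary masks for the initial and blurred edge images and a gray image supplying pixel values; the handling of blurred pixels with no edge pixels in range (left undecided) is an assumption, as the text does not specify it.

```python
import numpy as np

def refine_edges(initial_edge, blurred_edge, gray, window=5, avg_thresh=10.0):
    """For each blurred-edge pixel, compare its gray value with the edge
    pixels inside a window x window neighborhood of the initial edge
    image; if the mean absolute difference is small enough, promote it."""
    result = initial_edge.copy()
    r = window // 2
    h, w = initial_edge.shape
    for i, j in zip(*np.nonzero(blurred_edge)):
        i0, i1 = max(0, i - r), min(h, i + r + 1)   # clip window to image
        j0, j1 = max(0, j - r), min(w, j + r + 1)
        ys, xs = np.nonzero(initial_edge[i0:i1, j0:j1])
        if ys.size == 0:
            continue                 # no edge pixels nearby: leave undecided
        diffs = np.abs(gray[i, j] - gray[i0 + ys, j0 + xs])
        if diffs.mean() <= avg_thresh:
            result[i, j] = 1         # accept as an edge pixel
    return result
```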
The embodiment also provides an edge detection system based on image multidimensional analysis, which comprises a processor and a memory, wherein the processor is used for processing instructions stored in the memory to implement the edge detection method based on image multidimensional analysis, as shown in fig. 9.
Corresponding to the edge detection method based on image multidimensional analysis described in the above embodiments, the present embodiment also provides an edge detection apparatus based on image multidimensional analysis, and for convenience of description, only the relevant parts of the embodiments of the present application are shown.
Referring to fig. 10, the edge detection apparatus based on image multidimensional analysis includes:
the first edge image acquisition module is used for detecting the edge of the target image through a Canny edge detection algorithm to obtain a first edge image;
the original spectrogram acquisition module is used for carrying out frequency domain analysis on the target image to obtain an original spectrogram;
the multilayer spectrogram acquisition module is used for making a plurality of concentric circles with different radiuses by taking the center of the original spectrogram as a circle center to obtain a multilayer spectrogram;
a pixel number difference index obtaining module, configured to obtain a pixel number difference index between each initial high-frequency spectrogram and the first edge image according to a pixel number difference between the grayscale image converted back by each layer of spectrogram and the first edge image;
a gray histogram difference index obtaining module, configured to calculate a gray histogram difference index between each initial high-frequency spectrogram and a gray histogram of the first edge image;
the adjacent pixel value difference index acquisition module is used for selecting a gray histogram difference index smaller than a preset gray histogram difference index threshold value to obtain a candidate high-frequency spectrogram, and for any one candidate high-frequency spectrogram, any one pixel point in a gray image converted back by the candidate high-frequency spectrogram replaces a pixel point at a corresponding position in the first edge image to acquire an adjacent pixel value difference index between the pixel point before and after replacement and the adjacent pixel point;
the optimal high-frequency spectrogram selection index acquisition module is used for acquiring an optimal high-frequency spectrogram selection index of each candidate high-frequency spectrogram according to the pixel number difference index, the gray level histogram difference index, the adjacent pixel value difference index and the pixel value difference average value index of each candidate high-frequency spectrogram and the first edge image;
and the final edge image acquisition module is used for determining the candidate high-frequency spectrogram corresponding to the minimum optimal high-frequency spectrogram selection index as the optimal high-frequency spectrogram and integrating the optimal high-frequency spectrogram and the first edge image to obtain a final edge image.
It should be noted that, for the information interaction, execution process, and other contents between the above-mentioned devices/units, the specific functions and technical effects thereof are based on the same concept as those of the embodiment of the method of the present application, and specific reference may be made to the part of the embodiment of the method, which is not described herein again.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-mentioned division of the functional units and modules is illustrated, and in practical applications, the above-mentioned function distribution may be performed by different functional units and modules according to needs, that is, the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-mentioned functions. Each functional unit and module in the embodiments may be integrated in one processing unit, or each unit may exist alone physically, or two or more units are integrated in one unit, and the integrated unit may be implemented in a form of hardware, or in a form of software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working processes of the units and modules in the system may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
The embodiment of the present application further provides a computer-readable storage medium, where a computer program is stored, and when the computer program is executed by a processor, the computer program can implement the steps in the above method embodiments.
The embodiment of the present application provides a computer program product, which when running on a mobile terminal, enables the mobile terminal to implement the steps in the above method embodiments when executed.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, all or part of the processes in the methods of the embodiments described above can be implemented by a computer program, which can be stored in a computer-readable storage medium and can implement the steps of the embodiments of the methods described above when the computer program is executed by a processor. Wherein the computer program comprises computer program code, which may be in the form of source code, object code, an executable file or some intermediate form, etc. The computer readable medium may include at least: any entity or device capable of carrying computer program code to a photographing apparatus/terminal apparatus, a recording medium, computer Memory, Read-Only Memory (ROM), Random Access Memory (RAM), an electrical carrier signal, a telecommunications signal, and a software distribution medium. Such as a usb-disk, a removable hard disk, a magnetic or optical disk, etc. In certain jurisdictions, computer-readable media may not be an electrical carrier signal or a telecommunications signal in accordance with legislative and patent practice.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed system/apparatus and method may be implemented in other ways. For example, the above-described system/apparatus embodiments are merely illustrative, and for example, the division of the modules or units is only one logical division, and there may be other divisions in actual implementation, for example, multiple units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be through some interfaces, and the indirect coupling or communication connection of the modules may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
The above-mentioned embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present application and are intended to be included within the scope of the present application.

Claims (6)

1. An edge detection method based on image multi-dimensional analysis is characterized by comprising the following steps:
detecting the edge of a target image through a Canny edge detection algorithm to obtain a first edge image;
performing frequency domain analysis on the target image to obtain an original spectrogram;
taking the center of the original spectrogram as a circle center, and making a plurality of concentric circles with different radiuses to obtain a multilayer spectrogram;
obtaining a pixel number difference index of each initial high-frequency spectrogram and the first edge image according to the pixel number difference between the gray level image converted back by each layer of spectrogram and the first edge image;
calculating a gray level histogram difference index of the gray level histograms of the initial high-frequency spectrograms and the first edge image;
selecting a gray histogram difference index smaller than a preset gray histogram difference index threshold value to obtain a candidate high-frequency spectrogram, and for any one candidate high-frequency spectrogram, replacing a pixel point at a corresponding position in the first edge image with any one pixel point in a gray image converted back by the candidate high-frequency spectrogram to obtain an adjacent pixel value difference index between the pixel point before and after replacement and an adjacent pixel point;
acquiring an optimal high-frequency spectrogram selection index of each candidate high-frequency spectrogram according to the pixel number difference index, the gray histogram difference index, the adjacent pixel value difference index and the pixel value difference average value index of each candidate high-frequency spectrogram and the first edge image;
determining the candidate high-frequency spectrogram corresponding to the minimum optimal high-frequency spectrogram selection index as an optimal high-frequency spectrogram, and integrating the optimal high-frequency spectrogram and the first edge image to obtain a final edge image;
the obtaining of the pixel number difference index between each initial high-frequency spectrogram and the first edge image according to the pixel number difference between the gray-scale image converted back by each layer of spectrogram and the first edge image includes:
for any layer of spectrogram, respectively binarizing the grayscale image converted from the layer of spectrogram and the first edge image to obtain a first binary image and a second binary image;
calculating a pixel number error value of the pixel numbers of the first binary image and the second binary image;
comparing the pixel quantity error value with a preset pixel quantity error value threshold, and if the pixel quantity error value is less than or equal to the preset pixel quantity error value threshold, determining that the layer of spectrogram is the initial high-frequency spectrogram; if the pixel number error value is greater than the preset pixel number error value threshold, overlapping the layer of spectrogram with an adjacent previous layer of spectrogram to obtain a first overlapped spectrogram, recalculating a pixel number error value between the gray-scale image converted back from the first overlapped spectrogram and the first edge image, comparing the pixel number error value with the preset pixel number error value threshold, if the calculated new pixel number error value is greater than the preset pixel number error value threshold, overlapping the first overlapped spectrogram with the adjacent previous layer of spectrogram to obtain a second overlapped spectrogram, recalculating a pixel number error value between the gray-scale image converted back from the second overlapped spectrogram and the first edge image again, and continuously circulating until the obtained pixel number error value is less than or equal to the preset pixel number error value threshold, judging that the currently obtained superposition spectrogram is the initial high-frequency spectrogram;
calculating a difference index of the number of pixels of each initial high-frequency spectrogram and the first edge image;
the obtaining of the optimal high-frequency spectrogram selection index of each candidate high-frequency spectrogram according to the pixel number difference index, the gray histogram difference index, the adjacent pixel value difference index and the pixel value difference average value index of each candidate high-frequency spectrogram and the first edge image includes:
for any candidate high-frequency spectrogram, calculating the product of a pixel number difference index, a gray level histogram difference index, an adjacent pixel value difference index and a pixel value difference average value index corresponding to the candidate high-frequency spectrogram to obtain an optimal high-frequency spectrogram selection index of the candidate high-frequency spectrogram;
the integrating the optimal high-frequency spectrogram and the first edge image to obtain a final edge image comprises:
obtaining a binary image of the gray level image converted from the optimal high-frequency spectrogram to obtain a first initial binary matrix and a binary image of the first edge image to obtain a second initial binary matrix;
performing an intersection operation and an exclusive-or operation on the first initial binary matrix and the second initial binary matrix respectively to obtain a first target binary matrix and a second target binary matrix, setting an image corresponding to the first target binary matrix as an initial edge image, setting each pixel point in the initial edge image as an edge pixel point, and setting the image corresponding to the second target binary matrix as a blurred edge image;
and for any pixel point in the blurred edge image, calculating the average value of the pixel value difference values of the pixel point and each edge pixel point in the preset range within a preset range with the corresponding position of the pixel point in the initial edge image as the center, if the average value of the pixel value difference values is less than or equal to a preset difference value average value threshold value, judging that the pixel point in the blurred edge image is an edge pixel point, and adding the pixel point to the corresponding position of the pixel point in the initial edge image to obtain the final edge image.
2. The method according to claim 1, wherein the calculating the difference indicator of the number of pixels between each of the initial high frequency spectrograms and the first edge image comprises:
for any initial high-frequency spectrogram, converting the initial high-frequency spectrogram into a gray image and binarizing it to obtain a third binary image;
and calculating the error between the pixel counts of the third binary image and the second binary image to obtain the pixel number difference index of the initial high-frequency spectrogram.
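A minimal sketch of claim 2 in Python. The binarization threshold and the use of an absolute count difference for the unspecified "pixel quantity error value" are assumptions:

```python
import numpy as np

def pixel_number_difference(gray_hf, bin_edge, thresh=128):
    """Pixel number difference index: binarize the gray image converted back
    from an initial high-frequency spectrogram (third binary image) and
    compare its foreground pixel count with that of the second binary image
    of the first edge image."""
    third_binary = (gray_hf >= thresh).astype(np.uint8)
    return abs(int(third_binary.sum()) - int(bin_edge.sum()))
```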
3. The edge detection method based on image multidimensional analysis according to claim 1, wherein calculating the gray level histogram difference index between each initial high-frequency spectrogram and the first edge image comprises:
converting each initial high-frequency spectrogram back into a gray image;
acquiring the gray level histogram of the gray image converted back from each initial high-frequency spectrogram and the gray level histogram of the first edge image;
for each gray level, calculating the difference in pixel count between the gray level histogram of the gray image converted back from each initial high-frequency spectrogram and the gray level histogram of the first edge image;
and summing the pixel count differences over all gray levels to obtain the gray level histogram difference index between each initial high-frequency spectrogram and the first edge image.
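The histogram comparison of claim 3 can be sketched as follows. Taking the absolute value of each per-level difference before summing is an assumption; the claim only says "difference value":

```python
import numpy as np

def histogram_difference(gray_hf, gray_edge, levels=256):
    """Gray level histogram difference index: per-level pixel-count
    difference between the two histograms, summed over all gray levels."""
    h1, _ = np.histogram(gray_hf, bins=levels, range=(0, levels))
    h2, _ = np.histogram(gray_edge, bins=levels, range=(0, levels))
    return int(np.abs(h1 - h2).sum())
```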
4. The edge detection method based on image multidimensional analysis according to claim 1, wherein the step of replacing, with any pixel point in the gray image converted back from the candidate high-frequency spectrogram, the pixel point at the corresponding position in the first edge image, and obtaining the adjacent pixel value difference index between the pixel points before and after the replacement and their adjacent pixel points, comprises:
for any pixel point in the gray image converted back from the candidate high-frequency spectrogram, calculating the sum of squares of the differences between the pixel value of the pixel point at the corresponding position in the first edge image and the pixel values of its four surrounding adjacent pixel points, to obtain the sum of squares of pixel value differences before replacement;
after the pixel point in the gray image converted back from the candidate high-frequency spectrogram replaces the pixel point at the corresponding position in the first edge image, calculating the sum of squares of the differences between its pixel value and the pixel values of the four surrounding adjacent pixel points, to obtain the sum of squares of pixel value differences after replacement;
calculating the difference between the sum of squares of pixel value differences before replacement and the sum of squares of pixel value differences after replacement, to obtain the adjacent pixel value difference of the pixel point in the gray image converted back from the candidate high-frequency spectrogram;
and calculating the average of the adjacent pixel value differences of all pixel points in the gray image converted back from the candidate high-frequency spectrogram, to obtain the adjacent pixel value difference index.
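The four steps of claim 4 can be sketched as follows. Border handling is an assumption (only in-image 4-neighbours are used; the claim does not say how image borders are treated):

```python
import numpy as np

def adjacent_pixel_difference_index(gray_hf, edge_img):
    """Adjacent pixel value difference index: for each pixel, the difference
    between the 4-neighbour sum of squared differences before replacement
    (edge-image pixel) and after replacement (spectrogram gray pixel),
    averaged over all pixels."""
    h, w = edge_img.shape
    diffs = []
    for y in range(h):
        for x in range(w):
            nbrs = [(y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)]
            nbrs = [(ny, nx) for ny, nx in nbrs if 0 <= ny < h and 0 <= nx < w]
            # Sum of squared differences before replacement.
            before = sum((int(edge_img[y, x]) - int(edge_img[ny, nx])) ** 2
                         for ny, nx in nbrs)
            # Sum of squared differences after replacement.
            after = sum((int(gray_hf[y, x]) - int(edge_img[ny, nx])) ** 2
                        for ny, nx in nbrs)
            diffs.append(before - after)
    return float(np.mean(diffs))
```

When the spectrogram pixel equals the edge-image pixel, the before and after sums coincide and the contribution is zero, so identical images yield an index of 0.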
5. The edge detection method based on image multidimensional analysis according to claim 1, wherein the calculation of the pixel value difference average index comprises:
for any candidate high-frequency spectrogram, calculating the difference between the pixel value of each pixel point in the gray image converted back from the candidate high-frequency spectrogram and the pixel value of the pixel point at the corresponding position in the first edge image, and then calculating the average of the pixel value differences of all pixel points to obtain the pixel value difference average index.
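Claim 5 reduces to a mean pixel-wise difference. Using the absolute difference is an assumption where the claim only says "difference value":

```python
import numpy as np

def pixel_value_difference_average(gray_hf, edge_img):
    """Pixel value difference average index: mean absolute per-pixel
    difference between the spectrogram gray image and the first edge image."""
    return float(np.mean(np.abs(gray_hf.astype(np.int32) -
                                edge_img.astype(np.int32))))
```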
6. An edge detection system based on image multidimensional analysis, characterized by comprising a processor and a memory, wherein the processor is configured to execute instructions stored in the memory to implement the edge detection method based on image multidimensional analysis according to any one of claims 1-5.
CN202111586499.5A 2021-12-23 2021-12-23 Edge detection method and system based on image multidimensional analysis Active CN113989313B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111586499.5A CN113989313B (en) 2021-12-23 2021-12-23 Edge detection method and system based on image multidimensional analysis

Publications (2)

Publication Number Publication Date
CN113989313A CN113989313A (en) 2022-01-28
CN113989313B true CN113989313B (en) 2022-03-22

Family

ID=79734105

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111586499.5A Active CN113989313B (en) 2021-12-23 2021-12-23 Edge detection method and system based on image multidimensional analysis

Country Status (1)

Country Link
CN (1) CN113989313B (en)

Families Citing this family (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115082441B (en) * 2022-07-22 2022-11-11 山东微山湖酒业有限公司 Retort material tiling method in wine brewing distillation process based on computer vision
CN114937212B (en) * 2022-07-26 2022-11-11 南通华锐软件技术有限公司 Aerial photography road type identification method based on frequency domain space conversion
CN114998343A (en) * 2022-08-04 2022-09-02 南通广信塑料机械有限公司 Mold surface polishing degree detection method based on vision
CN115131387B (en) * 2022-08-25 2023-01-24 山东鼎泰新能源有限公司 Gasoline engine spray wall collision parameter automatic extraction method and system based on image processing
CN115115864B (en) * 2022-08-26 2022-11-15 济宁安泰矿山设备制造有限公司 Dynamic balance testing method and system for pump precision rotor shaft
CN115266536B (en) * 2022-09-26 2022-12-13 南通钧儒卫生用品有限公司 Method for detecting water absorption performance of paper diaper
CN115661122B (en) * 2022-11-14 2024-01-12 南京图格医疗科技有限公司 Image grid pattern removing method and system
CN116703958B (en) * 2023-08-03 2023-11-17 山东仕达思医疗科技有限公司 Edge contour detection method, system, equipment and storage medium for microscopic image
CN116740653A (en) * 2023-08-14 2023-09-12 山东创亿智慧信息科技发展有限责任公司 Distribution box running state monitoring method and system
CN117173186B (en) * 2023-11-03 2024-03-05 南通苏禾车灯配件有限公司 Train door gap parameter detection method and system
CN117232791B (en) * 2023-11-07 2024-02-09 智翼博智能科技(苏州)有限公司 Intelligent detection method for surface flaws and defects of optical film

Citations (2)

Publication number Priority date Publication date Assignee Title
CN101860667A (en) * 2010-05-06 2010-10-13 中国科学院西安光学精密机械研究所 Method of quickly eliminating composite noise in images
CN103440620A (en) * 2013-06-17 2013-12-11 中国航天科工集团第三研究院第八三五八研究所 Method for automatically recognizing and restoring degraded images

Family Cites Families (6)

Publication number Priority date Publication date Assignee Title
US8510283B2 (en) * 2006-07-31 2013-08-13 Ricoh Co., Ltd. Automatic adaption of an image recognition system to image capture devices
CN103530885B (en) * 2013-10-23 2015-10-07 北京倍肯恒业科技发展有限责任公司 A kind of one dimensional image adaptive layered rim detection extracting method
CN106716450B (en) * 2014-05-06 2020-05-19 河谷控股Ip有限责任公司 Image-based feature detection using edge vectors
KR101998593B1 (en) * 2017-11-27 2019-07-10 주식회사 디알엠인사이드 System and method for identifying online comics based on region of interest
CN109409190A (en) * 2018-08-21 2019-03-01 南京理工大学 Pedestrian detection method based on histogram of gradients and Canny edge detector
CN110009653A (en) * 2019-03-12 2019-07-12 江苏理工学院 Increase limb recognition point sharp picture based on gray level threshold segmentation method and knows method for distinguishing

Also Published As

Publication number Publication date
CN113989313A (en) 2022-01-28

Similar Documents

Publication Publication Date Title
CN113989313B (en) Edge detection method and system based on image multidimensional analysis
CN108805023B (en) Image detection method, device, computer equipment and storage medium
CN109903282B (en) Cell counting method, system, device and storage medium
CN111080661A (en) Image-based line detection method and device and electronic equipment
CN112561940B (en) Dense multi-target parameter extraction method and device and terminal equipment
CN112037185B (en) Chromosome splitting phase image screening method and device and terminal equipment
CN116137036A (en) Gene detection data intelligent processing system based on machine learning
CN107230212B (en) Vision-based mobile phone size measuring method and system
CN117094975A (en) Method and device for detecting surface defects of steel and electronic equipment
CN113344907B (en) Image detection method and device
CN112686896B (en) Glass defect detection method based on frequency domain and space combination of segmentation network
CN107392948B (en) Image registration method of amplitude-division real-time polarization imaging system
CN110544243A (en) Automatic detection, quantification and reliability evaluation method for small defects of CT (computed tomography) image
CN112926695A (en) Image recognition method and system based on template matching
CN113538263A (en) Motion blur removing method, medium, and device based on improved DeblurgAN model
CN115829980B (en) Image recognition method, device and equipment for fundus photo and storage medium
CN115273123B (en) Bill identification method, device and equipment and computer storage medium
CN111311610A (en) Image segmentation method and terminal equipment
CN113537253B (en) Infrared image target detection method, device, computing equipment and storage medium
Jia et al. Fractional‐integral‐operator‐based improved SVM for filtering salt‐and‐pepper noise
CN113643225A (en) Arc detection method and arc detection device
CN114842399B (en) Video detection method, training method and device for video detection model
CN115908429B (en) Method and system for detecting grinding precision of foot soaking powder
CN114881908B (en) Abnormal pixel identification method, device and equipment and computer storage medium
CN112652004B (en) Image processing method, device, equipment and medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant