CN115861354A - Image edge detection method, device, equipment and storage medium - Google Patents

Image edge detection method, device, equipment and storage medium

Info

Publication number
CN115861354A
CN115861354A (application CN202211667529.XA)
Authority
CN
China
Prior art keywords: edge, image, gradient, target, point
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211667529.XA
Other languages
Chinese (zh)
Inventor
余世杰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Skyworth New World Technology Co ltd
Original Assignee
Shenzhen Skyworth New World Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Skyworth New World Technology Co ltd filed Critical Shenzhen Skyworth New World Technology Co ltd
Priority to CN202211667529.XA priority Critical patent/CN115861354A/en
Publication of CN115861354A publication Critical patent/CN115861354A/en
Pending legal-status Critical Current

Landscapes

  • Image Analysis (AREA)

Abstract

The application discloses an image edge detection method, device, equipment and storage medium, relating to the technical field of computer vision. The method comprises the following steps: if an initial image input by a user is received, performing Gaussian filtering on the initial image to obtain a Gaussian image; acquiring gradient information of each pixel point in the Gaussian image, and determining a target edge threshold corresponding to the Gaussian image according to the gradient information; screening out edge points from the pixel points according to the target edge threshold; and connecting the edge points to obtain an edge image of the initial image. The application thereby addresses the technical problem of the currently low efficiency of image edge detection.

Description

Image edge detection method, device, equipment and storage medium
Technical Field
The present application relates to the field of computer vision technologies, and in particular, to a method, an apparatus, a device, and a storage medium for detecting an image edge.
Background
Edges are the most basic visual feature in an image and one of the most important subjects of image processing. Edge detection arose in order to acquire important structural attributes of an image more accurately, and with the rapid development of artificial intelligence it is now widely applied in many fields with 2D image processing requirements, such as pedestrian tracking, face recognition and driving assistance.
At present, many effective edge detection algorithms exist, such as the Canny operator. Canny's advantage is that weak edges can be obtained while the generation of false edges is suppressed: on the basis of a pure gradient operator, it introduces a single-pixel edge expansion strategy with good noise resistance and high positioning accuracy, and on the basis of first-order differentiation it adds non-maximum suppression and double-threshold detection. Alongside these advantages, however, it has a non-negligible disadvantage: it depends on parameters supplied in advance, such as the image position, the upper and lower threshold limits, and the size of the image template. This results in weak reusability across different scenes, and the input of these parameters reduces the efficiency of edge detection to a certain extent.
Disclosure of Invention
The present application mainly aims to provide an image edge detection method, apparatus, device and storage medium, and aims to solve the technical problem of low image edge detection efficiency at present.
In order to achieve the above object, the present application provides an image edge detection method, including:
if an initial image input by a user is received, carrying out Gaussian filtering on the initial image to obtain a Gaussian image;
acquiring gradient information of each pixel point in the Gaussian image, and determining a target edge threshold corresponding to the Gaussian image according to each gradient information;
screening out each edge point from each pixel point according to the target edge threshold value;
and connecting the edge points to obtain an edge image of the initial image.
Optionally, the gradient information includes a target gradient value and a gradient direction;
the step of obtaining the gradient information of each pixel point in the Gaussian image comprises the following steps:
acquiring the sum of the absolute values of the vertical gradient value and the horizontal gradient value of each pixel point, and taking this sum of absolute gradient values as the target gradient value;
and judging the gradient direction of each pixel point by comparing the vertical gradient value with the horizontal gradient value.
Optionally, the step of determining a target edge threshold corresponding to the gaussian image according to each piece of gradient information includes:
acquiring the width and the height of the initial image, and calculating the ratio of the product of the width and the height to each target gradient value;
and aggregating the ratios based on the preset threshold weight to obtain the target edge threshold.
Optionally, the step of screening out each edge point from each pixel point according to the target edge threshold includes:
calculating a target gradient value difference between each pixel point and an adjacent pixel point based on each gradient direction;
and if the difference of the target gradient values is detected to be larger than or equal to the target edge threshold, taking each pixel point as the edge point.
Optionally, the gradient directions include a horizontal direction and a vertical direction, and the step of calculating a target gradient value difference between each pixel point and an adjacent pixel point based on each gradient direction includes:
if the gradient direction is horizontal, calculating a target gradient value difference between each pixel point and a horizontally adjacent pixel point;
and if the gradient direction is vertical, calculating the target gradient value difference between each pixel point and the vertically adjacent pixel point.
Optionally, the step of connecting the edge points to obtain an edge image of the initial image includes:
detecting the growth direction of each edge point according to the gradient direction corresponding to each edge point, wherein the growth direction refers to the direction in which each edge point extends;
and traversing each target edge point meeting a preset termination condition based on each growth direction, and connecting each edge point and the corresponding target edge point to obtain the edge image.
Optionally, the step of detecting the growth direction of each edge point according to the gradient direction corresponding to each edge point includes:
if the gradient direction is vertical, judging that the growth direction of each edge point is upward or downward growth;
and if the gradient direction is transverse, judging that the growth direction of each edge point grows leftwards or rightwards.
Further, to achieve the above object, the present application also provides an image edge detecting device including:
the Gaussian filtering module is used for carrying out Gaussian filtering on the initial image to obtain a Gaussian image if the initial image input by a user is received;
the gradient calculation and edge threshold determination module is used for acquiring gradient information of each pixel point in the Gaussian image and determining a target edge threshold corresponding to the Gaussian image according to each gradient information;
the edge point screening module is used for screening each edge point from each pixel point according to the target edge threshold value;
and the edge image obtaining module is used for connecting the edge points to obtain an edge image of the initial image.
Optionally, the gradient calculating and determining edge threshold module is further configured to:
acquiring the sum of the absolute values of the vertical gradient value and the horizontal gradient value of each pixel point, and taking this sum of absolute gradient values as the target gradient value;
and judging the gradient direction of each pixel point by comparing the vertical gradient value with the horizontal gradient value.
Optionally, the gradient calculating and determining edge threshold module is further configured to:
acquiring the width and the height of the initial image, and calculating the ratio of the product of the width and the height to each target gradient value;
and aggregating the ratios based on the preset threshold weight to obtain the target edge threshold.
Optionally, the edge point filtering module is further configured to:
calculating a target gradient value difference between each pixel point and an adjacent pixel point based on each gradient direction;
and if the difference of the target gradient values is detected to be larger than or equal to the target edge threshold, taking each pixel point as the edge point.
Optionally, the edge point filtering module is further configured to:
if the gradient direction is horizontal, calculating a target gradient value difference between each pixel point and a horizontally adjacent pixel point;
and if the gradient direction is vertical, calculating the target gradient value difference between each pixel point and the vertically adjacent pixel point.
Optionally, the edge image obtaining module is further configured to:
detecting the growth direction of each edge point according to the gradient direction corresponding to each edge point, wherein the growth direction refers to the direction in which each edge point extends;
and traversing each target edge point meeting a preset termination condition based on each growth direction, and connecting each edge point and the corresponding target edge point to obtain the edge image.
Optionally, the edge image obtaining module is further configured to:
if each gradient direction is vertical, determining that the growth direction of each edge point is upward or downward growth;
and if the gradient directions are transverse, judging that the growth direction of the edge points is leftward or rightward.
The present application further provides an image edge detection apparatus, which includes: a memory, a processor, and an image edge detection program stored on the memory and executable on the processor, wherein the image edge detection program, when executed by the processor, implements the steps of the image edge detection method described above.
The present application further provides a readable storage medium, on which an image edge detection program is stored, which when executed by a processor implements the steps of the image edge detection method as described above.
The present application also provides a computer program product comprising a computer program which, when executed by a processor, performs the steps of the image edge detection method as described above.
Compared with existing edge detection methods that require parameters to be input, the method proceeds as follows: if an initial image input by a user is received, Gaussian filtering is performed on it to suppress image noise, yielding a smooth Gaussian image; gradient information of each pixel point in the Gaussian image is acquired, and a target edge threshold corresponding to the Gaussian image is determined according to the gradient information; edge points are screened out from the pixel points according to the target edge threshold; and the edge points are connected to obtain an edge image of the initial image. In this application, the pixel values of the pixel points are input into a preset gradient function to obtain the gradient information, the corresponding target edge threshold is then calculated from that information, edge points are screened out against the threshold, and the edge image is obtained by connecting the edge points. The purpose of obtaining the edge image is thus achieved merely by inputting the initial image: the computation of gradients and edge points proceeds continuously, no parameter needs to be input in advance, and no manual intervention is required. This overcomes the technical defects of existing edge detection algorithms, which need parameters input in advance and therefore suffer poor reusability and low detection efficiency, thereby improving the efficiency of image edge detection.
Drawings
FIG. 1 is a schematic flowchart of a first embodiment of an image edge detection method according to the present application;
FIG. 2 is an initial image involved in the image edge detection method of the present application;
FIG. 3 is a Gaussian image involved in the image edge detection method of the present application;
FIG. 4 is an edge point image involved in the image edge detection method of the present application;
FIG. 5 is an edge image involved in the image edge detection method of the present application;
FIG. 6 is a schematic diagram of an image edge detection apparatus according to the present application;
fig. 7 is a schematic structural diagram of a hardware operating environment involved in the image edge detection method according to the present application.
The implementation, functional features and advantages of the objectives of the present application will be further explained with reference to the accompanying drawings.
Detailed Description
In order to make the aforementioned objects, features and advantages of the present application more comprehensible, embodiments of the present application are described in detail below with reference to the accompanying drawings. It is to be understood that the embodiments described are only a few embodiments of the present application and not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
Example one
At present, many effective edge detection algorithms exist, such as the Canny operator. Canny's advantage is that weak edges can be obtained while the generation of false edges is suppressed: on the basis of a simple gradient operator, it introduces a single-pixel edge expansion strategy with good noise resistance and high positioning accuracy, and on the basis of first-order differentiation it adds non-maximum suppression and dual-threshold detection. Alongside these advantages, however, it has a non-negligible disadvantage: it depends on parameters supplied in advance, such as the image position, the upper and lower threshold limits, and the size of the image template. This results in weak reusability across different scenes, and the input of these parameters reduces the efficiency of edge detection to a certain extent.
In a first embodiment of the image edge detection method of the present application, referring to fig. 1, the image edge detection method includes:
step S10, if an initial image input by a user is received, gaussian filtering is carried out on the initial image to obtain a Gaussian image;
step S20, obtaining gradient information of each pixel point in the Gaussian image, and determining a target edge threshold corresponding to the Gaussian image according to each gradient information;
s30, screening out each edge point from each pixel point according to the target edge threshold;
and S40, connecting the edge points to obtain an edge image of the initial image.
In this embodiment, it should be noted that the initial image is an image for which edge detection is required, and may be a color image or a grayscale image; the Gaussian image is an image obtained after Gaussian filtering; the gradient information may include the target gradient value and the gradient direction, where the target gradient value is a sum of absolute gradients of horizontal gradient values and vertical gradient values of each pixel point, and is used to represent gradient values of each pixel point on the whole image; and the target edge threshold is used for screening each edge point from each pixel point.
As an example, steps S10 to S40 include the following. If an initial image input by a user is received, Gaussian filtering is performed on it by any one of direct convolution, repeated convolution, an FFT (fast Fourier transform) implementation, a recursive implementation, or the like, to obtain a Gaussian image containing at least one pixel point; the Gaussian kernel in this embodiment may be set to [5,5], and may also be set according to the actual detection conditions. The pixel values of the pixel points are acquired and input into a preset gradient function to calculate the target gradient value and gradient direction of each pixel point. The preset gradient function may comprise a preset gradient value function and a preset gradient direction function: the former calculates the horizontal and vertical gradient values of each pixel point and obtains the target gradient value by summing the absolute values of the gradients in the two directions, while the latter judges the corresponding gradient direction from the horizontal and vertical gradient values. The width and height of the initial image are acquired, their product is calculated, the ratio between the product and the target gradient value of each pixel point is calculated, and finally the ratios are aggregated based on the preset threshold weight to obtain the target edge threshold; the preset threshold weight is an empirical value and can be set according to the actual detection conditions. Qualified edge points are then screened out from the pixel points according to the target edge threshold: the target gradient value difference between each pixel point and its two adjacent pixel points is calculated according to the gradient direction, and each edge point is detected by comparing each target gradient value difference with the target edge threshold. Finally, the growth direction of each edge point is determined from its gradient direction, the edge points are traversed along each growth direction until a target edge point meeting a preset termination condition is detected, growth stops, and each edge point is connected with its corresponding target edge point to obtain the complete edge image.
For example, taking fig. 2 as the initial image A, a smooth Gaussian image B, that is, fig. 3, can be obtained after Gaussian filtering is performed on A. The gradient value and gradient direction of each pixel point in B are calculated, from which the target edge threshold is computed; the edge points screened out with this threshold form the edge point image, fig. 4. Connecting each edge point in fig. 4 according to its growth direction finally yields the complete edge image, fig. 5.
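The Gaussian filtering step can be sketched as follows in Python with NumPy. This is a minimal direct-convolution sketch assuming the [5, 5] kernel mentioned in this embodiment; the function names, the sigma value and the replicate-edge border handling are illustrative assumptions, not taken from the patent:

```python
import numpy as np

def gaussian_kernel(size=5, sigma=1.0):
    """Build a normalized 2-D Gaussian kernel (the embodiment mentions a [5, 5] kernel)."""
    ax = np.arange(size) - size // 2
    g = np.exp(-ax**2 / (2 * sigma**2))
    k = np.outer(g, g)
    return k / k.sum()

def gaussian_filter(img, size=5, sigma=1.0):
    """Direct-convolution Gaussian filtering with replicate-edge padding (an assumption)."""
    k = gaussian_kernel(size, sigma)
    pad = size // 2
    padded = np.pad(img.astype(float), pad, mode="edge")
    out = np.zeros(img.shape, dtype=float)
    h, w = img.shape
    # Accumulate the weighted, shifted copies of the padded image.
    for di in range(size):
        for dj in range(size):
            out += k[di, dj] * padded[di:di + h, dj:dj + w]
    return out
```

Any of the other implementations the embodiment lists (repeated convolution, FFT, recursive filtering) would produce an equivalent smooth Gaussian image.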
The step of obtaining the gradient information of each pixel point in the gaussian image comprises:
step S21, obtaining a sum of absolute gradient values between the vertical gradient value and the horizontal gradient value of each pixel point, and taking the sum of absolute gradient values as the target gradient value;
and S22, judging the gradient direction of each pixel point by comparing the vertical gradient value with the horizontal gradient value.
In this embodiment, it should be noted that the usual way of computing the gradient direction at present is to take the arc tangent of the vertical gradient value over the horizontal gradient value, characterizing the direction by an angle. In general, however, an edge point has eight neighbors: the positions above, below, left of, right of, upper left, upper right, lower left and lower right of the corresponding position, i.e. the directly adjacent and diagonally adjacent positions, for a total of 8 directions.
As one example, steps S21 to S22 include the following. The pixel values of the pixel points at their coordinate points are acquired and input into the preset gradient function, which calculates each pixel point's vertical gradient value, horizontal gradient value, and the sum of their absolute values; this sum is taken as the target gradient value. The absolute values of the horizontal and vertical gradient values are then input into the preset gradient direction function and compared: if the absolute value of the horizontal gradient value is greater than that of the vertical gradient value, the gradient direction of the corresponding pixel point is judged to be 0, i.e. horizontal; if it is less than or equal to it, the gradient direction is judged to be 1, i.e. vertical.
In one possible implementation, the preset gradient magnitude function is as follows:
G_x = f(i, j) − f(i+1, j+1)
G_y = f(i, j+1) − f(i+1, j)
G = |G_x| + |G_y|
where f(i, j) is the pixel value at the coordinate point (i, j) in the image, G_x denotes the horizontal gradient value, G_y denotes the vertical gradient value, and G denotes the target gradient value.
The preset gradient direction function is as follows:
ang = 0, if |G_x| > |G_y|
ang = 1, if |G_x| ≤ |G_y|
where 0 denotes that the gradient direction is the horizontal direction, and 1 denotes that the gradient direction is the vertical direction.
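A minimal NumPy sketch of the preset gradient functions above; the zero-filling of the last row and column, which lack a forward neighbour, is an assumption, and the function name is illustrative:

```python
import numpy as np

def gradient_info(img):
    """Compute the target gradient value G = |Gx| + |Gy| and the gradient
    direction ang per the preset gradient functions:
      Gx = f(i, j) - f(i+1, j+1),  Gy = f(i, j+1) - f(i+1, j)
      ang = 0 (horizontal) if |Gx| > |Gy|, else 1 (vertical).
    The first array axis is treated as i and the second as j."""
    f = img.astype(float)
    gx = np.zeros_like(f)
    gy = np.zeros_like(f)
    gx[:-1, :-1] = f[:-1, :-1] - f[1:, 1:]   # f(i,j) - f(i+1,j+1)
    gy[:-1, :-1] = f[:-1, 1:] - f[1:, :-1]   # f(i,j+1) - f(i+1,j)
    g = np.abs(gx) + np.abs(gy)
    ang = np.where(np.abs(gx) > np.abs(gy), 0, 1)
    return g, ang
```

The diagonal differences make this essentially a Roberts-cross-style operator, with the |Gx| + |Gy| sum standing in for the usual Euclidean magnitude.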
Wherein, the step of determining the target edge threshold corresponding to the gaussian image according to each piece of gradient information comprises:
step S23, obtaining the width and the height of the initial image, and calculating the ratio of the product of the width and the height to each target gradient value;
and S24, aggregating all the ratios based on preset threshold weight to obtain the target edge threshold.
As one example, steps S23 to S24 include the following. When the user inputs the initial image, its width and height can be obtained; the product of the width and the height is calculated, together with the ratio involving each target gradient value. The ratios are input into a preset edge function and weighted by the preset threshold weight so as to aggregate them into the target edge threshold, where the aggregation averages over all pixel points; in effect, the target edge threshold is the preset threshold weight times the mean target gradient value of the image. The preset edge function determines the target edge threshold used to select edge points later. Because the function involves only quantities of the initial image itself and needs no additionally input parameters, it is self-adaptive, which improves the simplicity of image edge detection.
In one possible implementation, the preset edge function is as follows:
T = (k / (H × W)) × Σ_{i=1}^{H} Σ_{j=1}^{W} G(i, j)
wherein k represents the preset threshold weight, H represents the height of the initial image, W represents the width of the initial image, and G (i, j) represents the target gradient value of the pixel point of the coordinate (i, j).
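Under the reading that the aggregation amounts to the preset threshold weight k times the mean target gradient value of the H × W image, the preset edge function could be sketched as below. This reading and the function name are assumptions, since the exact aggregation is not spelled out:

```python
import numpy as np

def target_edge_threshold(g, k=1.0):
    """Adaptive target edge threshold: T = k / (H*W) * sum of G(i, j).
    g is the array of target gradient values; k is the preset threshold
    weight (an empirical value)."""
    h, w = g.shape
    return k * g.sum() / (h * w)
```

Because T depends only on the image itself, no threshold needs to be supplied by the user, which is the self-adaptivity the embodiment claims.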
The step of screening out each edge point from each pixel point according to the target edge threshold value comprises the following steps:
step S31, calculating a target gradient value difference between each pixel point and an adjacent pixel point based on each gradient direction;
step S32, if it is detected that the difference between the target gradient values is greater than or equal to the target edge threshold, taking each pixel point as the edge point.
As an example, steps S31 to S32 include: if the gradient direction is horizontal, calculating a target gradient value difference between each pixel point and a horizontally adjacent pixel point, and if the gradient direction is vertical, calculating a target gradient value difference between each pixel point and a vertically adjacent pixel point; inputting each target gradient value difference into a preset edge point judgment function to detect whether each target gradient value difference is larger than or equal to the target edge threshold value, if so, judging a corresponding pixel point as the edge point, and if not, judging that the corresponding pixel point is not the edge point.
In an implementation manner, the predetermined edge point determination function is as follows:
if ang = 1, G(i, j) − G(i, j−1) ≥ T and G(i, j) − G(i, j+1) ≥ T, then the pixel point at coordinate (i, j) is an edge point;
if ang = 0, G(i, j) − G(i−1, j) ≥ T and G(i, j) − G(i+1, j) ≥ T, then the pixel point at coordinate (i, j) is an edge point.
where ang denotes the gradient direction and T denotes the target edge threshold.
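The screening rule can be sketched in NumPy as follows. Indexing keeps the first array axis as i and the second as j, so that ang = 1 compares the neighbours at (i, j−1) and (i, j+1); skipping border pixels, which lack one neighbour, is an assumption:

```python
import numpy as np

def screen_edge_points(g, ang, t):
    """Mark a pixel as an edge point when its target gradient value exceeds
    both neighbours along its gradient direction by at least the threshold T:
      ang == 1 (vertical):   G(i,j)-G(i,j-1) >= T and G(i,j)-G(i,j+1) >= T
      ang == 0 (horizontal): G(i,j)-G(i-1,j) >= T and G(i,j)-G(i+1,j) >= T"""
    h, w = g.shape
    edges = np.zeros((h, w), dtype=bool)
    c = g[1:-1, 1:-1]  # interior pixels
    vert = (ang[1:-1, 1:-1] == 1) & (c - g[1:-1, :-2] >= t) & (c - g[1:-1, 2:] >= t)
    horz = (ang[1:-1, 1:-1] == 0) & (c - g[:-2, 1:-1] >= t) & (c - g[2:, 1:-1] >= t)
    edges[1:-1, 1:-1] = vert | horz
    return edges
```

The rule plays the role of non-maximum suppression: only pixels that dominate both neighbours along the gradient direction survive.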
Wherein the step of calculating a target gradient value difference between each of the pixel points and an adjacent pixel point based on each of the gradient directions includes:
step S311, if the gradient direction is horizontal, calculating a target gradient value difference between each pixel point and a horizontally adjacent pixel point;
in step S312, if the gradient direction is vertical, a target gradient value difference between each pixel point and a vertically adjacent pixel point is calculated.
As an example, steps S311 to S312 include: if the gradient direction is horizontal, calculating a target gradient value difference between each pixel point and two adjacent pixel points in the horizontal direction; if the gradient direction is vertical, calculating a target gradient value difference between each pixel point and two adjacent pixel points in the vertical direction, detecting whether each target gradient value difference is larger than or equal to the target edge threshold value, and if so, determining each pixel point as the edge point.
For example, suppose the gradient direction of pixel point A(i, j) is 1, i.e. vertical. The target gradient value differences between A and its two vertically adjacent pixel points B(i, j+1) and C(i, j−1) are calculated: G_y1 = G(i, j) − G(i, j+1) and G_y2 = G(i, j) − G(i, j−1). Assuming the target edge threshold is T, it is detected whether both G_y1 and G_y2 are greater than or equal to T; if so, A is judged to be an edge point. Similarly, supposing the gradient direction of A is 0, i.e. horizontal, the target gradient value differences between A and its two horizontally adjacent pixel points M(i+1, j) and N(i−1, j) are calculated: G_x1 = G(i, j) − G(i−1, j) and G_x2 = G(i, j) − G(i+1, j); if both are greater than or equal to T, A is judged to be an edge point.
Wherein the step of connecting each of the edge points to obtain an edge image of the initial image comprises:
step S41, detecting the growth direction of each edge point according to the gradient direction corresponding to each edge point, wherein the growth direction refers to the direction in which each edge point extends;
and step S42, traversing each target edge point meeting a preset termination condition based on each growth direction, and connecting each edge point and the corresponding target edge point to obtain the edge image.
In this embodiment, it should be noted that the preset termination condition is the condition under which an edge point stops growing; it may be set to G(i, j) = 0, i.e. no gradient information exists at the position, or the position lies on an edge already generated from another seed point.
As an example, steps S41 to S42 include the following. The growth direction of each edge point is detected from its corresponding gradient direction: when the gradient direction is horizontal, the corresponding growth directions may include leftward and rightward growth, and when the gradient direction is vertical, they include upward and downward growth. Edge points are then traversed along each growth direction until a target edge point meeting a preset termination condition is detected, at which point growth stops and the edge point is connected with its corresponding target edge point. Taking each target edge point as a new base point, traversal continues according to that point's gradient direction and growth direction, detecting and connecting the next target edge point, until finally the complete edge image is obtained.
Wherein the step of detecting the growth direction of each edge point according to the gradient direction corresponding to each edge point comprises:
step S411, if each gradient direction is vertical, determining that the growth direction of each edge point is upward or downward growth;
in step S412, if each gradient direction is horizontal, it is determined that the growth direction of each edge point is left or right.
As an example, steps S411 to S412 include: if each gradient direction is vertical, determining that the growth direction of each edge point is upward or downward growth; and if the gradient directions are transverse, judging that the growth direction of each edge point is leftward or rightward growth, wherein whether the edge points grow leftward and upward, leftward and downward, rightward and downward or rightward and upward can be further judged according to the target gradient value of each edge point.
For example, suppose edge point D(i, j) has target gradient value G(i, j) and grows leftward. The target gradient values of E(i−1, j−1), F(i−1, j) and H(i−1, j+1), namely G(E), G(F) and G(H), are calculated. It is detected whether G(E) is greater than both G(F) and G(H); if so, D is judged to grow toward the lower left. Similarly, if G(H) is greater than both G(F) and G(E), D is judged to grow toward the upper left. If neither condition holds, D is judged to grow straight left. Growth toward the lower right and the upper right can be detected in the same way.
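One illustrative reading of the growth-and-connection procedure, not the patent's exact algorithm: from each screened edge point, step repeatedly along the growth direction, pick among the three candidate neighbours the one with the largest target gradient value, and stop when G = 0 or an already-marked pixel is reached (the preset termination condition). The first array axis is i and the second is j, matching the examples above:

```python
import numpy as np

def grow_edges(g, ang, edges):
    """Connect screened edge points by region growing.
    ang == 0 (horizontal gradient) grows left/right along i;
    ang == 1 (vertical gradient) grows up/down along j."""
    h, w = g.shape
    out = edges.copy()
    for i0, j0 in zip(*np.nonzero(edges)):
        horizontal = ang[i0, j0] == 0
        for d in (-1, 1):  # both growth directions
            i, j = i0, j0
            while True:
                if horizontal:
                    cands = [(i + d, j + dj) for dj in (-1, 0, 1)]
                else:
                    cands = [(i + di, j + d) for di in (-1, 0, 1)]
                cands = [(a, b) for a, b in cands if 0 <= a < h and 0 <= b < w]
                if not cands:
                    break
                # the neighbour with the largest target gradient value wins
                ni, nj = max(cands, key=lambda p: g[p])
                if g[ni, nj] == 0 or out[ni, nj]:  # preset termination condition
                    break
                out[ni, nj] = True
                i, j = ni, nj
    return out
```

Each screened edge point acts as a seed; growth terminates either on zero gradient or on an edge already claimed by another seed, so the loop is bounded by the number of pixels.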
Compared with current edge detection methods that require parameters to be input, the image edge detection method, device, equipment and storage medium of this embodiment proceed as follows: if an initial image input by a user is received, Gaussian filtering is performed on it to suppress image noise and obtain a smooth Gaussian image; the gradient information of each pixel point in the Gaussian image is acquired, and the target edge threshold corresponding to the Gaussian image is determined from that information; edge points are screened out from the pixel points according to the target edge threshold; and the edge points are connected to obtain the edge image of the initial image. In this embodiment, the pixel values of the pixel points are input directly into the preset gradient function to obtain the gradient information, the corresponding target edge threshold is calculated from it, edge points are screened out against that threshold, and the edge image is obtained by connecting the edge points, so that the whole detection requires only the initial image as input and thereby improves the efficiency of image edge detection.
In addition, an embodiment of the present application further provides an image edge detection apparatus. As shown in fig. 6, the image edge detection apparatus includes:
the gaussian filtering module 10 is configured to, if an initial image input by a user is received, perform gaussian filtering on the initial image to obtain a gaussian image;
a gradient calculation and edge threshold determination module 20, configured to obtain gradient information of each pixel point in the gaussian image, and determine a target edge threshold corresponding to the gaussian image according to each gradient information;
the edge point screening module 30 is configured to screen out each edge point from each pixel point according to the target edge threshold;
and an edge image obtaining module 40, configured to connect the edge points to obtain an edge image of the initial image.
Optionally, the gradient calculating and determining edge threshold module 20 is further configured to:
acquiring the sum of the absolute values of the vertical gradient value and the horizontal gradient value of each pixel point, and taking this sum of absolute gradient values as the target gradient value;
and judging the gradient direction of each pixel point by comparing the vertical gradient value with the horizontal gradient value.
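A minimal sketch of this gradient-information step, assuming simple finite differences for the vertical and horizontal gradient values (the text does not specify the gradient operator, so the differencing scheme here is an assumption):

```python
import numpy as np

def gradient_info(gauss):
    """Per-pixel target gradient value and gradient direction as described:
    the target value is |Gv| + |Gh|, and the direction is judged by
    comparing the vertical and horizontal gradient components."""
    img = gauss.astype(float)
    gv = np.zeros_like(img)
    gh = np.zeros_like(img)
    gv[1:, :] = img[1:, :] - img[:-1, :]   # vertical (row) difference
    gh[:, 1:] = img[:, 1:] - img[:, :-1]   # horizontal (column) difference
    target = np.abs(gv) + np.abs(gh)       # sum of absolute gradient values
    vertical = np.abs(gv) >= np.abs(gh)    # True where the gradient is vertical
    return target, vertical
```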
Optionally, the gradient calculating and determining edge threshold module 20 is further configured to:
acquiring the width and the height of the initial image, and calculating the ratio of the product of the width and the height to each target gradient value;
and aggregating the ratios based on the preset threshold weight to obtain the target edge threshold.
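Read as a weighted mean of the target gradient values over the image area (width times height), the threshold computation might look like the sketch below; both this reading of the ratio and the `weight` value are assumptions, since the text leaves the exact formula and the preset threshold weight unspecified:

```python
import numpy as np

def target_edge_threshold(target, width, height, weight=0.25):
    """Aggregate the per-pixel target gradient values relative to the
    image area and scale by the preset threshold weight."""
    return weight * target.sum() / (width * height)
```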
Optionally, the edge point filtering module 30 is further configured to:
calculating a target gradient value difference between each pixel point and an adjacent pixel point based on each gradient direction;
and if the difference of the target gradient values is detected to be larger than or equal to the target edge threshold, taking each pixel point as the edge point.
Optionally, the edge point filtering module 30 is further configured to:
if the gradient direction is horizontal, calculating a target gradient value difference between each pixel point and a horizontally adjacent pixel point;
and if the gradient direction is vertical, calculating the target gradient value difference between each pixel point and the vertically adjacent pixel point.
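A sketch of this direction-dependent screening; which adjacent pixel to compare against (left vs. right, up vs. down) is treated here as an assumption the text leaves open:

```python
import numpy as np

def screen_edge_points(target, vertical, threshold):
    """Keep a pixel as an edge point when the difference between its
    target gradient value and that of its neighbour along the gradient
    direction reaches the target edge threshold."""
    h, w = target.shape
    edges = np.zeros((h, w), dtype=bool)
    for y in range(h):
        for x in range(w):
            if vertical[y, x]:  # vertical gradient: compare up/down
                ny, nx = (y - 1, x) if y > 0 else (y + 1, x)
            else:               # horizontal gradient: compare left/right
                ny, nx = (y, x - 1) if x > 0 else (y, x + 1)
            if abs(target[y, x] - target[ny, nx]) >= threshold:
                edges[y, x] = True
    return edges
```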
Optionally, the edge image obtaining module 40 is further configured to:
detecting the growth direction of each edge point according to the gradient direction corresponding to each edge point, wherein the growth direction refers to the direction in which each edge point extends;
and traversing each target edge point meeting a preset termination condition based on each growth direction, and connecting each edge point and the corresponding target edge point to obtain the edge image.
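The growth-and-connect step might be sketched as follows, with the termination condition (the image border or a non-edge pixel, plus a step cap) assumed for illustration:

```python
import numpy as np

# offsets (dy, dx) for the eight growth directions named in the text
OFFSETS = {
    "up": (-1, 0), "down": (1, 0), "left": (0, -1), "right": (0, 1),
    "left-up": (-1, -1), "left-down": (1, -1),
    "right-up": (-1, 1), "right-down": (1, 1),
}

def grow_chain(edges, start, direction, max_steps=1000):
    """From an edge point, step along its growth direction while the next
    pixel is still an edge point, collecting the visited points so that
    they can be connected into one edge segment."""
    dy, dx = OFFSETS[direction]
    y, x = start
    chain = [start]
    for _ in range(max_steps):
        ny, nx = y + dy, x + dx
        # terminate at the image border or at a non-edge pixel
        if not (0 <= ny < edges.shape[0] and 0 <= nx < edges.shape[1]):
            break
        if not edges[ny, nx]:
            break
        y, x = ny, nx
        chain.append((y, x))
    return chain
```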
Optionally, the edge image obtaining module 40 is further configured to:
if each gradient direction is vertical, determining that the growth direction of each edge point is upward or downward growth;
and if each gradient direction is horizontal, determining that the growth direction of each edge point is leftward or rightward growth.
The image edge detection device provided by the application adopts the image edge detection method in the embodiment to solve the technical problem that the current image edge detection efficiency is low. Compared with the prior art, the image edge detection device provided by the embodiment of the present application has the same beneficial effects as the image edge detection method provided by the above embodiment, and other technical features of the image edge detection device are the same as those disclosed in the above embodiment method, which are not repeated herein.
An embodiment of the present application provides an electronic device, and with reference to fig. 7, the electronic device includes: at least one processor; and a memory communicatively coupled to the at least one processor; the memory stores instructions executable by the at least one processor, and the instructions are executed by the at least one processor to enable the at least one processor to execute the image edge detection method in the first embodiment.
FIG. 7 illustrates a schematic diagram of an electronic device suitable for use in implementing embodiments of the present disclosure. The electronic devices in the embodiments of the present disclosure may include, but are not limited to, mobile terminals such as mobile phones, notebook computers, digital broadcast receivers, PDAs (personal digital assistants), PADs (tablet computers), PMPs (portable multimedia players), in-vehicle terminals (e.g., car navigation terminals), and the like, and fixed terminals such as digital TVs, desktop computers, and the like. The electronic device shown in fig. 7 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiments of the present disclosure.
As shown in fig. 7, the electronic device may include a processing means (e.g., a central processing unit, a graphic processor, etc.) that can perform various appropriate actions and processes according to a program stored in a Read Only Memory (ROM) or a program loaded from a storage means into a Random Access Memory (RAM). In the RAM, various programs and data necessary for the operation of the electronic apparatus are also stored. The processing device, the ROM, and the RAM are connected to each other through a bus. An input/output (I/O) interface is also connected to the bus.
Generally, the following systems may be connected to the I/O interface: input devices including, for example, touch screens, touch pads, keyboards, mice, image sensors, microphones, accelerometers, gyroscopes, and the like; output devices including, for example, liquid Crystal Displays (LCDs), speakers, vibrators, and the like; storage devices including, for example, magnetic tape, hard disk, etc.; and a communication device. The communication means may allow the electronic device to communicate wirelessly or by wire with other devices to exchange data. While the figures illustrate an electronic device with various systems, it is to be understood that not all illustrated systems are required to be implemented or provided. More or fewer systems may alternatively be implemented or provided.
In particular, according to an embodiment of the present disclosure, the processes described above with reference to the flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code for performing the method illustrated in the flow chart. In such an embodiment, the computer program may be downloaded and installed from a network via the communication means, or installed from a storage means, or installed from a ROM. The computer program, when executed by a processing device, performs the above-described functions defined in the methods of the embodiments of the present disclosure.
The electronic device provided by the application adopts the image edge detection method in the embodiment, and solves the technical problem that the image edge detection efficiency is low at present. Compared with the prior art, the beneficial effects of the electronic device provided by the embodiment of the present application are the same as the beneficial effects of the image edge detection method provided by the above embodiment, and other technical features of the electronic device are the same as those disclosed in the above embodiment method, which are not repeated herein.
It should be understood that portions of the present disclosure may be implemented in hardware, software, firmware, or a combination thereof. In the foregoing description of embodiments, the particular features, structures, materials, or characteristics may be combined in any suitable manner in any one or more embodiments or examples.
The above description covers only specific embodiments of the present application, but the scope of protection of the present application is not limited thereto. Any change or substitution that a person skilled in the art can readily conceive within the technical scope disclosed by the present application shall be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.
The present embodiment provides a computer-readable storage medium having computer-readable program instructions stored thereon for performing the image edge detection method in the first embodiment.
The computer readable storage medium provided by the embodiments of the present application may be, for example, a USB flash disk, but is not limited thereto; it may be any electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system or device, or any combination of the above. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present embodiment, a computer readable storage medium may be any tangible medium that can contain or store a program for use by or in connection with an instruction execution system or device. Program code embodied on a computer readable storage medium may be transmitted using any appropriate medium, including but not limited to: electrical wires, optical cables, RF (radio frequency), etc., or any suitable combination of the foregoing.
The computer-readable storage medium may be embodied in an electronic device; or may be separate and not incorporated into the electronic device.
The computer readable storage medium carries one or more programs which, when executed by the electronic device, cause the electronic device to: if an initial image input by a user is received, carrying out Gaussian filtering on the initial image to obtain a Gaussian image; acquiring gradient information of each pixel point in the Gaussian image, and determining a target edge threshold corresponding to the Gaussian image according to each gradient information; screening out each edge point from each pixel point according to the target edge threshold value; and connecting the edge points to obtain an edge image of the initial image.
Computer program code for carrying out operations of the present disclosure may be written in any combination of one or more programming languages, including object oriented programming languages such as Java, Smalltalk and C++, as well as conventional procedural programming languages such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems that perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The modules described in the embodiments of the present disclosure may be implemented by software or hardware, and the name of a module does not in some cases constitute a limitation on the module itself.
The computer-readable storage medium provided by the application stores computer-readable program instructions for executing the image edge detection method, and solves the technical problem that the image edge detection efficiency is low at present. Compared with the prior art, the beneficial effects of the computer-readable storage medium provided by the embodiment of the present application are the same as the beneficial effects of the image edge detection method provided by the above embodiment, and are not described herein again.
The present application also provides a computer program product comprising a computer program which, when executed by a processor, performs the steps of the image edge detection method as described above.
The computer program product provided by the application solves the technical problem that the image edge detection efficiency is low at present. Compared with the prior art, the beneficial effects of the computer program product provided by the embodiment of the present application are the same as the beneficial effects of the image edge detection method provided by the above embodiment, and are not described herein again.
The above description is only a preferred embodiment of the present application and is not intended to limit its scope; any equivalent structure or equivalent process derived from the contents of the specification and drawings of the present application, whether applied directly or indirectly in other related technical fields, is likewise included within the protection scope of the present application.

Claims (10)

1. An image edge detection method, characterized in that the image edge detection method comprises:
if an initial image input by a user is received, carrying out Gaussian filtering on the initial image to obtain a Gaussian image;
acquiring gradient information of each pixel point in the Gaussian image, and determining a target edge threshold corresponding to the Gaussian image according to each gradient information;
screening out each edge point from each pixel point according to the target edge threshold value;
and connecting the edge points to obtain an edge image of the initial image.
2. The image edge detection method of claim 1, wherein the gradient information includes a target gradient value and a gradient direction;
the step of obtaining the gradient information of each pixel point in the Gaussian image comprises the following steps:
acquiring the sum of the absolute values of the vertical gradient value and the horizontal gradient value of each pixel point, and taking this sum of absolute gradient values as the target gradient value;
and judging the gradient direction of each pixel point by comparing the vertical gradient value with the horizontal gradient value.
3. The image edge detection method of claim 2, wherein the step of determining the target edge threshold corresponding to the gaussian image according to each piece of gradient information comprises:
acquiring the width and the height of the initial image, and calculating the ratio of the product of the width and the height to each target gradient value;
and aggregating all the ratios based on the weight of a preset threshold to obtain the target edge threshold.
4. The image edge detection method of claim 3, wherein the step of screening each of the pixel points for an edge point according to the target edge threshold comprises:
calculating a target gradient value difference between each pixel point and an adjacent pixel point based on each gradient direction;
and if the difference of the target gradient values is detected to be larger than or equal to the target edge threshold, taking each pixel point as the edge point.
5. The image edge detection method of claim 4, wherein the gradient directions include a horizontal direction and a vertical direction, and the step of calculating the target gradient value difference between each pixel point and an adjacent pixel point based on each gradient direction comprises:
if the gradient direction is horizontal, calculating a target gradient value difference between each pixel point and a horizontally adjacent pixel point;
and if the gradient direction is vertical, calculating the target gradient value difference between each pixel point and the vertically adjacent pixel point.
6. The image edge detection method of claim 2, wherein the step of connecting each of the edge points to obtain the edge image of the initial image comprises:
detecting the growth direction of each edge point according to the gradient direction corresponding to each edge point, wherein the growth direction refers to the direction in which each edge point extends;
and traversing each target edge point meeting a preset termination condition based on each growth direction, and connecting each edge point and the corresponding target edge point to obtain the edge image.
7. The image edge detection method of claim 6, wherein the step of detecting the growth direction of each edge point according to the gradient direction corresponding to each edge point comprises:
if the gradient direction is vertical, judging that the growth direction of each edge point is upward or downward growth;
and if the gradient direction is horizontal, judging that the growth direction of each edge point is leftward or rightward growth.
8. An image edge detection apparatus, characterized in that the image edge detection apparatus comprises:
the Gaussian filtering module is used for carrying out Gaussian filtering on the initial image to obtain a Gaussian image if the initial image input by a user is received;
the gradient calculation and edge threshold determination module is used for acquiring gradient information of each pixel point in the Gaussian image and determining a target edge threshold corresponding to the Gaussian image according to each gradient information;
the edge point screening module is used for screening each edge point from each pixel point according to the target edge threshold value;
and the edge image obtaining module is used for connecting the edge points to obtain an edge image of the initial image.
9. An electronic device, characterized in that the electronic device comprises:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the steps of the image edge detection method of any one of claims 1 to 7.
10. A readable storage medium, characterized in that the readable storage medium has stored thereon a program for implementing an image edge detection method, the program being executed by a processor to implement the steps of the image edge detection method according to any one of claims 1 to 7.
CN202211667529.XA 2022-12-23 2022-12-23 Image edge detection method, device, equipment and storage medium Pending CN115861354A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211667529.XA CN115861354A (en) 2022-12-23 2022-12-23 Image edge detection method, device, equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211667529.XA CN115861354A (en) 2022-12-23 2022-12-23 Image edge detection method, device, equipment and storage medium

Publications (1)

Publication Number Publication Date
CN115861354A true CN115861354A (en) 2023-03-28

Family

ID=85654409

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211667529.XA Pending CN115861354A (en) 2022-12-23 2022-12-23 Image edge detection method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN115861354A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116385472A (en) * 2023-06-07 2023-07-04 深圳市锦红兴科技有限公司 Hardware stamping part deburring effect evaluation method
CN116385472B (en) * 2023-06-07 2023-08-08 深圳市锦红兴科技有限公司 Hardware stamping part deburring effect evaluation method

Similar Documents

Publication Publication Date Title
CN110276346B (en) Target area recognition model training method, device and computer readable storage medium
JPWO2018051459A1 (en) Object detection apparatus and object detection method
CN110070551B (en) Video image rendering method and device and electronic equipment
CN110781823B (en) Screen recording detection method and device, readable medium and electronic equipment
CN111222509B (en) Target detection method and device and electronic equipment
CN112037223B (en) Image defect detection method and device and electronic equipment
WO2019128495A1 (en) Method and apparatus for detecting image resolution, storage medium, and electronic device
CN109300139B (en) Lane line detection method and device
CN112306301A (en) Touch data processing method, device, equipment and storage medium
CN115861354A (en) Image edge detection method, device, equipment and storage medium
CN110827301B (en) Method and apparatus for processing image
CN111191556A (en) Face recognition method and device and electronic equipment
CN113129366B (en) Monocular SLAM initialization method and device and electronic equipment
CN114972113A (en) Image processing method and device, electronic equipment and readable storage medium
CN114419322B (en) Image instance segmentation method and device, electronic equipment and storage medium
US20110158516A1 (en) Image classification methods and systems
CN112337675B (en) Spraying control method and device for spraying robot and electronic equipment
CN110349109B (en) Fisheye distortion correction method and system and electronic equipment thereof
CN112926539A (en) Image processing method and device and electronic equipment
CN104360854A (en) Information processing method and electronic equipment
CN114724528B (en) Display control method and device of display device, electronic device and storage medium
CN114839200A (en) Foreign object detection method, electronic device, and readable storage medium
CN116993886B (en) Method and related device for generating regional contour map in rendering
CN112884797B (en) Image background removing method and device and electronic equipment
CN116363165A (en) Jitter evaluation method and device, electronic equipment and medium

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination