Density-based image processing method, image processing apparatus and device
Technical Field
The present invention relates to the field of image processing technologies, and in particular, to a density-based image processing method, a density-based image processing apparatus, and a device having an image processing function.
Background
In recent years, with the rapid development of artificial intelligence and big data technology, more and more products have become intelligent, and compared with non-intelligent products they offer more powerful functions and a more comfortable user experience. Among the many directions of intelligence, image recognition is a very important field: a complete image recognition system takes an image as input, recognizes the objects in the image by various methods, and finally outputs a recognition result. However, because an image often contains a complicated background and non-target objects that can greatly affect the recognition result, the image needs to be segmented before recognition to extract a clean image and remove the complicated background and the non-target objects.
At present, most mainstream image segmentation methods adopt threshold segmentation, color-gradient segmentation or edge segmentation, and a large number of general-purpose segmentation methods have been derived from these principles. However, for images with complex and changeable textures and large color variations in particular, the segmentation effect of the prior art is poor.
The current image segmentation methods are mainly of two kinds, namely edge-based segmentation and region-based segmentation. If complex and irregular crossing textures exist in an image, an edge-based method cannot accurately remove the non-target objects in the image; a region-based algorithm, in turn, often produces segmentation errors due to the diversity of image contents, and its segmentation time is too long for real-time use, so the non-target objects cannot be removed automatically.
Therefore, how to accurately remove the non-target objects in an image and improve the accuracy and speed of image recognition, thereby ensuring both the accuracy and the real-time performance of image recognition, has become a technical problem to be solved urgently.
Disclosure of Invention
The present invention is directed to solving at least one of the problems of the prior art or the related art.
To this end, it is an object of the invention to propose a density-based image processing method.
Another object of the present invention is to provide an image processing apparatus based on density.
It is a further object of the invention to propose a device with image processing functionality.
To achieve at least one of the above objects, according to an embodiment of a first aspect of the present invention, there is provided a density-based image processing method including: carrying out edge extraction processing on an image to be processed; performing convolution on the image subjected to the edge extraction processing by using a target convolution kernel according to convolution step length to obtain an edge density point space; screening points in the edge density point space using a density threshold; acquiring a target connected domain in the screened edge density point space; and determining the boundary of the target object in the image according to the target connected domain.
According to the density-based image processing method, the edge density point space of the image is obtained through the convolution principle, and the points in the edge density point space are screened with the density threshold as the main judgment basis to determine the boundary of the target object in the image. Even if complex and irregular crossing textures or diverse image contents exist in the image, for example in a food image, the non-target objects can be accurately removed, the image can be accurately segmented, and the accuracy of image recognition is improved. In addition, the algorithm of this scheme is highly flexible: the speed and the effect of image processing can be adjusted by tuning the target convolution kernel, the convolution step and the density threshold, so that both the accuracy and the real-time performance of image recognition are ensured.
The density-based image processing method according to the above-described embodiment of the present invention may further have the following technical features:
According to an embodiment of the present invention, before the convolving of the image subjected to the edge extraction processing, the method further includes: performing boundary extension processing on the image subjected to the edge extraction processing, so as to convolve the image subjected to the boundary extension processing, wherein when the space of the image subjected to the edge extraction processing is p(x, y), the space of the image subjected to the boundary extension processing is H(x, y), where H.rows is the width of the image subjected to the edge extraction processing and H.cols is its height.
According to the density-based image processing method provided by the embodiment of the invention, by carrying out the boundary extension processing, the point beyond the boundary can be avoided from being convoluted when the image is convoluted, so that the accuracy and reliability of convolution are ensured.
According to one embodiment of the present invention, when the target convolution kernel is h(x, y) (0 < x < h.rows, 0 < y < h.cols), where h.rows is the width of the target convolution kernel and h.cols is its height, the edge density point space is Con(x, y), where s is the convolution step, k is a positive integer, x belongs to (0, H.rows), and y belongs to (0, H.cols).
According to the density-based image processing method provided by the embodiment of the invention, since the edge density point space conforms to the above definition, the width and height of the edge density point space obtained after convolution are guaranteed to be consistent with those of the original image, so that the boundary of the target object in the image can be accurately determined. In addition, to ensure the speed of calculating the edge density point space, a suitable convolution step can be set: the larger the convolution step, the less time the calculation consumes, but the sparser the resulting density point space.
According to an embodiment of the present invention, the screened edge density point space is Cone(x, y), obtained by comparing the points of the edge density point space Con(x, y) against the density threshold e.
According to the density-based image processing method, the screened edge density point space is obtained by screening the points in the edge density point space, so that the non-target objects in the image are removed, the image to be recognized is kept clean, and the accuracy of image recognition is further ensured.
According to an embodiment of the present invention, the acquiring a target connected domain in a filtered edge density point space specifically includes: performing expansion processing on the screened edge density point space by using an expansion window so as to connect adjacent points in the screened edge density point space into a connected domain; and selecting the connected domain with the largest area as the target connected domain.
According to the density-based image processing method of the embodiment of the invention, since the area where the target object is located has the largest number of adjacent points, the boundary of the target object can be determined by connecting the adjacent points. In addition, the expansion processing is carried out on the screened edge density point space, so that non-target objects in the image are filtered more comprehensively, and the finally determined target objects are ensured to be purer.
According to one embodiment of the invention, the side length of the dilation window is greater than half the convolution step size.
According to the density-based image processing method provided by the embodiment of the invention, because the side length of the expansion window is greater than half of the convolution step, adjacent points in the expanded density point space are connected into a connected domain; the connected domain with the largest area then corresponds to the target object.
According to an embodiment of a second aspect of the present invention, there is provided a density-based image processing apparatus including: the edge extraction unit is used for carrying out edge extraction processing on the image to be processed; the convolution unit is used for performing convolution on the image subjected to the edge extraction processing by using a target convolution kernel according to the convolution step length to obtain an edge density point space; a screening unit for screening points in the edge density point space using a density threshold; the acquisition unit is used for acquiring a target connected domain in the edge density point space after screening; and the determining unit is used for determining the boundary of the target object in the image according to the target connected domain.
According to the density-based image processing device of the embodiment of the invention, the edge density point space of the image is obtained by the convolution principle, and the points in the edge density point space are screened with the density threshold as the main judgment basis to determine the boundary of the target object in the image. Even if complex and irregular crossing textures or diverse image contents exist in the image, such as a food image, the non-target objects can be accurately removed, the image can be accurately segmented, and the accuracy of image recognition is further improved. In addition, the algorithm of this scheme is highly flexible: the speed and the effect of image processing can be adjusted by tuning the target convolution kernel, the convolution step and the density threshold, so that both the accuracy and the real-time performance of image recognition are ensured.
According to an embodiment of the present invention, the apparatus further includes: a boundary extension unit configured to perform boundary extension processing on the image subjected to the edge extraction processing, so as to convolve the image subjected to the boundary extension processing, wherein when the space of the image subjected to the edge extraction processing is p(x, y), the space of the image subjected to the boundary extension processing is H(x, y), where H.rows is the width of the image subjected to the edge extraction processing and H.cols is its height.
According to the density-based image processing device of the embodiment of the invention, by carrying out the boundary extension processing, the point beyond the boundary can be avoided from being convoluted when the image is convoluted, so that the accuracy and the reliability of convolution are ensured.
According to one embodiment of the present invention, when the target convolution kernel is h(x, y) (0 < x < h.rows, 0 < y < h.cols), where h.rows is the width of the target convolution kernel and h.cols is its height, the edge density point space is Con(x, y), where s is the convolution step, k is a positive integer, x belongs to (0, H.rows), and y belongs to (0, H.cols).
According to the density-based image processing device provided by the embodiment of the invention, since the edge density point space conforms to the above definition, the width and height of the edge density point space obtained after convolution are guaranteed to be consistent with those of the original image, so that the boundary of the target object in the image can be accurately determined. In addition, to ensure the speed of calculating the edge density point space, a suitable convolution step can be set: the larger the convolution step, the less time the calculation consumes, but the sparser the resulting density point space.
According to an embodiment of the present invention, the screened edge density point space is Cone(x, y), obtained by comparing the points of the edge density point space Con(x, y) against the density threshold e.
According to the density-based image processing device provided by the embodiment of the invention, the screened edge density point space is obtained by screening the points in the edge density point space, so that the non-target objects in the image are removed, the image to be recognized is kept clean, and the accuracy of image recognition is further ensured.
According to one embodiment of the present invention, the acquisition unit includes: the connecting subunit is used for performing expansion processing on the screened edge density point space by using an expansion window so as to connect adjacent points in the screened edge density point space into a connected domain; and the selecting subunit is used for selecting the connected domain with the largest area as the target connected domain.
According to the density-based image processing apparatus of the embodiment of the present invention, since the area where the target object is located has the largest number of adjacent points, the boundary of the target object can be determined by connecting the adjacent points. In addition, the expansion processing is carried out on the screened edge density point space, so that non-target objects in the image are filtered more comprehensively, and the finally determined target objects are ensured to be purer.
According to one embodiment of the invention, the side length of the dilation window is greater than half the convolution step size.
According to the density-based image processing device of the embodiment of the invention, since the side length of the expansion window is greater than half of the convolution step length, adjacent points in the density point space after expansion processing are connected into a connected domain, and at this time, the connected domain with the largest area corresponds to the target object.
According to an embodiment of the third aspect of the present invention, there is provided an apparatus having an image processing function, including the density-based image processing apparatus according to any one of the above technical solutions; the apparatus therefore has the same technical effects as that image processing apparatus, which are not repeated here.
Additional aspects and advantages of the invention will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the invention.
Drawings
The above and/or additional aspects and advantages of the present invention will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings of which:
FIG. 1 shows a schematic flow diagram of a density-based image processing method according to an embodiment of the invention;
FIG. 2 shows a schematic flow diagram of a density-based image processing method according to another embodiment of the invention;
FIG. 3 shows a schematic block diagram of a density-based image processing apparatus according to an embodiment of the present invention;
fig. 4 shows a schematic block diagram of an apparatus having an image processing function according to an embodiment of the present invention.
Detailed Description
In order that the above objects, features and advantages of the present invention can be more clearly understood, a more particular description of the invention will be rendered by reference to the appended drawings. It should be noted that the embodiments and features of the embodiments of the present application may be combined with each other without conflict.
In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present invention, however, the present invention may be practiced in other ways than those specifically described herein, and therefore the scope of the present invention is not limited by the specific embodiments disclosed below.
Example one
As shown in fig. 1, a density-based image processing method according to an embodiment of the present invention includes:
and 102, performing edge extraction processing on the image to be processed.
And 104, performing convolution on the image subjected to the edge extraction processing by using a target convolution kernel according to a convolution step size to obtain an edge density point space.
Wherein, when the target convolution kernel is h(x, y) (0 < x < h.rows, 0 < y < h.cols), where h.rows is the width of the target convolution kernel and h.cols is its height, the edge density point space is Con(x, y), where p(x, y) represents the space of the image subjected to the edge extraction processing, s is the convolution step, k is a positive integer, x belongs to (0, H.rows), and y belongs to (0, H.cols).
In order to ensure the speed of calculating the edge density point space, a suitable convolution step can be set: the larger the convolution step, the less time the calculation consumes, but the sparser the resulting density point space.
For example, the target convolution kernel may be an all-ones convolution kernel, or a convolution kernel given by another function, such as a Gaussian function.
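The convolution step above can be sketched in a few lines; this is a minimal illustration assuming the edge map is a binary NumPy array and the target convolution kernel is all ones, and the padding and output-size handling here are choices of the sketch rather than the patent's exact formula (the helper name `edge_density_space` is hypothetical):

```python
import numpy as np

def edge_density_space(edges, ksize=15, step=15):
    """Count edge points inside a sliding ksize x ksize all-ones window,
    moved over the zero-padded edge map with the given step. Each cell of
    the result is a local edge density -- a sketch of Con(x, y)."""
    padded = np.pad(edges, ksize, mode="constant")  # boundary extension
    rows = (padded.shape[0] - ksize) // step + 1
    cols = (padded.shape[1] - ksize) // step + 1
    con = np.zeros((rows, cols), dtype=np.int64)
    for i in range(rows):
        for j in range(cols):
            window = padded[i * step:i * step + ksize,
                            j * step:j * step + ksize]
            con[i, j] = window.sum()  # all-ones kernel = plain sum
    return con

edges = np.zeros((60, 60), dtype=np.uint8)
edges[20:40, 20:40] = 1   # a dense 20x20 block of edge points
con = edge_density_space(edges)
```

A larger `step` makes `con` smaller and faster to compute but sparser, matching the trade-off described above.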
And 106, screening the points in the edge density point space by using a density threshold.
If the density threshold is e, the points of the edge density point space Con(x, y) are compared against e to obtain the screened edge density point space Cone(x, y).
the screened edge density point space is obtained by screening points in the edge density point space, so that non-target objects in the image are removed, the identified image is ensured to be purer, and the accuracy of image identification is ensured.
And step 108, acquiring a target connected domain in the screened edge density point space.
Preferably, step 108 specifically comprises: performing expansion processing on the screened edge density point space by using an expansion window so as to connect adjacent points in the screened edge density point space into a connected domain; and selecting the connected domain with the largest area as the target connected domain.
Since the area where the target object is located has the largest number of adjacent points, the boundary of the target object can be determined by connecting the adjacent points. In addition, the expansion processing is carried out on the screened edge density point space, so that non-target objects in the image are filtered more comprehensively, and the finally determined target objects are ensured to be purer.
Preferably, the side length of the dilation window is greater than half the convolution step.
Because the side length of the expansion window is larger than half of the convolution step, adjacent points in the density point space after expansion processing are connected into a connected domain; the connected domain with the largest area then corresponds to the target object.
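The expansion-and-selection procedure can be sketched in plain NumPy as follows; the square expansion window, the 4-connectivity, and the helper names are illustrative assumptions, not the patent's implementation:

```python
import numpy as np
from collections import deque

def dilate(mask, side):
    """Binary dilation with a side x side square window (a stand-in for
    the patent's expansion window)."""
    out = np.zeros_like(mask)
    r = side // 2
    h, w = mask.shape
    for i, j in zip(*np.nonzero(mask)):
        out[max(0, i - r):min(h, i + r + 1),
            max(0, j - r):min(w, j + r + 1)] = 1
    return out

def largest_component(mask):
    """Label 4-connected domains by BFS and return the largest one."""
    h, w = mask.shape
    seen = np.zeros(mask.shape, dtype=bool)
    best, best_size = np.zeros_like(mask), 0
    for si, sj in zip(*np.nonzero(mask)):
        if seen[si, sj]:
            continue
        comp, q = [], deque([(si, sj)])
        seen[si, sj] = True
        while q:
            i, j = q.popleft()
            comp.append((i, j))
            for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ni, nj = i + di, j + dj
                if 0 <= ni < h and 0 <= nj < w and mask[ni, nj] and not seen[ni, nj]:
                    seen[ni, nj] = True
                    q.append((ni, nj))
        if len(comp) > best_size:
            best_size = len(comp)
            best = np.zeros_like(mask)
            for i, j in comp:
                best[i, j] = 1
    return best

# Two clusters of screened density points: dilation bridges the gaps
# inside the dense cluster, and the largest connected domain is kept.
pts = np.zeros((12, 12), dtype=np.uint8)
pts[1, 1] = 1                                  # isolated point (noise)
for i, j in ((5, 5), (5, 7), (7, 5), (7, 7), (6, 6)):
    pts[i, j] = 1                              # dense cluster (target)
target = largest_component(dilate(pts, side=3))
```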
And step 110, determining the boundary of the target object in the image according to the target connected domain.
In the technical scheme, an edge density point space of the image is obtained through a convolution principle, and points in the edge density point space are screened by taking a density threshold value as a main judgment basis to determine the boundary of a target object in the image. In addition, the algorithm of the scheme has high flexibility, and the speed and the effect of image processing can be adjusted by adjusting the target convolution kernel, the convolution step length and the density threshold, so that the accuracy and the real-time performance of image identification are ensured.
Example two
As shown in fig. 2, a density-based image processing method according to another embodiment of the present invention includes:
step 202, performing edge extraction processing on the image to be processed.
Step 202 specifically includes: and converting the image to be processed into a gray image, and performing edge extraction processing on the gray image by using an edge extraction algorithm.
For example, after loading an image to be processed (such as a food image), the image is converted into an 8-bit grayscale image, and the edge of the grayscale image is extracted using the canny algorithm.
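The loading and edge-extraction step might look as follows; note that the patent uses the Canny algorithm, whereas the gradient-magnitude edge map below is a deliberately simplified stand-in, and the luma weights for the 8-bit grayscale conversion are a common convention assumed here:

```python
import numpy as np

def to_gray(rgb):
    """Convert an HxWx3 RGB image to 8-bit grayscale with the common
    luma weights (an assumption; the patent does not fix a conversion)."""
    w = np.array([0.299, 0.587, 0.114])
    return (rgb @ w).astype(np.uint8)

def simple_edges(gray, thresh=32):
    """Crude gradient-magnitude edge map: a simplified stand-in for the
    Canny algorithm that only marks strong intensity steps."""
    gray = gray.astype(np.int32)
    gx = np.abs(np.diff(gray, axis=1, prepend=gray[:, :1]))
    gy = np.abs(np.diff(gray, axis=0, prepend=gray[:1, :]))
    return ((gx + gy) > thresh).astype(np.uint8)

img = np.zeros((8, 8, 3), dtype=np.uint8)
img[2:6, 2:6] = (200, 180, 160)       # bright square on a black background
edges = simple_edges(to_gray(img))    # edge points trace the square
```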
And step 204, performing boundary extension processing on the image subjected to the edge extraction processing.
When the space of the image subjected to the edge extraction processing is p(x, y), the space of the image subjected to the boundary extension processing is H(x, y), where H.rows is the width of the image subjected to the edge extraction processing and H.cols is its height.
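The boundary extension can be sketched as zero-padding the edge-extracted space p(x, y); padding by the kernel size with zeros is an assumption of this sketch, since the patent's formula for H(x, y) is not reproduced in this text:

```python
import numpy as np

def extend_boundary(p, ksize):
    """Zero-pad the edge-extracted image on every side so the convolution
    window never reads past the border (padding amount is an assumption)."""
    return np.pad(p, ksize, mode="constant", constant_values=0)

p = np.ones((4, 4), dtype=np.uint8)
H = extend_boundary(p, ksize=2)
print(H.shape)   # (8, 8)
```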
In step 206, the image subjected to the boundary extension processing is convolved with a convolution step size by using a target convolution kernel to obtain an edge density point space Con (x, y).
Preferably, in the edge density point space Con(x, y), s is the convolution step, k is a positive integer, x belongs to (0, H.rows), and y belongs to (0, H.cols).
In addition, to ensure the speed of calculating the edge density point space, a suitable convolution step can be set: the larger the convolution step, the less time the calculation consumes, but the sparser the resulting density point space.
At step 208, a density threshold (e.g., 40000) is used to screen points in the edge density point space.
If the density threshold is e, the points of the edge density point space Con(x, y) are compared against e to obtain the screened edge density point space Cone(x, y).
and step 210, acquiring a target connected domain in the screened edge density point space.
Preferably, step 210 specifically includes: performing expansion processing on the screened edge density point space by using an expansion window; connecting adjacent points in the edge density point space after expansion treatment to form a connected domain; and selecting the connected domain with the largest area as the target connected domain.
Preferably, the side length of the dilation window is greater than half the convolution step size. For example, the convolution step size is 15 and the dilation window side length is 8.
Step 212, determining the boundary of the target object in the image according to the target connected domain.
The rectangular boundary of the target connected domain is the boundary of the target object.
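The final step can be sketched as taking the axis-aligned bounding rectangle of the target connected domain; the (row, col) coordinate convention and the helper name are assumptions of this sketch:

```python
import numpy as np

def bounding_rect(mask):
    """Return (top, left, bottom, right) of the nonzero region of the
    target connected domain -- the rectangular boundary of the target."""
    rows, cols = np.nonzero(mask)
    return rows.min(), cols.min(), rows.max(), cols.max()

mask = np.zeros((10, 10), dtype=np.uint8)
mask[3:7, 2:8] = 1                      # target connected domain
print(bounding_rect(mask))              # (3, 2, 6, 7)
```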
In the technical scheme, an edge density point space of the image is obtained through a convolution principle, and points in the edge density point space are screened by taking a density threshold value as a main judgment basis to determine the boundary of a target object in the image. In addition, the algorithm of the scheme has high flexibility, and the speed and the effect of image processing can be adjusted by adjusting the target convolution kernel, the convolution step length and the density threshold, so that the accuracy and the real-time performance of image identification are ensured.
Fig. 3 shows a schematic block diagram of a density-based image processing apparatus according to an embodiment of the present invention.
As shown in fig. 3, the density-based image processing apparatus 300 according to an embodiment of the present invention includes: an edge extraction unit 302, a convolution unit 304, a screening unit 306, an acquisition unit 308, and a determination unit 310.
An edge extraction unit 302, configured to perform edge extraction processing on an image to be processed; a convolution unit 304, configured to convolve the image subjected to the edge extraction processing by using a target convolution kernel with a convolution step size to obtain an edge density point space; a screening unit 306 for screening points in the edge density point space using a density threshold; an obtaining unit 308, configured to obtain a target connected domain in the screened edge density point space; a determining unit 310, configured to determine a boundary of the target object in the image according to the target connected component.
According to the density-based image processing device 300 of the embodiment of the invention, the edge density point space of the image is obtained by the convolution principle, and the points in the edge density point space are screened with the density threshold as the main judgment basis to determine the boundary of the target object in the image. Even if complex and irregular crossing textures or diverse image contents exist in the image, for example a food image, the non-target objects can be accurately removed, the image can be accurately segmented, and the accuracy of image recognition is further improved. In addition, the algorithm of this scheme is highly flexible: the speed and the effect of image processing can be adjusted by tuning the target convolution kernel, the convolution step and the density threshold, so that the real-time performance of image recognition is ensured.
According to an embodiment of the present invention, the apparatus further includes: a boundary extension unit 312, configured to perform boundary extension processing on the image subjected to the edge extraction processing, so as to convolve the image subjected to the boundary extension processing, wherein when the space of the image subjected to the edge extraction processing is p(x, y), the space of the image subjected to the boundary extension processing is H(x, y), where H.rows is the width of the image subjected to the edge extraction processing and H.cols is its height.
According to the density-based image processing apparatus 300 of the embodiment of the present invention, by performing the boundary extension processing, it is possible to avoid the convolution to a point outside the boundary when the image is convolved, thereby ensuring the accuracy and reliability of the convolution.
According to one embodiment of the present invention, when the target convolution kernel is h(x, y) (0 < x < h.rows, 0 < y < h.cols), where h.rows is the width of the target convolution kernel and h.cols is its height, the edge density point space is Con(x, y), where s is the convolution step, k is a positive integer, x belongs to (0, H.rows), and y belongs to (0, H.cols).
According to the density-based image processing apparatus 300 of the embodiment of the present invention, since the edge density point space conforms to the above definition, the width and height of the edge density point space obtained after convolution are guaranteed to be consistent with those of the original image, so that the boundary of the target object in the image can be accurately determined. In addition, to ensure the speed of calculating the edge density point space, a suitable convolution step can be set: the larger the convolution step, the less time the calculation consumes, but the sparser the resulting density point space.
According to an embodiment of the present invention, the screened edge density point space is Cone(x, y), obtained by comparing the points of the edge density point space Con(x, y) against the density threshold e.
According to the density-based image processing device 300 of the embodiment of the invention, the screened edge density point space is obtained by screening the points in the edge density point space, so that the non-target objects in the image are removed, the image to be recognized is kept clean, and the accuracy of image recognition is ensured.
According to an embodiment of the present invention, the obtaining unit 308 includes: a connection subunit 3082, configured to perform expansion processing on the screened edge density point space by using an expansion window, so as to connect adjacent points in the screened edge density point space into a connected domain; and the selecting subunit 3084 is used for selecting the connected domain with the largest area as the target connected domain.
According to the density-based image processing apparatus 300 of the embodiment of the present invention, since the area where the target object is located has the largest number of adjacent points, the boundary of the target object can be determined by connecting the adjacent points. In addition, the expansion processing is carried out on the screened edge density point space, so that non-target objects in the image are filtered more comprehensively, and the finally determined target objects are ensured to be purer.
According to one embodiment of the invention, the side length of the dilation window is greater than half the convolution step size.
According to the density-based image processing apparatus 300 of the embodiment of the present invention, since the side length of the expansion window is greater than half of the convolution step, adjacent points in the expanded density point space are connected as a connected domain, and at this time, the connected domain with the largest area corresponds to the target object.
Fig. 4 shows a schematic block diagram of an apparatus having an image processing function according to an embodiment of the present invention.
As shown in fig. 4, the apparatus 400 having an image processing function according to an embodiment of the present invention includes the density-based image processing apparatus 300 according to any one of the above-mentioned technical solutions, and therefore, the apparatus 400 has the same technical effects as the density-based image processing apparatus 300 according to any one of the above-mentioned technical solutions, and is not described herein again.
The apparatus 400 having the image processing function includes, but is not limited to, a domestic appliance, a server, and a terminal.
The technical scheme of the invention is explained in detail in the above with the help of the attached drawings, and through the technical scheme, the non-target objects in the image can be accurately removed, and the accuracy and the speed of image recognition are improved, so that the accuracy and the real-time performance of the image recognition are ensured.
In the present invention, the terms "first", "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance; the term "plurality" means two or more. The specific meanings of the above terms in the present invention can be understood by those skilled in the art according to specific situations.
The above description is only a preferred embodiment of the present invention and is not intended to limit the present invention, and various modifications and changes may be made by those skilled in the art. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention should be included in the protection scope of the present invention.