CN107633503B - Image processing method for automatically detecting residual straws in grains - Google Patents


Info

Publication number
CN107633503B
Authority
CN
China
Prior art keywords
image
point
pixel
gray
connected domain
Prior art date
Legal status
Active
Application number
CN201710645656.2A
Other languages
Chinese (zh)
Other versions
CN107633503A (en)
Inventor
张寅�
闫钧华
杨勇
许倩倩
汪竞成
肖勇旗
Current Assignee
Nanjing University of Aeronautics and Astronautics
Original Assignee
Nanjing University of Aeronautics and Astronautics
Priority date
Filing date
Publication date
Application filed by Nanjing University of Aeronautics and Astronautics filed Critical Nanjing University of Aeronautics and Astronautics
Priority to CN201710645656.2A priority Critical patent/CN107633503B/en
Publication of CN107633503A publication Critical patent/CN107633503A/en
Application granted granted Critical
Publication of CN107633503B publication Critical patent/CN107633503B/en

Abstract

The invention discloses an image processing method for automatically detecting residual straws in grains, which comprises the steps of firstly collecting an original image; carrying out self-adaptive mean filtering on the original image to blur it, and taking the blurred image as a background image; subtracting the background image from the original image to obtain an edge-enhanced image; converting the edge-enhanced image into a gray image, performing self-adaptive gray equalization and then maximum inter-class variance binarization to obtain a binary image, and performing an opening operation on the binary image to remove noise; carrying out connected domain marking on the binary image, and primarily screening the straw targets by comparing the number of pixel points in each connected domain; drawing the minimum external rectangle of each connected domain, and screening the straw targets with the length of the diagonal of the rectangle as the judgment condition; and drawing the straw outlines on the original image. The invention can reduce the influence of stacking and illumination on the detection result, and avoids the difficulty of distinguishing colors that arises with the prior-art near-infrared method.

Description

Image processing method for automatically detecting residual straws in grains
Technical Field
The invention relates to an image processing technology, in particular to an image processing method for quickly and automatically detecting residual straws in grains processed by a thresher.
Background
At present, when agricultural machinery harvests rice, the rice needs to be husked, and during husking a large amount of crushed straw is mixed into the piled grain, so sieving and filtering must be repeated until the amount of crushed straw is within an allowable range. In the traditional approach, whether the crushed straw has been screened out completely can only be judged by visual observation, which then determines whether screening must be repeated. This method is not only inefficient but also subject to subjective error. In addition, the production environment involves high temperature, noise and other safety hazards. Applying image processing technology to the automatic detection of residual straw in grain can therefore eliminate the subjective error of manual inspection, improve detection consistency, improve efficiency and precision, protect personnel safety, and offers further advantages such as non-contact measurement.
To detect the crushed straw targets rapidly and automatically with image processing technology, an original image of the husked grain mixed with crushed straw is first collected by a camera, the crushed straw targets in the image are then detected by an image processing method, and whether screening should be repeated is decided according to the number of straws. Three main problems have to be solved. First, because the stacking among grains and between grains and straws is severe, all objects need to be separated as far as possible. Second, because some straws are similar to the grains in color and size, the straw targets must be distinguished from the grains. Third, image processing is also affected by illumination, because non-uniform illumination causes errors in the processing result.
At present, rapid automatic detection of straw targets based on image processing technology mainly follows two approaches: (1) direct contour detection, in which straw targets are screened according to contour features; (2) near-infrared imaging, in which straw and grain targets are distinguished by their different colors in the near-infrared image. Method (1) requires that there be essentially no stacking in the straw and grain image, and its processing is not ideal when stacking occurs. Method (2) needs a near-infrared camera, is costly, and is not suitable for commercialization.
Disclosure of Invention
The purpose of the invention is as follows: in order to solve the above problems in the prior art, reduce the influence of stacking and uneven illumination, and improve the precision of detecting straw in grain, the invention provides an image processing method for automatically detecting residual straws in grains.
The technical scheme is as follows: the invention provides an image processing method for automatically detecting residual straws in grains, which comprises the following steps:
step one: a camera collects an original image, and a computer reads the original image;
step two: carrying out self-adaptive mean filtering on the original image to blur it, and taking the blurred image as a background image; subtracting the background image from the original image to obtain an edge-enhanced image in which the edges of the foreground objects are enhanced;
step three: converting the edge enhanced image obtained in the step two into a gray image, carrying out self-adaptive gray equalization on the gray image, then carrying out maximum inter-class variance binarization processing, separating grains and straws from the background, simultaneously obtaining a binary image, carrying out opening operation on the binary image to remove noise points, and reducing adhesion between grains and between straws;
step four: carrying out connected domain marking on the binary image obtained in the step three, and carrying out primary screening on the straw target by comparing the number of pixel points in the connected domain;
step five: drawing a minimum external rectangle of the connected domain, and screening a straw target according to the length of a diagonal line of the rectangle as a judgment condition;
step six: and drawing the straw outline on the original image according to the result obtained in the step five.
Further, in the third step, the image is divided into N × N image blocks, where N is a positive integer greater than 1, and then maximum inter-class variance binarization processing is performed on each image block.
Further, the second step specifically includes:
2.1) setting the size of the original image as W × H, defining the size of a mean kernel s of mean filtering as (min (W, H)/20) × (min (W, H)/20), for a pixel f (x, y) to be processed in the original image, the pixel after mean filtering is g (x, y), and g (x, y) is represented as:
g(x, y) = (1/M) · Σ_{(i,j)∈s} f(i, j), where the sum runs over the pixels (i, j) of the mean kernel s centered at (x, y)
M=(min(W,H)/20)×(min(W,H)/20)
2.2) the image whose pixel values are g(x, y) is taken as the background image; the background image is subtracted from the original image to obtain the edge-enhanced image, in which the edges of the foreground objects are enhanced; letting h(x, y) be the pixel value of each pixel point of the edge-enhanced image, then:
h(x, y) = f(x, y) − g(x, y).
further, the third step specifically includes:
3.1) let R(x, y), G(x, y) and B(x, y) be the pixel values of each point of the R, G and B layers of the edge-enhanced image; the image is converted into a gray image whose gray value at each pixel point is gray(x, y), where:
gray(x, y) = 0.299*R(x, y) + 0.587*G(x, y) + 0.114*B(x, y)
3.2) the maximum gray value Gmax and the minimum gray value Gmin of the gray image are obtained and gray equalization (gray-range stretching) is carried out; letting gray'(x, y) be the gray value after equalization, then:
gray'(x, y) = 255 × (gray(x, y) − Gmin) / (Gmax − Gmin)
3.3) let u be the segmentation threshold between the foreground and the background; let w0 be the proportion of target points in the image and u0 their average gray level; let w1 be the proportion of background points and u1 their average gray level; and let g be the between-class variance. Then:
u = w0·u0 + w1·u1
g = w0·(u0 − u)^2 + w1·(u1 − u)^2 = w0·w1·(u1 − u0)^2
When g is maximum, the corresponding threshold is the maximum between-class variance segmentation threshold Tgmax, and the input image is segmented as:
b(x, y) = 255 if gray'(x, y) > Tgmax, and b(x, y) = 0 otherwise
b (x, y) is a pixel value of the output binary image at the point (x, y);
3.4) let R1ker be the radius of the kernel used in the opening operation; all kernels are circular kernel functions. The opening of the binary image b with the circular kernel B1ker of radius R1ker is calculated as:
b ∘ B1ker = (b ⊖ B1ker) ⊕ B1ker
where ⊖ denotes the erosion operation and ⊕ denotes the dilation operation.
further, the fourth step specifically includes:
4.1) connected domains are searched with the four-connectivity method. In the binary image every pixel gray value is either 255 or 0. For a point b(x, y) with gray value 255, the four pixels b(x−1, y), b(x+1, y), b(x, y−1) and b(x, y+1) are examined, and those whose value is 255 are the four-neighbors of b(x, y); each four-neighbor found is then taken as the center and its own four-neighbors are searched, and so on, until no new four-neighbor can be found; all pixel points found in this way form one connected domain L;
4.2) assuming that T1 is a threshold value for screening the connected domain region, when the number of pixel points included in the connected domain L is less than the threshold value, the connected domain is determined to be a non-straw region, otherwise, the connected domain is a straw region.
Further, in the fifth step, the method for screening the straw target according to the length of the diagonal line of the rectangle as the judgment condition comprises the following steps:
traversing the connected domain, and finding, among its pixel points, the maximum x value xmax, the minimum x value xmin, the maximum y value ymax and the minimum y value ymin; the points (xmin, ymin) and (xmax, ymax) are taken as the two endpoints of a diagonal, which determines the minimum external rectangle, and the whole connected domain is represented by this external rectangle; the length l of the diagonal of the rectangle is:
l = sqrt((xmax − xmin)^2 + (ymax − ymin)^2)
Let T2 be a diagonal length threshold: when l ≤ T2, the connected domain is considered a non-straw target and the values of its pixels are set to 0; otherwise it is considered a straw target.
Further, in the fifth step, after the straw target is screened according to the length of the diagonal line of the rectangle as a judgment condition, the grain target is secondarily screened by taking the symmetry of the connected domain as a judgment condition.
Further, the secondary screening method comprises the following steps:
the symmetry of the connected domain is taken as the judgment condition: the horizontal coordinate of the center of the rectangle, xc = (xmin + xmax)/2, is taken as the boundary, and each pixel point (x, y) in the connected domain is assigned to a first region or a second region according to whether its x value is smaller or larger than xc; the numbers of pixel points in the first region, sum1, and in the second region, sum2, are counted, and a proportionality coefficient r is defined from sum1 and sum2 as:
(equation image defining the proportionality coefficient r in terms of sum1 and sum2)
with r as the judgment condition, let T3 be a symmetry threshold: if r ≥ T3, the connected domain is judged not to satisfy the symmetry condition and is a non-straw target, and the gray values of all pixels in it are set to 0; otherwise it is considered a straw target.
Further, the method for drawing the straw profile in the sixth step comprises the following steps:
6.1) the image is traversed from left to right and from top to bottom; let A be the first pixel point whose value is 255, which is an outer contour point of a connected domain; if A has not been marked by any other contour mark, A is given a new mark and taken as the starting point for executing 6.2);
6.2) the eight positions around the point A are numbered, the point to its right being 0 and the numbers increasing by 1 clockwise; when searching for the next contour point, the search proceeds clockwise starting from position 7 until a pixel point with gray value 255 is met, and the search then advances one step in that direction; this is repeated until no point with pixel value 255 can be found, and finally the straw contour found in this way is drawn at the corresponding position of the original image.
The image processing method for automatically detecting the residual straws in the grains has the following effects:
(1) through the image preprocessing step, the edges of all targets can be extracted well, the influence of stacking is reduced, and the problem that stacking degrades the detection effect of direct contour detection is avoided;
(2) the image is divided into blocks and each block is binarized with the maximum between-class variance method (OTSU), so that straw and grain targets are separated from the background automatically, the threshold adapts itself to the data, robustness to illumination changes and unfixed relative positions is enhanced, and the block-wise processing reduces the influence of uneven illumination across the whole image on the detection;
(3) the connected domains are searched and operated on at the pixel level, which is more accurate and easy to operate, and the parameters can be conveniently adjusted for straws of different size standards to achieve better screening;
(4) the method is not based on color, so straw that has turned yellow is not confused with the grain target, and the difficulty of distinguishing colors encountered with the prior-art near-infrared method is avoided.
Drawings
FIG. 1 is an overall flow chart of the image processing method for automatically detecting residual straw in grain according to the present invention;
FIG. 2 is a flow chart of the target edge enhancement according to the present invention;
FIG. 3 is a flow chart of the present invention for segmenting individual grain straw targets of the foreground;
FIG. 4 is a flow chart of the method for screening and labeling the straw target.
Detailed Description
The invention is further described with reference to the following figures and specific examples.
Referring to fig. 1 to 4, the image processing method for automatically detecting the residual straw in the grain comprises the following steps:
step one: the camera collects an original image, and the computer reads the original image.
Step two: as shown in fig. 2, the original image is subjected to adaptive mean filtering, the image is subjected to blurring processing, and the blurred image is used as a background image; and subtracting the background image from the original image to obtain an edge-enhanced image after the edge of the foreground target is enhanced, and through the image preprocessing of the step, the edges of all targets can be better extracted, the stacking influence is reduced, and the stacking influence detection effect caused by direct contour detection is avoided.
2.1) setting the size of the original image as W × H, defining the size of a mean kernel s of mean filtering as (min (W, H)/20) × (min (W, H)/20), for a pixel f (x, y) to be processed in the original image, the pixel after mean filtering is g (x, y), and g (x, y) is represented as:
g(x, y) = (1/M) · Σ_{(i,j)∈s} f(i, j) (1)
M = (min(W, H)/20) × (min(W, H)/20) (2)
that is, g(x, y) is the average of the pixel values f(i, j) over all pixels (i, j) of the mean kernel s centered at (x, y) (3)
2.2) the image whose pixel values are g(x, y) is taken as the background image; the background image is subtracted from the original image to obtain the edge-enhanced image, in which the edges of the foreground objects are enhanced; letting h(x, y) be the pixel value of each pixel point of the edge-enhanced image, then:
h(x, y) = f(x, y) − g(x, y) (4)
Each target in the processed image is left with a dark edge, so the influence of stacking between straws and grains on the subsequent processing is reduced.
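For illustration, steps 2.1) and 2.2) can be sketched in Python with OpenCV, which the patent itself does not prescribe; the function name and the saturating subtraction are illustrative assumptions:

import cv2

def edge_enhance(original):
    """Steps 2.1-2.2 (sketch): blur with a mean kernel of side min(W, H)/20, then subtract."""
    h, w = original.shape[:2]
    k = max(1, min(w, h) // 20)              # side length of the mean kernel s
    background = cv2.blur(original, (k, k))  # g(x, y): blurred background image
    # h(x, y) = f(x, y) - g(x, y); cv2.subtract saturates negative values at 0
    return cv2.subtract(original, background)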
Step three: and as shown in fig. 3, converting the edge-enhanced image obtained in the second step into a gray image, stretching the gray range, performing adaptive gray equalization, dividing the image into N × N image blocks, wherein N is a positive integer greater than 1, performing maximum inter-cluster variance (OTSU) binarization on each image block, separating grains and straws from the background, simultaneously obtaining a binary image, performing open operation on the binary image to remove noise, and reducing adhesion between grains and between straws. The image is subjected to blocking and binarization processing, so that the influence of uneven illumination intensity on an image processing result can be avoided. The operation of opening can remove the noise point, reduces the adhesion between cereal grain and cereal grain, cereal grain and the straw, reduces and piles up the influence that brings.
3.1) let R(x, y), G(x, y) and B(x, y) be the pixel values of each point of the R, G and B layers of the edge-enhanced image; the image is converted into a gray image whose gray value at each pixel point is gray(x, y), where:
gray(x, y) = 0.299*R(x, y) + 0.587*G(x, y) + 0.114*B(x, y) (5)
3.2) the maximum gray value Gmax and the minimum gray value Gmin of the gray image are obtained and gray equalization (gray-range stretching) is carried out; letting gray'(x, y) be the gray value after equalization, then:
gray'(x, y) = 255 × (gray(x, y) − Gmin) / (Gmax − Gmin) (6)
3.3) let u be the segmentation threshold between the foreground and the background; let w0 be the proportion of target points in the image and u0 their average gray level; let w1 be the proportion of background points and u1 their average gray level; and let g be the between-class variance. Then:
u = w0·u0 + w1·u1 (7)
g = w0·(u0 − u)^2 + w1·(u1 − u)^2 = w0·w1·(u1 − u0)^2 (8)
When g is maximum, the corresponding threshold is the maximum between-class variance (OTSU) segmentation threshold Tgmax, and the input image is segmented as:
b(x, y) = 255 if gray'(x, y) > Tgmax, and b(x, y) = 0 otherwise (9)
b (x, y) is a pixel value of the output binary image at the point (x, y);
3.4) let R1ker be the radius of the kernel used in the opening operation; all kernels are circular kernel functions. The opening of the binary image b with the circular kernel B1ker of radius R1ker is calculated as:
b ∘ B1ker = (b ⊖ B1ker) ⊕ B1ker (10)
where ⊖ denotes the erosion operation and ⊕ denotes the dilation operation.
In step 3.3), OTSU is used to acquire the segmentation threshold automatically and to binarize each image block; the implementation is as follows:
(a) compute the normalized histogram of the input image, and let pi, i = 0, 1, ..., L−1, denote each component of the histogram;
(b) for k = 1, 2, ..., L−1, compute the cumulative sum P1(k):
P1(k) = Σ_{i=0}^{k} pi
(c) for k = 1, 2, ..., L−1, compute the cumulative mean m(k):
m(k) = Σ_{i=0}^{k} i·pi
(d) compute the global gray-level mean mG:
mG = Σ_{i=0}^{L−1} i·pi
(e) for k = 1, 2, ..., L−1, compute the between-class variance σB^2(k):
σB^2(k) = (mG·P1(k) − m(k))^2 / (P1(k)·(1 − P1(k)))
The value of k that maximizes σB^2(k) is the OTSU threshold k*; the threshold at this point, k* = Tgmax, is the OTSU segmentation threshold.
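The threshold search of steps (a)-(e), applied block by block as described in step three, can be sketched as follows in Python with NumPy; the block count n and the helper names are illustrative assumptions, not the patent's own code:

import numpy as np

def otsu_threshold(gray_block):
    """Steps (a)-(e) (sketch): OTSU threshold of one block from its normalized histogram."""
    hist, _ = np.histogram(gray_block, bins=256, range=(0, 256))
    p = hist / max(hist.sum(), 1)              # (a) normalized histogram p_i
    p1 = np.cumsum(p)                          # (b) cumulative sums P1(k)
    m = np.cumsum(np.arange(256) * p)          # (c) cumulative means m(k)
    m_g = m[-1]                                # (d) global mean mG
    denom = p1 * (1.0 - p1)
    denom[denom == 0] = np.inf                 # guard against division by zero
    sigma_b2 = (m_g * p1 - m) ** 2 / denom     # (e) between-class variance
    return int(np.argmax(sigma_b2))            # k* = Tgmax

def blockwise_binarize(gray, n=4):
    """Divide the stretched gray image into n x n blocks and binarize each with its own Tgmax."""
    h, w = gray.shape
    binary = np.zeros_like(gray, dtype=np.uint8)
    ys = np.linspace(0, h, n + 1, dtype=int)
    xs = np.linspace(0, w, n + 1, dtype=int)
    for i in range(n):
        for j in range(n):
            block = gray[ys[i]:ys[i + 1], xs[j]:xs[j + 1]]
            t = otsu_threshold(block)
            binary[ys[i]:ys[i + 1], xs[j]:xs[j + 1]] = np.where(block > t, 255, 0)
    return binary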
In step 3.4), the opening operation removes noise points and reduces the adhesion between straws and grains; it is implemented as follows:
(m) erosion: for sets A and B in Z^2, the erosion of A by B, denoted A ⊖ B, is defined as:
A ⊖ B = { z | (B)z ⊆ A }
the above formula indicates that the erosion of A by B is the set of all points z such that B, translated by z, is still contained in A; in other words, the set obtained by eroding A with B is the set of origin positions of B for which B is completely contained in A;
(n) let R1ker be the radius of the kernel used in the opening operation; the kernel is a circular kernel function, denoted B1ker; the opening operation is then defined as follows:
A ∘ B1ker = (A ⊖ B1ker) ⊕ B1ker
where ⊕ denotes the dilation operation; the opening operation is used to remove isolated noise points and thin protrusions in the image;
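A minimal sketch of this opening operation with a circular (elliptical) kernel, again in Python with OpenCV; the radius value is an illustrative assumption:

import cv2

def open_binary(binary, r1ker=3):
    """Opening (sketch): erosion followed by dilation with a circular kernel B1ker."""
    size = 2 * r1ker + 1
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (size, size))
    # MORPH_OPEN computes (A erode B1ker) dilate B1ker, removing isolated noise
    # points and thin protrusions
    return cv2.morphologyEx(binary, cv2.MORPH_OPEN, kernel)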
step four: as shown in fig. 4, the straw targets are screened and marked on the obtained binary image. First, connected domain marking is carried out on the binary image obtained in step three, and the straw targets are primarily screened by comparing the number of pixel points in each connected domain.
4.1) connected domains are searched with the four-connectivity method. In the binary image every pixel gray value is either 255 or 0. For a point b(x, y) with gray value 255, the four pixels b(x−1, y), b(x+1, y), b(x, y−1) and b(x, y+1) are examined, and those whose value is 255 are the four-neighbors of b(x, y); each four-neighbor found is then taken as the center and its own four-neighbors are searched, and so on, until no new four-neighbor can be found; all pixel points found in this way form one connected domain L;
4.2) let T1 be a threshold for screening connected domains: when the number of pixel points contained in the connected domain L is less than T1, the connected domain is judged to be a non-straw region and the gray values of its pixels are set to 0; otherwise it is a straw region.
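Steps 4.1) and 4.2) can be sketched with a four-connected labeling followed by a pixel-count screen; OpenCV's connected-components routine stands in for the neighbor-by-neighbor search described above, and the threshold value T1 below is an illustrative placeholder:

import cv2
import numpy as np

def screen_by_area(binary, t1=200):
    """Label 4-connected domains and zero out those with fewer than T1 pixels (sketch)."""
    num, labels, stats, _ = cv2.connectedComponentsWithStats(binary, connectivity=4)
    kept = np.zeros_like(binary)
    for lab in range(1, num):                   # label 0 is the background
        if stats[lab, cv2.CC_STAT_AREA] >= t1:  # enough pixels: candidate straw region
            kept[labels == lab] = 255
    return kept, labels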
Step five: drawing a minimum external rectangle of the connected domain, screening the straw target according to the length of the diagonal line of the rectangle as a judgment condition, and then carrying out secondary screening on the grain target by taking the symmetry of the connected domain as a judgment condition.
The method for screening the straw target according to the judgment condition of the length of the rectangular diagonal line comprises the following steps:
traversing the connected domain, and finding, among its pixel points, the maximum x value xmax, the minimum x value xmin, the maximum y value ymax and the minimum y value ymin; the points (xmin, ymin) and (xmax, ymax) are taken as the two endpoints of a diagonal, which determines the minimum external rectangle, and the whole connected domain is represented by this external rectangle; the length l of the diagonal of the rectangle is:
l = sqrt((xmax − xmin)^2 + (ymax − ymin)^2)
Let T2 be a diagonal length threshold: when l ≤ T2, the connected domain is considered a non-straw target and the values of its pixels are set to 0; otherwise it is considered a straw target.
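The diagonal-length screening can be sketched per connected domain as follows; the threshold T2 is an illustrative placeholder:

import numpy as np

def screen_by_diagonal(kept, labels, t2=60.0):
    """Drop connected domains whose minimum-external-rectangle diagonal is <= T2 (sketch)."""
    for lab in np.unique(labels):
        if lab == 0:
            continue                            # skip the background label
        ys, xs = np.nonzero((labels == lab) & (kept == 255))
        if xs.size == 0:
            continue                            # domain already removed by earlier screening
        diag = np.hypot(xs.max() - xs.min(), ys.max() - ys.min())
        if diag <= t2:                          # too short to be a straw
            kept[labels == lab] = 0
    return kept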
The method for carrying out secondary screening on the grain target by taking the symmetry of the connected domain as a judgment condition comprises the following steps:
the symmetry of the connected domain is taken as the judgment condition: the horizontal coordinate of the center of the rectangle, xc = (xmin + xmax)/2, is taken as the boundary, and each pixel point (x, y) in the connected domain is assigned to a first region or a second region according to whether its x value is smaller or larger than xc; the numbers of pixel points in the first region, sum1, and in the second region, sum2, are counted, and a proportionality coefficient r is defined from sum1 and sum2 as:
(equation image defining the proportionality coefficient r in terms of sum1 and sum2)
with r as the judgment condition, let T3 be a symmetry threshold: if r ≥ T3, the connected domain is judged not to satisfy the symmetry condition and is a non-straw target, and the gray values of all pixels in it are set to 0; otherwise it is considered a straw target.
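The secondary symmetry screening can be sketched in the same way; because the exact definition of r appears only in the equation image above, the ratio used here (larger half over smaller half) and the threshold T3 are assumptions for illustration:

import numpy as np

def screen_by_symmetry(kept, labels, t3=3.0):
    """Split each domain at the bounding-box center x and compare the two halves (sketch)."""
    for lab in np.unique(labels):
        if lab == 0:
            continue
        ys, xs = np.nonzero((labels == lab) & (kept == 255))
        if xs.size == 0:
            continue
        xc = (int(xs.min()) + int(xs.max())) / 2.0     # horizontal center of the rectangle
        sum1 = int(np.count_nonzero(xs < xc))          # first region
        sum2 = int(np.count_nonzero(xs >= xc))         # second region
        r = max(sum1, sum2) / max(min(sum1, sum2), 1)  # assumed proportionality coefficient
        if r >= t3:                                    # not symmetric enough: non-straw target
            kept[labels == lab] = 0
    return kept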
Step six: and drawing the straw outline on the original image according to the result obtained in the step five.
6.1) the image is traversed from left to right and from top to bottom; let A be the first pixel point whose value is 255, which is an outer contour point of a connected domain; if A has not been marked by any other contour mark, A is given a new mark and taken as the starting point for executing 6.2);
6.2) the eight positions around the point A are numbered, the point to its right being 0 and the numbers increasing by 1 clockwise; when searching for the next contour point, the search proceeds clockwise starting from position 7 until a pixel point with gray value 255 is met, and the search then advances one step in that direction; this is repeated until no point with pixel value 255 can be found, and finally the straw contour found in this way is drawn at the corresponding position of the original image.
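Step six traces the outer contours and draws them on the original image; the sketch below substitutes OpenCV's built-in contour routines for the clockwise tracing described in 6.1)-6.2), so it is a simplification rather than the patent's own procedure:

import cv2

def draw_straw_contours(original, kept):
    """Find outer contours of the kept regions and draw them on the original image (sketch)."""
    # OpenCV 4 returns (contours, hierarchy); RETR_EXTERNAL keeps only the outer contours
    contours, _ = cv2.findContours(kept, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_NONE)
    result = original.copy()
    cv2.drawContours(result, contours, -1, (0, 0, 255), 2)   # draw straw outlines in red
    return result

Chained together, these sketches mirror the overall flow of fig. 1: edge_enhance, block-wise binarization, opening, area screening, diagonal and symmetry screening, and contour drawing.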
The invention solves the problem of detecting broken straw when the straw and the grain have similar colors and when mutual stacking among straws and between straws and grains would otherwise degrade the detection effect. The method eliminates the subjective errors of manual inspection, improves detection consistency, efficiency and precision, protects personnel safety, and has the further advantage of being non-contact.

Claims (9)

1. An image processing method for automatically detecting residual straws in grains is characterized by comprising the following steps:
step one: a camera collects an original image, and a computer reads the original image;
step two: carrying out self-adaptive mean filtering on the original image to blur it, and taking the blurred image as a background image; subtracting the background image from the original image to obtain an edge-enhanced image in which the edges of the foreground objects are enhanced;
step three: converting the edge enhanced image obtained in the step two into a gray image, carrying out self-adaptive gray equalization on the gray image, then carrying out maximum inter-class variance binarization processing, separating grains and straws from the background, simultaneously obtaining a binary image, carrying out opening operation on the binary image to remove noise points, and reducing adhesion between grains and between straws;
step four: carrying out connected domain marking on the binary image obtained in the step three, and carrying out primary screening on the straw target by comparing the number of pixel points in the connected domain;
step five: drawing a minimum external rectangle of the connected domain, and screening a straw target according to the length of a diagonal line of the rectangle as a judgment condition;
step six: and drawing the straw outline on the original image according to the result obtained in the step five.
2. The image processing method for automatically detecting the residual straw in the grain according to claim 1, wherein in the third step, the image is divided into N x N image blocks, wherein N is a positive integer greater than 1, and then the maximum inter-class variance binarization processing is performed on each image block.
3. The image processing method for automatically detecting the residual straws in the grains according to claim 1 or claim 2, wherein the second step specifically comprises the following steps:
2.1) setting the size of the original image as W × H, defining the size of a mean kernel s of mean filtering as (min (W, H)/20) × (min (W, H)/20), for a pixel f (x, y) to be processed in the original image, the pixel after mean filtering is g (x, y), and g (x, y) is represented as:
g(x, y) = (1/M) · Σ_{(i,j)∈s} f(i, j), where the sum runs over the pixels (i, j) of the mean kernel s centered at (x, y)
M=(min(W,H)/20)×(min(W,H)/20)
2.2) the image whose pixel values are g(x, y) is taken as the background image; the background image is subtracted from the original image to obtain the edge-enhanced image, in which the edges of the foreground objects are enhanced; letting h(x, y) be the pixel value of each pixel point of the edge-enhanced image, then:
h(x, y) = f(x, y) − g(x, y).
4. The image processing method for automatically detecting the residual straws in the grains according to claim 1 or claim 2, wherein the step three specifically comprises the following steps:
3.1) let R(x, y), G(x, y) and B(x, y) be the pixel values of each point of the R, G and B layers of the edge-enhanced image; the image is converted into a gray image whose gray value at each pixel point is gray(x, y), where:
gray(x, y) = 0.299*R(x, y) + 0.587*G(x, y) + 0.114*B(x, y)
3.2) the maximum gray value Gmax and the minimum gray value Gmin of the gray image are obtained and gray equalization (gray-range stretching) is carried out; letting gray'(x, y) be the gray value after equalization, then:
gray'(x, y) = 255 × (gray(x, y) − Gmin) / (Gmax − Gmin)
3.3) let u be the segmentation threshold between the foreground and the background; let w0 be the proportion of target points in the image and u0 their average gray level; let w1 be the proportion of background points and u1 their average gray level; and let g be the between-class variance. Then:
u = w0·u0 + w1·u1
g = w0·(u0 − u)^2 + w1·(u1 − u)^2 = w0·w1·(u1 − u0)^2
When g is maximum, the corresponding threshold is the maximum between-class variance segmentation threshold Tgmax, and the input image is segmented as:
b(x, y) = 255 if gray'(x, y) > Tgmax, and b(x, y) = 0 otherwise
b (x, y) is a pixel value of the output binary image at the point (x, y);
3.4) let R1ker be the radius of the kernel used in the opening operation; all kernels are circular kernel functions; the opening of the binary image with the circular kernel of radius R1ker is calculated as:
A ∘ B1ker = (A ⊖ B1ker) ⊕ B1ker
wherein A is the set of image elements to be processed, B1ker is the circular filter kernel of radius R1ker, ∘ denotes the opening operation, ⊖ denotes the erosion operation, and ⊕ denotes the dilation operation.
5. The image processing method for automatically detecting the residual straws in the grains according to claim 1 or claim 2, wherein the step four specifically comprises the following steps:
4.1) connected domains are searched with the four-connectivity method. In the binary image every pixel gray value is either 255 or 0. For a point f(x, y) with gray value 255, the four pixels f(x−1, y), f(x+1, y), f(x, y−1) and f(x, y+1) are examined, and those whose value is 255 are the four-neighbors of f(x, y); each four-neighbor found is then taken as the center and its own four-neighbors are searched, and so on, until no new four-neighbor can be found; all pixel points found in this way form one connected domain L;
4.2) assuming that T1 is a threshold value for screening the connected domain region, when the number of pixel points included in the connected domain L is less than the threshold value, the connected domain is determined to be a non-straw region, otherwise, the connected domain is a straw region.
6. The image processing method for automatically detecting the residual straws in the grains according to claim 1 or claim 2, wherein in the step five, the method for screening the straws according to the length of the rectangular diagonal line as the judgment condition comprises the following steps:
traversing the connected domain, and finding, among its pixel points, the maximum x value xmax, the minimum x value xmin, the maximum y value ymax and the minimum y value ymin; the points (xmin, ymin) and (xmax, ymax) are taken as the two endpoints of a diagonal, which determines the minimum external rectangle, and the whole connected domain is represented by this external rectangle; the length l of the diagonal of the rectangle is:
l = sqrt((xmax − xmin)^2 + (ymax − ymin)^2)
Let T2 be a diagonal length threshold: when l ≤ T2, the connected domain is considered a non-straw target and the values of its pixels are set to 0; otherwise it is considered a straw target.
7. The image processing method for automatically detecting the residual straws in the grains according to claim 6, wherein in the fifth step, after the straw targets are screened according to the judgment condition of the length of the diagonal line of the rectangle, the grain targets are secondarily screened by taking the symmetry of the connected domain as the judgment condition.
8. The image processing method for automatically detecting the residual straws in the grains according to claim 7, wherein the secondary screening method comprises the following steps:
the symmetry of the connected domain is taken as the judgment condition: the horizontal coordinate of the center of the rectangle, xc = (xmin + xmax)/2, is taken as the boundary, and each pixel point (x, y) in the connected domain is assigned to a first region or a second region according to whether its x value is smaller or larger than xc; the numbers of pixel points in the first region, sum1, and in the second region, sum2, are counted, and a proportionality coefficient r is defined from sum1 and sum2 as:
(equation image defining the proportionality coefficient r in terms of sum1 and sum2)
with r as the judgment condition, let T3 be a symmetry threshold: if r ≥ T3, the connected domain is judged not to satisfy the symmetry condition and is a non-straw target, and the gray values of all pixels in it are set to 0; otherwise it is considered a straw target.
9. The image processing method for automatically detecting the residual straws in the grains according to claim 1 or claim 2, wherein the method for drawing the straw outline in the sixth step is as follows:
6.1) the image is traversed from left to right and from top to bottom; let A be the first pixel point whose value is 255, which is an outer contour point of a connected domain; if A has not been marked by any other contour mark, A is given a new mark and taken as the starting point for executing 6.2);
6.2) the eight positions around the point A are numbered, the point to its right being 0 and the numbers increasing by 1 clockwise; when searching for the next contour point, the search proceeds clockwise starting from position 7 until a pixel point with gray value 255 is met, and the search then advances one step in that direction; this is repeated until no point with pixel value 255 can be found, and finally the straw contour found in this way is drawn at the corresponding position of the original image.
CN201710645656.2A 2017-08-01 2017-08-01 Image processing method for automatically detecting residual straws in grains Active CN107633503B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710645656.2A CN107633503B (en) 2017-08-01 2017-08-01 Image processing method for automatically detecting residual straws in grains

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710645656.2A CN107633503B (en) 2017-08-01 2017-08-01 Image processing method for automatically detecting residual straws in grains

Publications (2)

Publication Number Publication Date
CN107633503A CN107633503A (en) 2018-01-26
CN107633503B true CN107633503B (en) 2020-05-15

Family

ID=61099500

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710645656.2A Active CN107633503B (en) 2017-08-01 2017-08-01 Image processing method for automatically detecting residual straws in grains

Country Status (1)

Country Link
CN (1) CN107633503B (en)

Families Citing this family (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109915924B (en) * 2018-05-31 2020-11-24 徐州云创物业服务有限公司 Safety protection type electric heater
CN108876795A (en) * 2018-06-07 2018-11-23 四川斐讯信息技术有限公司 A kind of dividing method and system of objects in images
CN110263205B (en) * 2019-06-06 2023-07-21 温州大学 Retrieval method for ginseng image
CN110296956A (en) * 2019-07-12 2019-10-01 上海交通大学 The method of the content of organic matter in a kind of fermentation of near infrared ray rice straw
CN110782440B (en) * 2019-10-22 2023-06-16 华中农业大学 Crop seed character measuring method
CN111626304B (en) * 2020-05-20 2023-08-04 中国科学院新疆理化技术研究所 Color feature extraction method based on machine vision and application thereof
CN111968148B (en) * 2020-07-20 2023-08-22 华南理工大学 Image processing-based no-load rate calculation method
CN112085725B (en) * 2020-09-16 2021-08-27 塔里木大学 Residual film residual quantity detection method and early warning system based on heuristic iterative algorithm
CN112258534B (en) * 2020-10-26 2022-09-16 大连理工大学 Method for positioning and segmenting small brain earthworm parts in ultrasonic image
CN112668565B (en) * 2020-12-10 2023-02-14 中国科学院西安光学精密机械研究所 Circular target interpretation method aiming at shielding deformation
CN112950535B (en) * 2021-01-22 2024-03-22 北京达佳互联信息技术有限公司 Video processing method, device, electronic equipment and storage medium
CN113139952B (en) * 2021-05-08 2024-04-09 佳都科技集团股份有限公司 Image processing method and device
CN114266748A (en) * 2021-12-22 2022-04-01 四川艾德瑞电气有限公司 Method and device for judging integrity of surface of process plate in rail transit maintenance field
CN114988567A (en) * 2022-07-15 2022-09-02 南通仁源节能环保科技有限公司 Sewage treatment method and system based on activated sludge foam
CN116758081B (en) * 2023-08-18 2023-11-17 安徽乾劲企业管理有限公司 Unmanned aerial vehicle road and bridge inspection image processing method

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104867159A (en) * 2015-06-05 2015-08-26 北京大恒图像视觉有限公司 Stain detection and classification method and device for sensor of digital camera

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB0717986D0 (en) * 2007-09-14 2007-10-24 Cnh Belgium Nv A method and apparatus for detecting errors in electronically processed images

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104867159A (en) * 2015-06-05 2015-08-26 北京大恒图像视觉有限公司 Stain detection and classification method and device for sensor of digital camera

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
Image detection method for straw coverage based on Sauvola and Otsu algorithms; Wang Lili et al.; Agricultural Engineering; 2017-07-31; Vol. 7, No. 4; pp. 29-35 *
Straw coverage detection system based on an automatic threshold segmentation algorithm; Su Yanbo et al.; Journal of Agricultural Mechanization Research; 2012-08-31; No. 8; pp. 138-142 *
Image processing recognition of straw coverage and its influence on seedling emergence rate; Zhao Li; China Master's Theses Full-text Database, Agricultural Science and Technology; 2005-10-15; Vol. 2005, No. 6; p. D045-2 *

Also Published As

Publication number Publication date
CN107633503A (en) 2018-01-26

Similar Documents

Publication Publication Date Title
CN107633503B (en) Image processing method for automatically detecting residual straws in grains
CN108830832B (en) Plastic barrel surface defect detection method based on machine vision
CN111260616A (en) Insulator crack detection method based on Canny operator two-dimensional threshold segmentation optimization
CN108022233A (en) A kind of edge of work extracting method based on modified Canny operators
US20050175253A1 (en) Method for producing cloud free and cloud-shadow free images
CN108319973A (en) Citrusfruit detection method on a kind of tree
CN111915704A (en) Apple hierarchical identification method based on deep learning
EP1987475A2 (en) Automatic detection and correction of non-red eye flash defects
CN109978848B (en) Method for detecting hard exudation in fundus image based on multi-light-source color constancy model
CN109544583B (en) Method, device and equipment for extracting interested area of leather image
CN106875412B (en) Segmentation positioning method for two overlapped fruits
CN108898132B (en) Terahertz image dangerous article identification method based on shape context description
Sanghadiya et al. Surface defect detection in a tile using digital image processing: Analysis and evaluation
Khordehchi et al. Automatic lung nodule detection based on statistical region merging and support vector machines
JP2022551366A (en) Method, computer program product and computer readable medium for generating masks for camera streams
Arunachalam et al. Identification of defects in fruits using digital image processing
CN115131359A (en) Method for detecting pitting defects on surface of metal workpiece
CN107038690B (en) Moving shadow removing method based on multi-feature fusion
CN115272838A (en) Information fusion technology-based marine plankton automatic identification method and system
Meng et al. Size characterisation of edible bird nest impurities: a preliminary study
CN113971681A (en) Edge detection method for belt conveyor in complex environment
CN110956200A (en) Tire pattern similarity detection method
CN109961012A (en) A kind of underwater target tracking recognition methods
Hoshyar et al. Pre-processing of automatic skin cancer detection system: Comparative study
CN113888503A (en) Product appearance detection method and device and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant