CN107633503A - Image processing method for automatically detecting residual stalks in grain - Google Patents
Image processing method for automatically detecting residual stalks in grain

- Publication number: CN107633503A (application CN201710645656.2A)
- Authority: CN (China)
- Legal status: Granted
Abstract
The invention discloses an image processing method for automatically detecting residual stalks in grain. First, an original image is acquired. The original image is blurred with an adaptive mean filter, and the blurred image is taken as the background image; subtracting this background image from the original image yields an edge-enhanced image. The edge-enhanced image is converted to grayscale and adaptively equalized, then binarized by maximum between-class variance thresholding, obtaining a binary image; an opening operation is applied to the binary image to remove noise. Connected components of the binary image are labeled, and stalk targets are preliminarily screened by comparing the number of pixels in each connected component. The minimum enclosing rectangle of each component is then computed, and stalk targets are screened using the rectangle diagonal length as the decision criterion. Finally, the stalk contours are drawn on the original image. The invention reduces the influence of stacking and illumination on the detection result, and avoids the prior-art difficulty of distinguishing similarly colored targets with near-infrared methods.
Description
Technical field
The present invention relates to image processing techniques, and more particularly to an image processing method for fast, automatic detection of stalks remaining in grain after threshing.
Background technology
At present, agricultural machinery must thresh the rice as it is harvested. During threshing, the heaped grain contains many crushed stalks, so repeated sieving is needed to remove the stalks and keep the number of crushed stalks within an allowed range. Traditionally, whether the grain has been sieved clean can only be judged by eye before deciding whether another pass of sieving is needed; this method is not only inefficient but also subject to human error, and the workplace involves unsafe factors such as high temperature and noise. Applying image processing to the automatic detection of residual stalks in grain not only eliminates the subjective error of manual inspection and improves detection consistency, but also improves efficiency and precision, protects personnel safety, and has the further advantage of being non-contact.
To detect crushed stalk targets quickly and automatically with image processing, a camera first acquires an original image of the threshed grain mixed with crushed stalks; the stalk targets present in the image are then detected by image processing, and the stalk count determines whether another pass of sieving is needed. Three main problems must be solved. First, grain stacks heavily on grain, grain on stalks, and stalks on stalks, so the individual targets must be segmented apart as far as possible. Second, some stalks are close to grain in both color and size, so these stalk targets must be distinguished from the grain. Third, image processing is affected by illumination: uneven illumination introduces errors into the processing result.
Current fast automatic detection methods for stalk targets based on image processing fall into two kinds: (1) direct contour detection, which screens stalk targets by contour features; and (2) screening by the differing colors of stalks and grain under near-infrared imaging. Method (1) requires that stalks and grain do not stack in the image, and performs poorly when they do. Method (2) requires a near-infrared camera, which is costly and unsuitable for commercialization.
Summary of the invention
Object of the invention: in order to solve the problems of the prior art, reduce the influence of stacking and uneven illumination, and improve the accuracy of detecting stalks in grain, the present invention provides an image processing method for automatically detecting residual stalks in grain.
Technical scheme: the present invention provides an image processing method for automatically detecting residual stalks in grain, comprising the following steps:
Step 1: A camera acquires an original image, and a computer reads in the original image;
Step 2: Apply an adaptive mean filter to the original image to blur it, and take the blurred image as the background image; then subtract the background image from the original image to obtain an edge-enhanced image in which the edges of foreground objects are strengthened;
Step 3: Convert the edge-enhanced image obtained in step 2 to grayscale and apply adaptive grayscale equalization, then apply maximum between-class variance binarization to separate grain, stalks, and background, obtaining a binary image; apply an opening operation to the binary image to remove noise and reduce adhesion between grains and between grain and stalks;
Step 4: Label the connected components of the binary image obtained in step 3, and preliminarily screen stalk targets by comparing the number of pixels in each connected component;
Step 5: Compute the minimum enclosing rectangle of each connected component, and screen stalk targets using the rectangle diagonal length as the decision criterion;
Step 6: According to the results of step 5, draw the stalk contours on the original image.
Further, in step 3, the image is divided into N*N image blocks, where N is a positive integer greater than 1, and maximum between-class variance binarization is then applied to each image block.
Further, step 2 specifically includes:
2.1) Let the original image be of size W × H, and define the mean-filter kernel s to be of size (min(W, H)/20) × (min(W, H)/20). For a pixel f(x, y) of the original image, the mean-filtered value g(x, y) at that point is:

g(x, y) = (1/M) Σ_(x,y)∈s f(x, y)

M = (min(W, H)/20) × (min(W, H)/20)

2.2) Take the image whose pixel values are the resulting g(x, y) as the background image, and subtract it from the original image to obtain the edge-enhanced image in which foreground object edges are strengthened. Letting h(x, y) be the pixel value of each pixel of the edge-enhanced image:

h(x, y) = f(x, y) - g(x, y).
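As an illustration of step 2, the mean-filter background subtraction can be sketched in Python with NumPy. This is a minimal sketch under stated assumptions: a single-channel image, a kernel side of min(W, H)/20 as in the text, and edge padding at the borders; the function name `edge_enhance` and the summed-area-table implementation are the editor's choices, not the patent's.

```python
import numpy as np

def edge_enhance(img: np.ndarray) -> np.ndarray:
    """Blur with a large mean filter (the background g), then subtract it."""
    h, w = img.shape
    k = max(1, min(w, h) // 20)              # kernel side length from the text
    pad = k // 2
    padded = np.pad(img.astype(np.float64), pad, mode="edge")
    # Box filter via a summed-area table: O(1) per pixel regardless of k.
    sat = np.cumsum(np.cumsum(padded, axis=0), axis=1)
    sat = np.pad(sat, ((1, 0), (1, 0)))
    bg = (sat[k:, k:] - sat[:-k, k:] - sat[k:, :-k] + sat[:-k, :-k]) / (k * k)
    return img.astype(np.float64) - bg[:h, :w]   # h(x, y) = f(x, y) - g(x, y)
```

On a uniform image the background equals the image and the enhancement is zero everywhere; real edges survive as dark or bright borders.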
Further, step 3 specifically includes:
3.1) Let R(x, y), G(x, y), B(x, y) be the pixel values of the R, G, B layers of the edge-enhanced image. Convert to a grayscale image whose pixel gray values gray(x, y) are:

gray(x, y) = 0.299·R(x, y) + 0.587·G(x, y) + 0.114·B(x, y)

3.2) Apply grayscale equalization to the grayscale image: find the gray maximum Gmax and minimum Gmin, and let gray'(x, y) be the gray value after equalization:

gray'(x, y) = (gray(x, y) - Gmin) / (Gmax - Gmin) × 255

3.3) Let T be the segmentation threshold between foreground and background, let the target pixels be a fraction w0 of the image with mean gray u0, the background pixels a fraction w1 with mean gray u1, and the variance g. Then:

u = w0·u0 + w1·u1

g = w0(u0 - u)² + w1(u1 - u)² = w0·w1(u1 - u0)²

The threshold that maximizes g is the maximum between-class variance segmentation threshold Tgmax; the input image is then segmented as:

b(x, y) = 255 if gray'(x, y) > Tgmax, and 0 otherwise,

where b(x, y) is the pixel value of the output binary image at point (x, y);
3.4) Let R1ker be the radius of the opening-operation kernel, which is a circular kernel B1ker; apply to the binary image A an opening operation ∘ with radius R1ker, calculated as:

A ∘ B1ker = (A ⊖ B1ker) ⊕ B1ker.
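Steps 3.1) and 3.2) can be sketched as follows. This is a minimal sketch assuming NumPy arrays, using the standard luma weight 0.114 for blue (the 0.144 in the source reads as a typo) and subtracting Gmin in the numerator so that both extrema found in step 3.2) are actually used; `to_gray` and `stretch` are illustrative names.

```python
import numpy as np

def to_gray(rgb: np.ndarray) -> np.ndarray:
    """Step 3.1: weighted RGB-to-gray conversion (ITU-R BT.601 luma weights)."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    return 0.299 * r + 0.587 * g + 0.114 * b

def stretch(gray: np.ndarray) -> np.ndarray:
    """Step 3.2: stretch the gray range [Gmin, Gmax] onto [0, 255]."""
    gmin, gmax = float(gray.min()), float(gray.max())
    if gmax == gmin:                       # flat image: nothing to stretch
        return np.zeros_like(gray, dtype=np.float64)
    return (gray - gmin) / (gmax - gmin) * 255.0
```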
Further, step 4 specifically includes:
4.1) Find connected components using 4-connectivity. In the binary image every pixel gray value is either 255 or 0. For a point b(x, y) whose gray value is 255, check whether the pixel values of the four pixels b(x-1, y), b(x+1, y), b(x, y-1) and b(x, y+1) are 255; those that are 255 are 4-neighbors of b(x, y). Then take each 4-neighbor in turn as a center and find its 4-neighbors, and so on, until no point has an unvisited 4-neighbor; all the pixels found in this way form one connected component L;
4.2) Let T1 be the threshold for screening connected-component regions. When the number of pixels contained in a connected component L is less than the threshold, the component is judged to be a non-stalk region; otherwise it is a stalk region.
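The 4-connectivity search of steps 4.1) and 4.2) can be sketched as a breadth-first flood fill. The 0/255 pixel convention follows the text; `label_regions` and `screen_by_size` are illustrative names, not the patent's.

```python
from collections import deque
import numpy as np

def label_regions(binary: np.ndarray):
    """Return the connected components, each as a list of (x, y) pixels."""
    h, w = binary.shape
    seen = np.zeros((h, w), dtype=bool)
    regions = []
    for sx in range(h):
        for sy in range(w):
            if binary[sx, sy] != 255 or seen[sx, sy]:
                continue
            queue, region = deque([(sx, sy)]), []
            seen[sx, sy] = True
            while queue:                    # breadth-first flood fill
                x, y = queue.popleft()
                region.append((x, y))
                for nx, ny in ((x - 1, y), (x + 1, y), (x, y - 1), (x, y + 1)):
                    if 0 <= nx < h and 0 <= ny < w \
                            and binary[nx, ny] == 255 and not seen[nx, ny]:
                        seen[nx, ny] = True
                        queue.append((nx, ny))
            regions.append(region)
    return regions

def screen_by_size(binary: np.ndarray, regions, t1: int) -> np.ndarray:
    """Step 4.2: zero out components with fewer than t1 pixels."""
    for region in regions:
        if len(region) < t1:
            for x, y in region:
                binary[x, y] = 0
    return binary
```

Note that 4-connectivity deliberately does not merge diagonally touching pixels, which helps keep adhering targets apart.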
Further, in step 5, the method of screening stalk targets with the rectangle diagonal length as the decision criterion is:
Traverse the connected component and find, among its pixels, the point with the maximum x value (xmax, y), the point with the minimum x value (xmin, y), the point with the maximum y value (x, ymax), and the point with the minimum y value (x, ymin); the two points (xmin, ymin) and (xmax, ymax) then determine the minimum enclosing rectangle, which represents the whole connected component. The rectangle diagonal length l is:

l = sqrt((xmin - xmax)² + (ymin - ymax)²)

Let T2 be the diagonal-length threshold. When l ≤ T2, the connected component is considered a non-stalk target and the values of the pixels in the component are set to 0; otherwise it is considered a stalk target.
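The diagonal screening can be sketched as below, assuming a component is given as a list of (x, y) pixels; note the two squared differences are added under the square root (the Euclidean length of the box diagonal). Function names are illustrative.

```python
def bounding_diagonal(region) -> float:
    """Diagonal length of the axis-aligned bounding box of a pixel list."""
    xs = [p[0] for p in region]
    ys = [p[1] for p in region]
    return ((max(xs) - min(xs)) ** 2 + (max(ys) - min(ys)) ** 2) ** 0.5

def screen_by_diagonal(region, t2: float) -> bool:
    """Keep a component as a stalk candidate only if its diagonal exceeds t2."""
    return bounding_diagonal(region) > t2
```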
Further, in step 5, after screening stalk targets by the rectangle diagonal length as the decision criterion, a secondary screening of grain targets is performed with the symmetry of the connected component as the decision condition.
Further, the method of secondary screening is:
Taking symmetry as the decision condition, use the rectangle center abscissa (xmin + xmax)/2 as the dividing line. Each pixel (x, y) of the connected component is assigned to the first or second region according to whether its x value is smaller or larger than the rectangle center abscissa; count the numbers of pixels sum1 in the first region and sum2 in the second region, and let the proportionality coefficient r be:

r = max(sum1, sum2) / min(sum1, sum2)

With r as the decision criterion, let T3 be the symmetry threshold. If r ≥ T3, the connected component is judged not to satisfy the symmetry condition and is a non-stalk target, and all pixel gray values in the component are set to 0; otherwise it is considered a stalk target.
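The symmetry ratio can be sketched as below. One assumption is made that the text leaves open: pixels lying exactly on the center column are counted in neither half; the function names are illustrative.

```python
def symmetry_ratio(region) -> float:
    """r = max(sum1, sum2) / min(sum1, sum2), split at the box center column."""
    xs = [p[0] for p in region]
    x_mid = (min(xs) + max(xs)) / 2.0
    sum1 = sum(1 for x, _ in region if x < x_mid)
    sum2 = sum(1 for x, _ in region if x > x_mid)
    if min(sum1, sum2) == 0:
        return float("inf")          # fully one-sided: maximally asymmetric
    return max(sum1, sum2) / min(sum1, sum2)

def screen_by_symmetry(region, t3: float) -> bool:
    """A component that is too asymmetric (r >= t3) is rejected as non-stalk."""
    return symmetry_ratio(region) < t3
```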
Further, the method for stalk profile being drawn in the step 6 is:
6.1) from left to right, traversing graph picture from top to bottom, it is assumed that A is the pixel that first value run into is 255, then should
Point is the outline point of a connected domain, if A points are not recorded a demerit by other delineators, gives A mono- new label, and with this
Individual point is 6.2) starting point performs;
6.2) with eight positions around pixel A points, make a mark, be designated as 0 with its right point, add 1 successively clockwise,
When searching profile point, begun look in the direction of the clock since 7 positions, until running into the pixel that gray value is 255, so
A step is searched for by this direction again afterwards, constantly circulation is until can not find the point that pixel value is 255, finally in original image corresponding positions
Put the stalk profile for drawing and finding.
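Step 6 describes a clockwise chain-code trace. A standard Moore-neighbor boundary follow in the same spirit is sketched below; the start offset differs from the "position 7" convention of the text, and `trace_contour` is an illustrative name, not the patent's implementation.

```python
import numpy as np

def trace_contour(binary: np.ndarray):
    """Clockwise boundary of the first foreground component met in raster order."""
    h, w = binary.shape
    start = next(((r, c) for r in range(h) for c in range(w)
                  if binary[r, c] == 255), None)
    if start is None:
        return []
    # 8 neighbors in clockwise order starting east (rows grow downward).
    dirs = [(0, 1), (1, 1), (1, 0), (1, -1),
            (0, -1), (-1, -1), (-1, 0), (-1, 1)]
    contour, cur, d = [start], start, 6     # first sweep starts looking north
    while True:
        for i in range(8):                  # sweep clockwise from backtrack dir
            k = (d + i) % 8
            nr, nc = cur[0] + dirs[k][0], cur[1] + dirs[k][1]
            if 0 <= nr < h and 0 <= nc < w and binary[nr, nc] == 255:
                cur = (nr, nc)
                d = (k + 6) % 8             # back up two steps for next sweep
                break
        else:                               # isolated pixel: contour is start
            break
        if cur == start:
            break
        contour.append(cur)
    return contour
```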
The image processing method for automatically detecting residual stalks in grain provided by the invention has the following effects:
(1) Through the image preprocessing step, the invention extracts the edges of each object well, reducing the influence of stacking and avoiding the problem that stacking degrades the detection results of direct contour detection;
(2) The invention first divides the image into blocks and then binarizes each block separately by maximum between-class variance (OTSU), automatically separating stalks, grain targets and background; this makes the threshold adaptive and enhances robustness to illumination changes, fixture position and the like, and block-wise processing reduces the influence of uneven illumination across the whole image on detection;
(3) The invention finds connected components and operates on them at the pixel level, which is accurate and convenient to operate; for stalks of different size specifications the parameters can easily be adjusted to screen well;
(4) The invention is not based on color space, so yellowed stalks are not confused with grain targets, avoiding the prior-art difficulty of distinguishing similarly colored targets with near-infrared methods.
Brief description of the drawings
Fig. 1 is the overall flow chart of the image processing method for automatically detecting residual stalks in grain of the present invention;
Fig. 2 is the object edge enhancement flow chart of the present invention;
Fig. 3 is the flow chart of segmenting out independent foreground grain and stalk targets in the present invention;
Fig. 4 is the flow chart of screening out and marking stalk targets in the present invention.
Embodiments
The invention will be further described below with reference to the accompanying drawings and specific embodiments.
Referring to Figs. 1 to 4, the image processing method for automatically detecting residual stalks in grain comprises the following steps:
Step 1: A camera acquires an original image, and a computer reads in the original image.
Step 2: As shown in Fig. 2, apply an adaptive mean filter to the original image to blur it, and take the blurred image as the background image; subtract the background image from the original image to obtain the edge-enhanced image in which foreground object edges are strengthened. Through the image preprocessing of this step, the edges of each object can be extracted well, the influence of stacking is reduced, and the degradation of detection results by stacking in direct contour detection is avoided.
2.1) Let the original image be of size W × H, and define the mean-filter kernel s to be of size (min(W, H)/20) × (min(W, H)/20). For a pixel f(x, y) of the original image, the mean-filtered value g(x, y) at that point is:

g(x, y) = (1/M) Σ_(x,y)∈s f(x, y)   (1)

M = (min(W, H)/20) × (min(W, H)/20)   (2)

The specific calculation formula, written out over the kernel window, is:

g(x, y) = (1/M) Σ_i Σ_j f(x + i, y + j), where (i, j) ranges over the window s centered at (x, y)   (3)

2.2) Take the image whose pixel values are the resulting g(x, y) as the background image, and subtract it from the original image to obtain the edge-enhanced image in which foreground object edges are strengthened. Letting h(x, y) be the pixel value of each pixel of the edge-enhanced image:

h(x, y) = f(x, y) - g(x, y)   (4)

After this processing the edges of the targets carry a dark border, which reduces the influence of stacking between stalks and grain on the later processing.
Step 3: As shown in Fig. 3, convert the edge-enhanced image obtained in step 2 to grayscale and stretch its gray range by adaptive grayscale equalization. Divide the image into N*N image blocks, where N is a positive integer greater than 1, then apply maximum between-class variance (OTSU) binarization to each block to separate grain and stalks from the background, obtaining a binary image; apply an opening operation to the binary image to remove noise and reduce adhesion between grains and between grain and stalks. Binarizing the image block by block avoids the influence of uneven illumination intensity on the processing result; the opening operation removes noise, reduces adhesion between grain and grain and between grain and stalks, and reduces the influence of stacking.
3.1) Let R(x, y), G(x, y), B(x, y) be the pixel values of the R, G, B layers of the edge-enhanced image. Convert to a grayscale image whose pixel gray values gray(x, y) are:

gray(x, y) = 0.299·R(x, y) + 0.587·G(x, y) + 0.114·B(x, y)   (5)

3.2) Apply grayscale equalization to the grayscale image: find the gray maximum Gmax and minimum Gmin, and let gray'(x, y) be the gray value after equalization:

gray'(x, y) = (gray(x, y) - Gmin) / (Gmax - Gmin) × 255   (6)

3.3) Let T be the segmentation threshold between foreground and background, let the target pixels be a fraction w0 of the image with mean gray u0, the background pixels a fraction w1 with mean gray u1, and the variance g. Then:

u = w0·u0 + w1·u1   (7)

g = w0(u0 - u)² + w1(u1 - u)² = w0·w1(u1 - u0)²   (8)

The threshold that maximizes g is the maximum between-class variance (OTSU) segmentation threshold Tgmax; the input image is then segmented as:

b(x, y) = 255 if gray'(x, y) > Tgmax, and 0 otherwise,

where b(x, y) is the pixel value of the output binary image at point (x, y);
3.4) Let R1ker be the radius of the opening-operation kernel, which is a circular kernel B1ker; apply to the binary image A an opening operation ∘ with radius R1ker, calculated as:

A ∘ B1ker = (A ⊖ B1ker) ⊕ B1ker.
In step 3.3) the segmentation threshold is obtained automatically with OTSU and the gray image is binarized; the implementation is as follows:
(a) Compute the normalized histogram of the input image, and let pi, i = 1, 2, ..., L-1 denote its components;
(b) For k = 1, 2, ..., L-1, compute the cumulative sum P1(k) = Σ_{i=1}^{k} pi;
(c) For k = 1, 2, ..., L-1, compute the cumulative mean m(k) = Σ_{i=1}^{k} i·pi;
(d) Compute the global gray mean mG = Σ_{i=1}^{L-1} i·pi;
(e) For k = 1, 2, ..., L-1, compute the between-class variance

σB²(k) = (mG·P1(k) - m(k))² / (P1(k)·(1 - P1(k)))

The value of k that maximizes σB²(k) is the OTSU threshold k*; this threshold k* = Tgmax is the OTSU segmentation threshold.
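Steps (a) to (e) can be sketched with NumPy, together with the block-wise application described for step 3. A minimal sketch: `otsu_threshold` and `binarize_blockwise` are illustrative names, and σB² follows the standard histogram form given above.

```python
import numpy as np

def otsu_threshold(gray: np.ndarray) -> int:
    """Steps (a)-(e): maximize the between-class variance over the histogram."""
    hist, _ = np.histogram(gray, bins=256, range=(0, 256))
    p = hist / hist.sum()                     # (a) normalized histogram p_i
    p1 = np.cumsum(p)                         # (b) cumulative sums P1(k)
    m = np.cumsum(np.arange(256) * p)         # (c) cumulative means m(k)
    mg = m[-1]                                # (d) global mean m_G
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma_b2 = (mg * p1 - m) ** 2 / (p1 * (1.0 - p1))   # (e)
    sigma_b2 = np.nan_to_num(sigma_b2)        # degenerate ends contribute 0
    return int(np.argmax(sigma_b2))           # k* = T_gmax

def binarize_blockwise(gray: np.ndarray, n: int) -> np.ndarray:
    """Step 3: split into n*n blocks and threshold each block independently."""
    h, w = gray.shape
    out = np.zeros_like(gray)
    for bi in range(n):
        for bj in range(n):
            r0, r1 = bi * h // n, (bi + 1) * h // n
            c0, c1 = bj * w // n, (bj + 1) * w // n
            block = gray[r0:r1, c0:c1]
            t = otsu_threshold(block)
            out[r0:r1, c0:c1] = np.where(block > t, 255, 0)
    return out
```

One practical caveat: a block that is entirely background has no bimodal histogram, so the per-block threshold is only meaningful when each block contains both classes.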
In step 3.4) the opening operation removes noise and reduces the adhesion between stalks and grain; the implementation is as follows:
(m) Erosion: for sets A and B in Z², the erosion of A by B, written A ⊖ B, is defined as:

A ⊖ B = {z | (B)z ⊆ A}

This formula states that the result of eroding A by B is the set of all points z such that B, translated by z, is still contained in A; in other words, the set obtained by eroding A with B is the set of positions of the origin of B at which B is entirely contained in A;
(n) Let R1ker be the radius of the opening-operation kernel; the kernel used is a circular kernel, denoted B1ker. The opening operation is then defined as:

A ∘ B1ker = (A ⊖ B1ker) ⊕ B1ker

The effect of the opening operation is to remove isolated noise and thin protrusions in the image.
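The erosion definition (m) and the opening (n) can be sketched directly from the set formulas. This is a naive loop for clarity rather than speed, with a circular kernel `disk(r)` standing in for B1ker and dilation implemented by reflecting the kernel; the function names are the editor's.

```python
import numpy as np

def disk(radius: int) -> np.ndarray:
    """Circular structuring element B1ker of the given radius."""
    y, x = np.ogrid[-radius:radius + 1, -radius:radius + 1]
    return (x * x + y * y <= radius * radius)

def erode(a: np.ndarray, b: np.ndarray) -> np.ndarray:
    """A ⊖ B: keep z only where B translated to z fits entirely inside A."""
    r = b.shape[0] // 2
    padded = np.pad(a, r, mode="constant", constant_values=False)
    offs = np.argwhere(b) - r           # offsets of the kernel's true cells
    out = np.zeros_like(a)
    for i in range(a.shape[0]):
        for j in range(a.shape[1]):
            out[i, j] = all(padded[i + r + di, j + r + dj] for di, dj in offs)
    return out

def dilate(a: np.ndarray, b: np.ndarray) -> np.ndarray:
    """A ⊕ B: mark z wherever the reflected kernel placed at z hits A."""
    r = b.shape[0] // 2
    padded = np.pad(a, r, mode="constant", constant_values=False)
    offs = np.argwhere(b) - r
    out = np.zeros_like(a)
    for i in range(a.shape[0]):
        for j in range(a.shape[1]):
            out[i, j] = any(padded[i + r - di, j + r - dj] for di, dj in offs)
    return out

def opening(a: np.ndarray, b: np.ndarray) -> np.ndarray:
    """A ∘ B = (A ⊖ B) ⊕ B: removes isolated noise and thin protrusions."""
    return dilate(erode(a, b), b)
```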
Step 4: As shown in Fig. 4, the obtained binary image is then screened and marked for stalk targets. Specifically, referring to Fig. 4, first label the connected components of the binary image obtained in step 3, and preliminarily screen stalk targets by comparing the number of pixels in each connected component.
4.1) Find connected components using 4-connectivity. In the binary image every pixel gray value is either 255 or 0. For a point b(x, y) whose gray value is 255, check whether the pixel values of the four pixels b(x-1, y), b(x+1, y), b(x, y-1) and b(x, y+1) are 255; those that are 255 are 4-neighbors of b(x, y). Then take each 4-neighbor in turn as a center and find its 4-neighbors, and so on, until no point has an unvisited 4-neighbor; all the pixels found in this way form one connected component L;
4.2) Let T1 be the threshold for screening connected-component regions. When the number of pixels contained in a connected component L is less than the threshold, the component is judged to be a non-stalk region and its interior pixel gray values are all set to 0; otherwise it is a stalk region.
Step 5: Compute the minimum enclosing rectangle of each connected component; first screen stalk targets using the rectangle diagonal length as the decision criterion, then perform a secondary screening of grain targets with the symmetry of the connected component as the decision condition.
The method of screening stalk targets with the rectangle diagonal length as the decision criterion is:
Traverse the connected component and find, among its pixels f(x, y), the point with the maximum x value (xmax, y), the point with the minimum x value (xmin, y), the point with the minimum y value (x, ymin), and the point with the maximum y value (x, ymax); the points (xmin, ymin) and (xmax, ymax) are two diagonal corners of the rectangle and thus determine the minimum enclosing rectangle, which represents the whole connected component. The rectangle diagonal length l is:

l = sqrt((xmin - xmax)² + (ymin - ymax)²)

Let T2 be the diagonal-length threshold. When l ≤ T2, the connected component is considered a non-stalk target and the values of the pixels in the component are set to 0; otherwise it is considered a stalk target.
The method of performing secondary screening of grain targets with the symmetry of the connected component as the decision condition is:
Taking symmetry as the decision condition, use the rectangle center abscissa (xmin + xmax)/2 as the dividing line. Each pixel (x, y) of the connected component is assigned to the first or second region according to whether its x value is smaller or larger than the rectangle center abscissa; count the numbers of pixels sum1 in the first region and sum2 in the second region, and let the proportionality coefficient r be:

r = max(sum1, sum2) / min(sum1, sum2)

With r as the decision criterion, let T3 be the symmetry threshold. If r ≥ T3, the connected component is judged not to satisfy the symmetry condition and is a non-stalk target, and all pixel gray values in the component are set to 0; otherwise it is considered a stalk target.
Step 6: According to the results of step 5, draw the stalk contours on the original image.
6.1) Traverse the image from left to right and top to bottom, and let A be the first pixel encountered whose value is 255; this point is an outer contour point of a connected component. If point A has not already been recorded by another contour trace, give A a new label, and perform 6.2) with this point as the starting point;
6.2) Mark the eight positions around pixel A: the point to its right is marked 0, and the marks increase by 1 clockwise. When searching for the next contour point, start the clockwise search from position 7 until a pixel with gray value 255 is encountered, then search one step further from that direction, and keep looping until no pixel with value 255 can be found; finally, draw the contour found at the corresponding position of the original image.
The present invention solves the problem of detecting crushed stalks when the stalks and grain are similar in color, and the problem of poor detection results caused by stacking between stalks and between stalks and grain. It not only eliminates the subjective error of manual inspection and improves detection consistency, but also improves efficiency and precision, protects personnel safety, and has the further advantage of being non-contact.
Claims (9)
1. An image processing method for automatically detecting residual stalks in grain, characterized by comprising the following steps:
Step 1: A camera acquires an original image, and a computer reads in the original image;
Step 2: Apply an adaptive mean filter to the original image to blur it, and take the blurred image as the background image; subtract the background image from the original image to obtain an edge-enhanced image in which the edges of foreground objects are strengthened;
Step 3: Convert the edge-enhanced image obtained in step 2 to grayscale and apply adaptive grayscale equalization, then apply maximum between-class variance binarization to separate grain, stalks, and background, obtaining a binary image; apply an opening operation to the binary image to remove noise and reduce adhesion between grains and between grain and stalks;
Step 4: Label the connected components of the binary image obtained in step 3, and preliminarily screen stalk targets by comparing the number of pixels in each connected component;
Step 5: Compute the minimum enclosing rectangle of each connected component, and screen stalk targets using the rectangle diagonal length as the decision criterion;
Step 6: According to the results of step 5, draw the stalk contours on the original image.
2. The image processing method for automatically detecting residual stalks in grain according to claim 1, characterized in that, in step 3, the image is divided into N*N image blocks, where N is a positive integer greater than 1, and maximum between-class variance binarization is then applied to each image block.
3. The image processing method for automatically detecting residual stalks in grain according to claim 1 or 2, characterized in that step 2 specifically includes:
2.1) Let the original image be of size W × H, and define the mean-filter kernel s to be of size (min(W, H)/20) × (min(W, H)/20). For a pixel f(x, y) of the original image, the mean-filtered value g(x, y) at that point is:

g(x, y) = (1/M) Σ_(x,y)∈s f(x, y)

M = (min(W, H)/20) × (min(W, H)/20)

2.2) Take the image whose pixel values are the resulting g(x, y) as the background image, and subtract it from the original image to obtain the edge-enhanced image in which foreground object edges are strengthened. Letting h(x, y) be the pixel value of each pixel of the edge-enhanced image:

h(x, y) = f(x, y) - g(x, y).
4. The image processing method for automatically detecting residual stalks in grain according to claim 1 or 2, characterized in that step 3 specifically includes:
3.1) Let R(x, y), G(x, y), B(x, y) be the pixel values of the R, G, B layers of the edge-enhanced image. Convert to a grayscale image whose pixel gray values gray(x, y) are:

gray(x, y) = 0.299·R(x, y) + 0.587·G(x, y) + 0.114·B(x, y)

3.2) Apply grayscale equalization to the grayscale image: find the gray maximum Gmax and minimum Gmin, and let gray'(x, y) be the gray value after equalization:

gray'(x, y) = (gray(x, y) - Gmin) / (Gmax - Gmin) × 255

3.3) Let T be the segmentation threshold between foreground and background, let the target pixels be a fraction w0 of the image with mean gray u0, the background pixels a fraction w1 with mean gray u1, and the variance g. Then:

u = w0·u0 + w1·u1

g = w0(u0 - u)² + w1(u1 - u)² = w0·w1(u1 - u0)²

The threshold that maximizes g is the maximum between-class variance segmentation threshold Tgmax; the input image is then segmented as follows:

b(x, y) = 255 if gray'(x, y) > Tgmax, and 0 otherwise,

where b(x, y) is the pixel value of the output binary image at point (x, y);
3.4) Let R1ker be the radius of the opening-operation kernel, which is a circular kernel B1ker; apply to the binary image A an opening operation ∘ with radius R1ker, calculated as:

A ∘ B1ker = (A ⊖ B1ker) ⊕ B1ker.
5. The image processing method for automatically detecting residual stalks in grain according to claim 1 or 2, characterized in that step 4 specifically includes:
4.1) Find connected components using 4-connectivity. In the binary image every pixel gray value is either 255 or 0. For a point f(x, y) whose gray value is 255, check whether the pixel values of the four pixels f(x-1, y), f(x+1, y), f(x, y-1) and f(x, y+1) are 255; those that are 255 are 4-neighbors of f(x, y). Then take each 4-neighbor in turn as a center and find its 4-neighbors, and so on, until no point has an unvisited 4-neighbor; all the pixels found in this way form one connected component L;
4.2) Let T1 be the threshold for screening connected-component regions. When the number of pixels contained in a connected component L is less than the threshold, the component is judged to be a non-stalk region; otherwise it is a stalk region.
6. the image processing method of stalk is remained in automatic detection grain according to claim 1 or 2, it is characterised in that
It is that Rule of judgment screening stalk mesh calibration method is according to rectangle catercorner length in the step 5:
Connected domain is traveled through, is found in the connected domain in pixel f (x, y), the maximum point (x of x valuesmax, y), the minimum point of x values
(xmin, y), maximum point (x, the y of y valuesmin), y values minimum (x, ymax) and (xmin,ymin)、f(xmax,ymax), with (xmin,
ymin)、f(xmax,ymax) 2 points be 2 points of rectangle diagonal, so that it is determined that external minimum rectangle, represents whole with boundary rectangle
Connected domain, rectangle catercorner length l are:
l = √((xmax − xmin)² + (ymax − ymin)²)
Let T2 be the diagonal-length threshold. When l ≤ T2, the connected domain is regarded as a non-stalk target and the pixel values within it are set to 0; otherwise it is regarded as a stalk target.
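The diagonal test of claim 6 can be sketched as follows (the helper name `diagonal_screen` and the list-of-pixels domain representation are assumptions for illustration):

```python
import math

def diagonal_screen(domain, t2):
    """Given a connected domain as a list of (x, y) pixel coordinates,
    compute the bounding-rectangle diagonal length l and keep the domain
    only if l exceeds the threshold T2 (claim 6 sketch)."""
    xs = [p[0] for p in domain]
    ys = [p[1] for p in domain]
    # (xmin, ymin) and (xmax, ymax) are the two diagonal corner points
    # of the minimum enclosing rectangle.
    l = math.hypot(max(xs) - min(xs), max(ys) - min(ys))
    return l > t2            # True: stalk target; False: discard (l <= T2)
```

The rationale is that a stalk fragment is elongated, so its bounding rectangle has a long diagonal, while residual grain-sized blobs do not.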
7. The image processing method for automatically detecting residual stalk in grain according to claim 1 or 2, characterised in that in step 5, after the stalk targets have been screened with the rectangle diagonal length as the judgment condition, secondary screening of the stalk targets is then performed using the symmetry of the connected domain as the judgment condition.
8. The image processing method for automatically detecting residual stalk in grain according to claim 7, characterised in that the method of secondary screening is:
Using symmetry as the judgment condition, take the rectangle-centre abscissa (xmin + xmax)/2 as the dividing line; for each pixel (x, y) in the connected domain, assign it to the first region or the second region according to whether its x value is smaller or larger than the rectangle-centre abscissa, and count the number of pixels sum1 in the first region and sum2 in the second region respectively. With r as the proportionality coefficient, r is:
r = max(sum1, sum2) / min(sum1, sum2)
Taking r as the judgment condition, let T3 be the symmetry threshold. If r ≥ T3, the connected domain is judged not to satisfy the symmetry condition and is a non-stalk target, and the gray values of all pixels in the connected domain are likewise set to 0; otherwise it is regarded as a stalk target.
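The symmetry test of claim 8 can be sketched as below. The helper name `symmetry_screen` is hypothetical, and the tie-breaking choice of assigning pixels lying exactly on the centre line to the second region is an assumption the patent does not specify:

```python
def symmetry_screen(domain, x_min, x_max, t3):
    """Secondary screening by left/right pixel balance about the
    rectangle-centre abscissa (xmin + xmax)/2, per claim 8.
    Returns True if the domain passes as a stalk target."""
    centre = (x_min + x_max) / 2.0
    sum1 = sum(1 for x, _ in domain if x < centre)   # first region
    sum2 = len(domain) - sum1                        # second region
    if min(sum1, sum2) == 0:                         # wholly one-sided:
        return False                                 # treat as asymmetric
    r = max(sum1, sum2) / min(sum1, sum2)
    return r < t3        # r >= T3 means asymmetric, i.e. non-stalk target
```

A straight stalk lying across the rectangle splits its pixels roughly evenly, giving r near 1, whereas lopsided clutter pushes r above the threshold.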
9. The image processing method for automatically detecting residual stalk in grain according to claim 1 or 2, characterised in that the method of drawing the stalk contour in step 6 is:
6.1) Traverse the image from left to right and from top to bottom, and let A be the first pixel encountered whose value is 255; this point is then a contour point of a connected domain. If point A has not been recorded by any previously traced contour, assign A a new label and execute 6.2) with this point as the starting point;
6.2) Mark the eight positions around pixel A: its right-hand neighbour is marked 0 and the marks increase by 1 clockwise. When searching for contour points, search clockwise starting from position 7 until a pixel with gray value 255 is met, then continue the search one step further from that direction; repeat this loop until no point with pixel value 255 can be found at the corresponding position in the original image, and finally draw the stalk contour that has been found.
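Step 6.2) is a clockwise eight-neighbour border-following search. A minimal sketch using the standard Moore-neighbour idea is shown below; the function name, the exact offset at which the clockwise search resumes, and the simple return-to-start stopping rule are illustrative assumptions rather than the patent's exact scheme:

```python
def trace_contour(binary):
    """Trace the outer contour of the first 255-valued object met in a
    raster scan, searching the 8 neighbours clockwise (step 6.2 sketch).
    Returns the contour as a list of (x, y) points."""
    h, w = len(binary), len(binary[0])
    # Eight neighbours listed clockwise, position 0 being the point's right.
    nbrs = [(1, 0), (1, 1), (0, 1), (-1, 1),
            (-1, 0), (-1, -1), (0, -1), (1, -1)]
    start = next(((x, y) for y in range(h) for x in range(w)
                  if binary[y][x] == 255), None)
    if start is None:
        return []
    contour, cur, d = [start], start, 0
    while True:
        # Resume the clockwise search just behind the arrival direction d.
        for i in range(8):
            k = (d + 5 + i) % 8
            nx, ny = cur[0] + nbrs[k][0], cur[1] + nbrs[k][1]
            if 0 <= nx < w and 0 <= ny < h and binary[ny][nx] == 255:
                cur, d = (nx, ny), k
                break
        else:
            break                    # isolated pixel: no neighbour found
        if cur == start:
            break                    # returned to the starting point
        contour.append(cur)
    return contour
```

Production code would more likely rely on a library routine such as OpenCV's `cv2.findContours`, which implements a robust border-following algorithm with proper labelling of multiple and nested contours.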
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710645656.2A CN107633503B (en) | 2017-08-01 | 2017-08-01 | Image processing method for automatically detecting residual straws in grains |
Publications (2)
Publication Number | Publication Date |
---|---|
CN107633503A true CN107633503A (en) | 2018-01-26 |
CN107633503B CN107633503B (en) | 2020-05-15 |
Family
ID=61099500
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201710645656.2A Active CN107633503B (en) | 2017-08-01 | 2017-08-01 | Image processing method for automatically detecting residual straws in grains |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN107633503B (en) |
Cited By (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108876795A (en) * | 2018-06-07 | 2018-11-23 | 四川斐讯信息技术有限公司 | A kind of dividing method and system of objects in images |
CN110263205A (en) * | 2019-06-06 | 2019-09-20 | 温州大学 | A kind of search method for ginseng image |
CN110296956A (en) * | 2019-07-12 | 2019-10-01 | 上海交通大学 | The method of the content of organic matter in a kind of fermentation of near infrared ray rice straw |
WO2019228087A1 (en) * | 2018-05-31 | 2019-12-05 | Ge Gaoli | Safety protection type electric heater |
CN110782440A (en) * | 2019-10-22 | 2020-02-11 | 华中农业大学 | Crop grain character measuring method |
CN111626304A (en) * | 2020-05-20 | 2020-09-04 | 中国科学院新疆理化技术研究所 | Color feature extraction method based on machine vision and application thereof |
CN111968148A (en) * | 2020-07-20 | 2020-11-20 | 华南理工大学 | No-load rate calculation method based on image processing |
CN112085725A (en) * | 2020-09-16 | 2020-12-15 | 塔里木大学 | Residual film residual quantity detection method and early warning system based on heuristic iterative algorithm |
CN112258534A (en) * | 2020-10-26 | 2021-01-22 | 大连理工大学 | Method for positioning and segmenting small brain earthworm parts in ultrasonic image |
CN112668565A (en) * | 2020-12-10 | 2021-04-16 | 中国科学院西安光学精密机械研究所 | Circular target interpretation method aiming at shielding deformation |
CN112950535A (en) * | 2021-01-22 | 2021-06-11 | 北京达佳互联信息技术有限公司 | Video processing method and device, electronic equipment and storage medium |
CN113139952A (en) * | 2021-05-08 | 2021-07-20 | 佳都科技集团股份有限公司 | Image processing method and device |
CN114266748A (en) * | 2021-12-22 | 2022-04-01 | 四川艾德瑞电气有限公司 | Method and device for judging integrity of surface of process plate in rail transit maintenance field |
CN114988567A (en) * | 2022-07-15 | 2022-09-02 | 南通仁源节能环保科技有限公司 | Sewage treatment method and system based on activated sludge foam |
CN116758081A (en) * | 2023-08-18 | 2023-09-15 | 安徽乾劲企业管理有限公司 | Unmanned aerial vehicle road and bridge inspection image processing method |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP2036424A2 (en) * | 2007-09-14 | 2009-03-18 | CNH Belgium N.V. | A method and apparatus for detecting errors in grain analysis apparatus using electronically processed images |
CN104867159A (en) * | 2015-06-05 | 2015-08-26 | 北京大恒图像视觉有限公司 | Stain detection and classification method and device for sensor of digital camera |
Non-Patent Citations (4)
Title |
---|
BILAL BATAINEH ET AL.: "An adaptive local binarization method for document images based on a novel thresholding method and dynamic windows", Pattern Recognition Letters *
WANG LILI ET AL.: "Image detection method for straw coverage rate based on the Sauvola and Otsu algorithms", Agricultural Engineering *
SU YANBO ET AL.: "Straw coverage rate detection system based on an automatic threshold segmentation algorithm", Journal of Agricultural Mechanization Research *
ZHAO LI: "Image processing recognition of straw coverage and its effect on seedling emergence rate", China Master's Theses Full-text Database, Agricultural Science and Technology Series *
Also Published As
Publication number | Publication date |
---|---|
CN107633503B (en) | 2020-05-15 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN107633503A (en) | Image processing method for automatically detecting residual straws in grains | |
Lu et al. | Histogram-based automatic thresholding for bruise detection of apples by structured-illumination reflectance imaging | |
CN108022233A (en) | A kind of edge of work extracting method based on modified Canny operators | |
CN107220624A (en) | A kind of method for detecting human face based on Adaboost algorithm | |
CN111915704A (en) | Apple hierarchical identification method based on deep learning | |
Fabijańska | Variance filter for edge detection and edge-based image segmentation | |
CN108319973A (en) | Citrusfruit detection method on a kind of tree | |
CN104794502A (en) | Image processing and mode recognition technology-based rice blast spore microscopic image recognition method | |
CN109636824A (en) | A kind of multiple target method of counting based on image recognition technology | |
CN114399522A (en) | High-low threshold-based Canny operator edge detection method | |
CN106815843A (en) | A kind of fruit object acquisition methods based on convex closure center priori and absorbing Marcov chain | |
Ji et al. | Apple grading method based on features of color and defect | |
Choi et al. | Detection of dropped citrus fruit on the ground and evaluation of decay stages in varying illumination conditions | |
CN109978848A (en) | Method based on hard exudate in multiple light courcess color constancy model inspection eye fundus image | |
CN101706959A (en) | Method for extracting surface defects of metal sheets and strips on basis of two-dimensional information entropy | |
CN111814825B (en) | Apple detection grading method and system based on genetic algorithm optimization support vector machine | |
CN115272838A (en) | Information fusion technology-based marine plankton automatic identification method and system | |
Shen et al. | Development of a new machine vision algorithm to estimate potato's shape and size based on support vector machine | |
CN111353992A (en) | Agricultural product defect detection method and system based on textural features | |
CN109682821B (en) | Citrus surface defect detection method based on multi-scale Gaussian function | |
CN110047064B (en) | Potato scab detection method | |
CN108269264A (en) | The denoising of beans seed image and fractal method | |
CN109961012A (en) | A kind of underwater target tracking recognition methods | |
CN101719273A (en) | On-line self-adaptation extraction method of metallurgy strip surface defect based on one-dimension information entropy | |
Hassoon | Classification and Diseases Identification of Mango Based on Artificial Intelligence: A Review |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||