AU2020103260A4 - Rice blast grading system and method - Google Patents

Rice blast grading system and method

Info

Publication number
AU2020103260A4
Authority
AU
Australia
Prior art keywords
image
grading
pixel area
rice blast
rice
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
AU2020103260A
Inventor
Chen CAI
Yan Cao
Peng HE
Liang Hu
Bo Lei
Yongbo Liu
Jiangyun Tang
Qingxiang Tang
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Agricultural Information And Rural Economic Research Institute Of Sichuan Academy Of Agricultural Sciences
Original Assignee
Agricultural Information And Rural Economic Res Institute Of Sichuan Academy Of Agricultural Science
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Agricultural Information And Rural Economic Res Institute Of Sichuan Academy Of Agricultural Science filed Critical Agricultural Information And Rural Economic Res Institute Of Sichuan Academy Of Agricultural Science
Priority to AU2020103260A priority Critical patent/AU2020103260A4/en
Application granted granted Critical
Publication of AU2020103260A4 publication Critical patent/AU2020103260A4/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/90Determination of colour characteristics
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/136Segmentation; Edge detection involving thresholding
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/40Extraction of image or video features
    • G06V10/56Extraction of image or video features relating to colour
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00Systems or methods specially adapted for specific business sectors, e.g. utilities or tourism
    • G06Q50/02Agriculture; Fishing; Mining
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10024Color image
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30004Biomedical image processing
    • G06T2207/30096Tumor; Lesion
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30108Industrial image inspection
    • G06T2207/30128Food products
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30181Earth observation
    • G06T2207/30188Vegetation; Agriculture
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/60Type of objects
    • G06V20/68Food, e.g. fruit or vegetables

Abstract

The present invention provides a rice blast grading system and method. The grading system and method are capable of fast, stable and accurate rice blast grading, are easily operated, and can acquire in real time an accurate rice blast grading result by photographing with a mobile terminal, thus lowering identification requirements and improving the efficiency of the identification operation. The grading system comprises: an image pre-processing unit; a lesion feature extracting unit, used for calculating a first pixel area Img_N of a whole leaf and a second pixel area Img_n having lesions removed, wherein the second pixel area Img_n having lesions removed is obtained by performing reverse threshold segmentation to obtain a second image having lesion regions removed and then performing OTSU binarization on the second image; an area proportion calculating unit, used for plugging the first pixel area Img_N and the second pixel area Img_n that are obtained by the lesion feature extracting unit into an area proportion formula to obtain an area proportion; and a grading unit. (FIG. 1: algorithm flow chart of the grading method.)

Description

FIG. 1 (sheet 1/3): Rice blast original image → extract target leaf region (ROI image) → image pre-processing (blur) → global OTSU binarization / HSV color space conversion → traverse and loop over to count effective pixel area Img_N / color analysis in HSV color space to determine thresholds → reverse threshold segmentation → OTSU binarization on the image of regions outside lesions → traverse and loop over to count effective pixel area Img_n → combine Img_N and Img_n through a formula to obtain the area proportion result → plug the result into the grading standard to output the grading result.
RICE BLAST GRADING SYSTEM AND METHOD
Technical Field
The present invention relates to the field of rice leaf blast identifying and grading, particularly to a rice blast grading system and method.
Background
Rice blast is one of the most common rice diseases; it can cause severe losses in rice yield and occurs in rice planting regions all over the world. Rice is the major cereal crop in China, so it is of great significance to study how to prevent and cure rice blast. Rice blast is classified into seedling blast, leaf blast, node blast, neck blast and grain blast according to the invasion site. Lesions of different sizes appear on leaves during the invasion of the disease. Researchers divide the harm degree of rice blast into 5 levels according to the proportion of lesion area to leaf area. Currently, there are mainly two ways of grading the course of rice blast.
One is the common manual determining method, in which experienced farmers or researchers grade by visual observation; it is highly subjective and low in accuracy, relies heavily on the experience of the observer, and cannot accurately measure or objectively determine lesion regions.
The other is a hyperspectral remote sensing based rice blast grading and detecting method, which realizes sensitive-band-based grading of the course of rice blast and a chlorophyll content prediction model. However, this method has three disadvantages: (1) poor portability: grading rice blast with this method relies on specialized instruments or equipment such as a spectrograph, a chlorophyll measuring instrument and the like, and bulky equipment has to be carried to achieve the grading purpose; (2) high cost: ordinary people need to buy the corresponding equipment and learn how to use it, resulting in a great increase in learning and application cost; and (3) low efficiency: a result is only obtained after a series of operations and analyses, which takes a long time.
Summary of the Invention
The object of the present invention is to provide a rice blast grading system and method. The grading system and method are capable of fast, stable and accurate rice blast grading, are easily operated, and can acquire in real time an accurate rice blast grading result by photographing with a mobile terminal, thus lowering identification requirements and improving the efficiency of the identification operation.
The embodiment of the present invention is implemented as follows:
A rice blast grading system, the grading system comprises:
an image pre-processing unit, used for extracting a target region in a rice leaf image and pre-processing the target region, the pre-processed image being a first image;
a lesion feature extracting unit, used for calculating a first pixel area Img_N of a whole leaf and a second pixel area Img_n having lesions removed, wherein the second pixel area Img_n having lesions removed is obtained by performing reverse threshold segmentation to obtain a second image having lesion regions removed and then performing OTSU binarization on the second image;
an area proportion calculating unit, used for plugging the first pixel area Img_N and the second pixel area Img_n that are obtained by the lesion feature extracting unit into an area proportion formula to obtain an area proportion;
and a grading unit, used for acquiring the area proportion and obtaining a grading result by comparing to the rice blast grading standard.
In the preferred embodiment of the present invention, the described reverse threshold segmentation is performed in an HSV color space.
In the preferred embodiment of the present invention, said image pre-processing unit uses the GrabCut algorithm to extract the rice leaf image.
In the preferred embodiment of the present invention, said image pre-processing unit uses a Gaussian filter with a 5×5 convolution kernel to perform image pre-processing.
In the preferred embodiment of the present invention, the target region in the image pre-processing unit is obtained by: a user manually framing a reference region on a mobile terminal, and then the image pre-processing unit using the GrabCut algorithm to perform rice leaf image extraction.
The present invention further provides a rice blast grading method, the grading method comprising the following steps:
S1, a mobile terminal is used to shoot a rice leaf image, and a user frames, according to the shot rice leaf image, a rice leaf image needing to be graded;
S2, the grading system acquires the rice leaf image for pre-processing, extracts a target region in the rice leaf image and pre-processes the rice leaf image to obtain a first image to be processed;
S3, the grading system performs lesion feature extraction on the first image and calculates to respectively obtain a first pixel area Img_N of the whole leaf and a second pixel area Img_n having lesions removed,
wherein the first pixel area Img_N is obtained through the following steps of:
S311, performing global OTSU binarization on the first image, and
S312, traversing and looping over to count effective pixels in the first image to obtain the first pixel area Img_N of the whole leaf,
and wherein the second pixel area Img_n is obtained through the following steps of:
S321, performing HSV color space conversion on the first image,
S322, performing color analysis according to the HSV color space to determine threshold ranges,
S323, performing reverse threshold segmentation on the first image within the threshold ranges to obtain an image of regions outside the lesions,
S324, performing OTSU binarization on the image of regions outside the lesions, and
S325, traversing and looping over to count effective pixels to obtain the second pixel area Img_n;
S4, plugging the calculated first pixel area Img_N and second pixel area Img_n into an area proportion formula to obtain an area proportion; and
S5, obtaining a grading result by comparing the area proportion to the rice blast grading standard.
In the preferred embodiment of the present invention, the reverse threshold segmentation in S323 comprises the following steps: defining, according to the threshold ranges determined in S322, the minimum value of the segmentation ranges as Lower_green and the maximum value as Upper_green, making the image portion obtained through the reverse threshold segmentation into a mask, and combining the first image with the mask portion to obtain a leaf having the lesion regions removed.
In the preferred embodiment of the present invention, the OTSU binarization comprises the following steps: separating the first image into a foreground image and a background image, denoting the proportion of foreground points as ω0 and the proportion of background points as ω1 respectively, denoting the foreground gray average as μ0 and the background gray average as μ1, and thus obtaining
μ = ω0·μ0 + ω1·μ1,
the variance between the foreground and the background being:
g = ω0·(μ0 − μ)² + ω1·(μ1 − μ)²,
and plugging μ into g (using ω0 + ω1 = 1) to obtain:
g = ω0·ω1·(μ0 − μ1)².
In the preferred embodiment of the present invention, the area proportion formula is: result = (Img_N − Img_n) / Img_N × 100%.
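As a quick worked example of this formula, using the whole-leaf and lesion-free pixel areas that are reported for sample No. 1 in Table 3 further below:

# Worked example of the area proportion formula with sample No. 1 from Table 3.
Img_N = 41791                      # first pixel area: whole leaf
Img_n = 39252                      # second pixel area: leaf with lesions removed
result = (Img_N - Img_n) / Img_N * 100
print(f"{result:.2f}%")            # 6.08% -> level 3 of the grading standard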
In the preferred embodiment of the present invention, the GrabCut algorithm is used to extract the rice blast leaf in S2 and comprises the following steps of: considering each pixel in the rice leaf image to be one node of a graph, adding two nodes, namely F and B, wherein F represents the foreground and B represents the background, using one edge to connect every two adjacent pixels, using one edge to connect each pixel to node F and using one edge to connect each pixel to node B; segmenting the image into two portions, and connecting pixels of the first portion to F as the foreground and pixels of the second portion to B as the background; and expressing the Gibbs energy function of the whole rice leaf image as the formula:
E(α, k, θ, z) = U(α, k, θ, z) + V(α, z).
In said formula, the U function represents the regional data term of the energy function and the V function represents the smoothness (boundary) term of the energy function; parameter α represents the foreground or background label after separation and equals 0 or 1 (background or foreground); parameter k is a vector, k = {k1, k2, ..., kn}, where each kn belongs to the set {1, 2, ..., K} and corresponds to one of K Gaussian components; θ represents the parameters of each of the K Gaussian components; and parameter z is a vector, z = {z1, ..., zn}, where each zn represents the gray value of a pixel. The process of segmenting the image is a process of decreasing this energy; when it can no longer be decreased, i.e., it approaches a constant value, the image segmentation is completed.
The beneficial effects of the embodiment of the present invention are as follows: the present invention provides a rice blast grading system and method; the system is a rice blast grading and determining algorithmic model based on processing steps such as GrabCut, Gaussian filtering, OTSU binarization, color space conversion, threshold segmentation and the like; the algorithmic model is implemented with OpenCV and the Python language, separates the leaf and the lesions by using reverse threshold segmentation as its core policy, and counts pixels by traversing and looping over the image to obtain the lesion area proportion, thus achieving rapid and accurate grading of rice blast. Test results show that the results obtained by the algorithmic model of the present invention have a 95.77% match with the results obtained by professional researchers through manual determining, and that the algorithmic model has higher stability and objectivity than manual determining. Besides, the grading system uses a mobile terminal, for example a mobile phone APP, as the image acquisition port without requiring other instruments or equipment, and can acquire in real time an accurate rice blast grading result by photographing with the mobile phone, thus lowering research requirements and improving the efficiency of scientific research.
Brief Description of Figures
In order to more clearly describe the technical solution of the embodiment of the present invention, the drawings required in the embodiment will be briefly introduced below. It should be understood that the drawings described hereinafter only illustrate a portion of embodiments of the present invention and thus shall not be construed as limiting the protection scope of the present invention. For a person skilled in the art, other relevant drawings could also be derived from these drawings without creative efforts.
FIG. 1 shows an algorithm flow chart of the rice blast grading method of the embodiment of the present invention;
FIG. 2 shows an operating interface of the mobile terminal of the embodiment of the present invention;
FIG. 3 is an effect view of two groups of rice blast leaves of the embodiment of the present invention separated by the GrabCut algorithm;
FIG. 4 is a color histogram drawn from rice blast leaves of levels 1, 2, 3 and 5 of the embodiment of the present invention; and
FIG. 5 shows a test result of the reverse threshold segmentation of the embodiment of the present invention.
Detailed Description
In order to make the objective, technical solutions and advantages of the present invention clearer, the technical solutions of the embodiment of the present invention will be described clearly and thoroughly with reference to the accompanying drawings. Apparently, the embodiment described is merely a portion rather than all of the embodiments of the present invention. Usually, the components of the embodiment of the present invention described and shown in the drawings herein can be arranged and designed in various different configurations.
Therefore, the detailed description below of the embodiment of the present invention provided in the drawings is not intended to limit the protection scope claimed of the present invention, but merely represents a selected embodiment of the present invention. Based on the embodiment of the present invention, all other embodiments derived by a person skilled in the art without any creative efforts shall fall within the protection scope of the present invention.
It should be noted that: the like reference numerals and letters will be used to designate the like elements. Therefore, once a particular element is defined in one drawing, no further definition and explanation in the succeeding drawings are required.
The terms "first", "second", "third" and the like are used only for distinguishing description not for indicating or suggesting the relative importance. A person skilled in the art could understand the specific meanings of the described terms in the present invention according to particular circumstances.
Embodiment 1
This embodiment provides a rice blast grading system. The system is a rice blast grading and determining algorithmic model based on processing steps such as GrabCut, Gaussian filtering, OTSU binarization, color space conversion, threshold segmentation and the like. The algorithmic model is implemented with OpenCV and the Python language, separates the leaf and the lesions by using reverse threshold segmentation as its core policy, and counts pixels by traversing and looping over the image to obtain the lesion area proportion, thus achieving rapid and accurate grading of rice blast. The grading system comprises: an image pre-processing unit, used for extracting a target region in a rice leaf image and pre-processing the target region, the pre-processed image being a first image; a lesion feature extracting unit, used for calculating a first pixel area Img_N of the whole leaf and a second pixel area Img_n having lesions removed, wherein the second pixel area Img_n having lesions removed is obtained by performing reverse threshold segmentation to obtain a second image having lesion regions removed and then performing OTSU binarization on the second image; an area proportion calculating unit, used for plugging the first pixel area Img_N and the second pixel area Img_n that are obtained by the lesion feature extracting unit into an area proportion formula to obtain an area proportion; and a grading unit, used for acquiring the area proportion and obtaining a grading result by comparing it to the rice blast grading standard.
The object of this embodiment is to acquire the proportion of lesion area to total leaf area in a rice blast image through machine vision technology, and to compare the result to the rice blast grading standard to finally obtain the grading result. First of all, a target region of interest in a test image is extracted to separate the target rice leaf from the background; then, the target region image is pre-processed to reduce noise interference, the de-noised image is converted to an HSV channel, the color distribution of the target image under the HSV channel is analyzed to determine threshold ranges, and reverse threshold segmentation is performed on the image to obtain the lesion regions, so that an area proportion can be calculated to obtain the grading result.
In this embodiment, said image pre-processing unit uses the GrabCut algorithm to extract the rice leaf image. Said image pre-processing unit uses a Gaussian filter with a 5×5 convolution kernel to perform image pre-processing. The target region in the image pre-processing unit is obtained by: a user manually framing a reference region on a mobile terminal, and then the image pre-processing unit using the GrabCut algorithm to perform the rice leaf image extraction.
The present invention further provides a rice blast grading method. FIG. 1 shows an algorithm flow chart of the rice blast grading method. The grading method comprises the following steps:
S1, a mobile terminal is used to shoot a rice leaf image, and a user frames, according to the shot rice leaf image, a rice leaf image needing to be graded, thus primarily selecting the image needing to be identified and accelerating the identification speed of the grading system;
S2, the grading system acquires the rice leaf image for pre-processing, extracts a target region in the rice leaf image and pre-processes the rice leaf image to obtain a first image to be processed;
S3, the grading system performs lesion feature extraction on the first image and calculates to respectively obtain a first pixel area Img_N of the whole leaf and a second pixel area Img_n having lesions removed,
wherein the first pixel area Img_N is obtained through the following steps of:
S311, performing global OTSU binarization on the first image, and
S312, traversing and looping over to count effective pixels in the first image to obtain the first pixel area Img_N of the whole leaf,
and wherein the second pixel area Img_n is obtained through the following steps of:
S321, performing HSV color space conversion on the first image,
S322, performing color analysis according to the HSV color space to determine threshold ranges,
S323, performing reverse threshold segmentation on the first image within the threshold ranges to obtain an image of regions outside the lesions,
S324, performing OTSU binarization on the image of regions outside the lesions, and
S325, traversing and looping over to count effective pixels to obtain the second pixel area Img_n;
S4, plugging the calculated first pixel area Img_N and second pixel area Img_n into an area proportion formula to obtain an area proportion; and
S5, obtaining a grading result by comparing the area proportion to the rice blast grading standard.
In the prior art, it can be seen from a rice leaf image acquired by a high-definition camera that even a healthy rice leaf is not completely smooth and green, and has some subtle textures on its surface. Those textures would later interfere with image processing. The aim of image pre-processing is to enhance the information we need and exclude interference. Conventional image pre-processing methods comprise Gaussian filtering, bilateral filtering, median filtering and the like. The edges of a rice leaf usually curve naturally, and there is no need to retain straight edges. Therefore, a Gaussian filter with a 5×5 convolution kernel, which has better universality, is selected in this embodiment.
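By way of illustration only, a minimal Python/OpenCV sketch of this smoothing step is given below; the synthetic input image is a placeholder assumption and not part of the original disclosure.

import cv2
import numpy as np

# Toy stand-in for the extracted leaf region (a green block on a black background);
# in practice this would be the ROI produced by the GrabCut step.
roi = np.zeros((120, 160, 3), dtype=np.uint8)
roi[30:90, 20:140] = (40, 180, 60)            # BGR green "leaf"

# Gaussian filter with a 5x5 convolution kernel, as selected in this embodiment
# (passing sigma=0 lets OpenCV derive it from the kernel size).
first_image = cv2.GaussianBlur(roi, (5, 5), 0)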
In the prior art, conventional image target region segmentation methods comprise color threshold segmentation, the watershed algorithm, the GrabCut algorithm, contour detection and the like. Because of its small size, and in order to ensure the quality of the acquired image, a rice leaf is shot by a mobile phone in macro mode. In macro mode, the rice leaf as the target will have a high definition and the distant view outside the leaf will be relatively blurry. In the present invention, the GrabCut algorithm is used to perform the rice blast leaf extraction. The GrabCut algorithm performs well in separating an image with a clear foreground and a blurry background, and comprises the following steps of: considering each pixel in the rice leaf image to be one node of a graph, adding two nodes, namely F and B, wherein F represents the foreground and B represents the background, using one edge to connect every two adjacent pixels, using one edge to connect each pixel to node F and using one edge to connect each pixel to node B; segmenting the image into two portions, and connecting pixels of the first portion to F as the foreground and pixels of the second portion to B as the background; and expressing the Gibbs energy function of the whole rice leaf image as the formula:
E(α, k, θ, z) = U(α, k, θ, z) + V(α, z).
In said formula, the U function represents the regional data term of the energy function and the V function represents the smoothness (boundary) term of the energy function; parameter α represents the foreground or background label after separation and equals 0 or 1 (background or foreground); parameter k is a vector, k = {k1, k2, ..., kn}, where each kn belongs to the set {1, 2, ..., K} and corresponds to one of K Gaussian components; θ represents the parameters of each of the K Gaussian components; and parameter z is a vector, z = {z1, ..., zn}, where each zn represents the gray value of a pixel. The process of segmenting the image is a process of decreasing this energy; when it can no longer be decreased, i.e., it approaches a constant value, the image segmentation is completed.
In this embodiment, the extraction of rice blast lesions is implemented in an HSV color space through threshold segmentation. Threshold segmentation usually captures, in the HSV color space, colors within a certain value range. However, it can be seen through observation that rice blast lesions are not only yellow. Lesions of a big size are black at the edges and dry yellow in the interior, while those of a small size are faint yellow and have no black edges. Segmenting directly on colors within the yellow range would result in a great error; therefore, this embodiment provides a reverse threshold segmentation method, which uses the healthy green portions of a leaf as the threshold searching range, denotes the leaf image having lesion regions removed as Img_n and the overall image of the whole leaf as Img_N, and subtracts Img_n from Img_N to obtain the lesion regions. The specific implementing steps are as follows:
1) Color Space Conversion. Color threshold segmentation is performed in the HSV color space, while the image shot with the mobile phone is usually in the RGB color space; the cv2.cvtColor function in OpenCV may be used to convert the image from the RGB to the HSV channel.
2) Reverse Threshold Segmentation. Threshold segmentation requires specifically designated threshold ranges. The minimum value of the segmentation range is defined as Lower_green and the maximum value as Upper_green, the image portion obtained through the threshold segmentation is made into a mask, and the original image is combined with the mask to obtain a leaf having the lesion regions removed.
3) OTSU Binarization. The OTSU algorithm is also called the maximum between-class variance method, and its principle is to use a threshold to separate an original image into a foreground image and a background image. The proportion of foreground points is denoted as ω0 and the proportion of background points as ω1 respectively. The foreground gray average is denoted as μ0 and the background gray average as μ1. Thus,
μ = ω0·μ0 + ω1·μ1
is obtained. The variance between the foreground and the background is:
g = ω0·(μ0 − μ)² + ω1·(μ1 − μ)².
Plugging μ into g (using ω0 + ω1 = 1) gives:
g = ω0·ω1·(μ0 − μ1)².
In this embodiment, the OTSU algorithm is mainly used to binarize the image obtained through the reverse threshold segmentation so that all effective pixels become white. Hence, the threshold in this step is set to zero, i.e., all effective pixels in the image are converted into white.
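By way of illustration only, steps 1) to 3) may be sketched in Python/OpenCV as follows; the synthetic input image and the variable names lower_green/upper_green are illustrative assumptions, and the H/S/V ranges anticipate the threshold analysis reported later in this embodiment.

import cv2
import numpy as np

# Toy pre-processed leaf image: a green leaf with a yellowish "lesion" patch.
first_image = np.zeros((120, 160, 3), dtype=np.uint8)
first_image[30:90, 20:140] = (40, 180, 60)      # healthy green leaf (BGR)
first_image[50:70, 60:90] = (30, 170, 200)      # dry-yellow lesion patch (BGR)

# 1) Color space conversion: OpenCV images are BGR, so convert BGR -> HSV.
hsv = cv2.cvtColor(first_image, cv2.COLOR_BGR2HSV)

# 2) Reverse threshold segmentation: keep only the healthy green range
#    (Lower_green / Upper_green), so the lesions are what gets removed.
lower_green = np.array([30, 50, 0])
upper_green = np.array([60, 200, 255])
mask = cv2.inRange(hsv, lower_green, upper_green)
leaf_without_lesions = cv2.bitwise_and(first_image, first_image, mask=mask)

# 3) Binarization of the lesion-free image: with the threshold floor at 0, every
#    effective (non-black) pixel becomes white; adding the cv2.THRESH_OTSU flag
#    would instead let OTSU pick the threshold automatically.
gray = cv2.cvtColor(leaf_without_lesions, cv2.COLOR_BGR2GRAY)
_, binary_img_n = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY)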
In this embodiment, the actual area in an image usually cannot be directly calculated by machine vision unless a reference frame is provided, but area parameters of the image can be obtained in abstract form, i.e., as the number of pixels. The binarized image is entirely white. The abscissa of a pixel is denoted as i and the ordinate as j, so [i, j] represents a certain effective pixel. A nested loop is used to traverse each pixel, and the count increases by 1 whenever a pixel is determined to be white. The pixel area Img_N of the whole leaf and the pixel area Img_n having lesions removed are respectively calculated using this method. The lesion area may be expressed as (Img_N − Img_n), and the area proportion of lesions in the leaf may be expressed as: result = (Img_N − Img_n) / Img_N × 100%.
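A minimal sketch of this counting and proportion calculation is given below, with small synthetic binary arrays standing in for the real binarized leaf images (an assumption made purely for illustration):

import numpy as np

def count_white_pixels(binary_img):
    """Traverse the image with a nested loop and count effective (white) pixels."""
    count = 0
    rows, cols = binary_img.shape
    for i in range(rows):
        for j in range(cols):
            if binary_img[i, j] == 255:
                count += 1
    return count

# Toy stand-ins: a 10x10 all-white "whole leaf" and the same leaf with 6 lesion pixels blacked out.
img_N_bin = np.full((10, 10), 255, dtype=np.uint8)
img_n_bin = img_N_bin.copy()
img_n_bin[2:4, 3:6] = 0

Img_N = count_white_pixels(img_N_bin)      # 100
Img_n = count_white_pixels(img_n_bin)      # 94
result = (Img_N - Img_n) / Img_N * 100     # lesion area proportion
print(Img_N, Img_n, f"{result:.2f}%")      # 100 94 6.00%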
In this embodiment, the GrabCut algorithm is used for background separation. Tests show that the effect of GrabCut segmentation directly on the global original image is not satisfactory because of other interfering information in addition to the rice leaf in the foreground. In order to improve the accuracy of background separation, the user manually frames a rectangular region Rect(x, y, w, h) as a reference region when inputting the image at the image input end, wherein x, y are the coordinates of the starting point of the rectangle and w, h are the width and height of the rectangle relative to the starting point. The mobile phone operating interface is as shown in FIG. 2. The effects of separating two groups of rice blast leaves through the GrabCut algorithm are as shown in FIG. 3. It can be seen from FIG. 3 that the information of the leaf edges is better retained in the image constrained by the Rect region.
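By way of illustration only, the GrabCut-based background separation with a user-framed Rect(x, y, w, h) may be sketched as follows; the synthetic image and the rectangle values are placeholder assumptions.

import cv2
import numpy as np

# Synthetic test image: a bright "leaf" block on a dark background stands in for a
# macro photo that is sharp in the foreground and blurry in the background.
img = np.zeros((200, 300, 3), dtype=np.uint8)
img[60:140, 50:250] = (40, 180, 60)               # BGR green block as the "leaf"

# Rect(x, y, w, h) framed by the user on the mobile terminal (hypothetical values).
rect = (40, 50, 220, 100)

mask = np.zeros(img.shape[:2], dtype=np.uint8)
bgd_model = np.zeros((1, 65), dtype=np.float64)   # internal GrabCut models
fgd_model = np.zeros((1, 65), dtype=np.float64)
cv2.grabCut(img, mask, rect, bgd_model, fgd_model, 5, cv2.GC_INIT_WITH_RECT)

# Pixels marked as sure or probable foreground form the extracted leaf region.
fg_mask = np.where((mask == cv2.GC_FGD) | (mask == cv2.GC_PR_FGD), 1, 0).astype(np.uint8)
leaf_only = img * fg_mask[:, :, None]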
The threshold ranges in this embodiment are analyzed as follows: the value ranges in the HSV color space are [0,180], [0,255] and [0,255] for H, S and V respectively. Using the reverse threshold segmentation method to extract lesions requires the threshold ranges of green in the HSV color space. For the value ranges of green in a standard HSV space, H is [35,77] and S is [43,255], while Value V has less influence on green and thus its searching range may be properly expanded to [19-21]. Tests show that the segmentation result obtained using the standard threshold ranges for green has a greater error and the green regions acquired are incomplete, so that the lesion area obtained is greater than the actual area. In order to obtain accurate threshold ranges and improve the reliability of the model, 4 groups of rice blast leaf images manually graded to levels 1, 2, 3 and 5 are selected to draw a color histogram, wherein the ordinate axis H represents Hue and the abscissa axis S represents Saturation. The results are shown in FIG. 4.
In FIG. 4a, the rice blast leaf lesions of level 1 are tiny, the color is concentrated in a green region with an H value of about 40 in the histogram, and there is no apparent color boundary. In FIG. 4d, because the disease course reaches the highest level 5 and lesions cover almost the whole leaf surface, the green region connects with the yellow lesion regions and it is hard to distinguish an apparent color boundary, so the reference value is low. It can be seen through observation of FIGS. 4b and 4c that the green portion is concentrated in a range with Hue value H of [30,60] and Saturation value S of [50,200]. A black shadow region exists when the H value is above 30, and a concentrated color distribution exists when the H value is about 20. Checking against the HSV color space distribution values, regions with an H value of about 20 are orange and yellow. Therefore, it can be deduced that said color distribution regions are lesion distribution regions.
To verify this deduction, the value ranges of the reverse threshold segmentation in the test are set as Hue H=[30,60], Saturation S=[50,200] and Value V=[0,255]. The model with these parameters is used to test different rice blast leaves respectively, and the test results are as shown in FIG. 5. It can be seen from FIG. 5 that the reverse threshold segmentation method has a good effect within these value ranges. The yellow lesions, black lesions and lesions of a big size are all successfully separated from the leaf. Moreover, it can be observed that when lesions are densely distributed, the algorithm has a minor error in lesion edge control compared to the original image.
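By way of illustration only, an H-S histogram of the kind shown in FIG. 4 can be computed as sketched below; the synthetic two-tone image is a placeholder assumption, and with real leaf photographs the resulting 2-D histogram would simply be rendered as in FIG. 4.

import cv2
import numpy as np

# Synthetic leaf-like image; in practice this would be a graded rice blast leaf photo.
img = np.zeros((100, 100, 3), dtype=np.uint8)
img[:, :60] = (40, 180, 60)      # healthy green area (BGR)
img[:, 60:] = (30, 170, 200)     # yellowish "lesion" area (BGR)

hsv = cv2.cvtColor(img, cv2.COLOR_BGR2HSV)

# 2-D histogram over Hue ([0,180]) and Saturation ([0,255]), matching the axes of FIG. 4.
hist = cv2.calcHist([hsv], [0, 1], None, [180, 256], [0, 180, 0, 256])

# Hue rows 30-60 and saturation columns 50-200 correspond to the healthy green
# range identified in this embodiment; their share of all pixels can be inspected:
green_share = hist[30:61, 50:201].sum() / hist.sum()
print(f"share of pixels in the green H/S range: {green_share:.2%}")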
The comparison of grading results in this embodiment is as follows: after lesions are separated through the reverse threshold segmentation, binarization is performed to respectively calculate the pixel area Img_N of the whole leaf and the pixel area Img_n having lesions removed; the pixel area Img_N and the pixel area Img_n are plugged into the area proportion formula to obtain an area proportion; and the area proportion is compared to the rice blast grading standard to obtain a grading result. According to National Standard of the People's Republic of China GB/T1570-1995: Rules of Investigation and Forecast of the Rice Blast [Pyricularia oryzae (Cavara)], rice leaf blast is divided into 5 levels. As shown in Table 1, the grading standards are as follows:
Table 1: Rice Blast Grading Standard
Level | Feature | Area Proportion of Lesions in Leaf
0 | disease free | 0%
1 | few and small lesions | below 1%
2 | small and many lesions, or big and few lesions | 1%-5%
3 | big and more lesions | 5%-10%
4 | big and many lesions | 10%-50%
5 | great lesion area proportion and whole leaf death | above 50%
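By way of illustration only, the mapping from the computed area proportion to the levels of Table 1 can be sketched as follows; how the exact boundary values 1%, 5%, 10% and 50% are assigned is an assumption, since the table does not state which side of each range is inclusive.

def grade_rice_blast(area_proportion_percent: float) -> int:
    """Map the lesion area proportion (in percent) to the level of Table 1."""
    p = area_proportion_percent
    if p == 0:
        return 0      # disease free
    if p < 1:
        return 1      # few and small lesions
    if p <= 5:
        return 2      # small and many lesions, or big and few lesions
    if p <= 10:
        return 3      # big and more lesions
    if p <= 50:
        return 4      # big and many lesions
    return 5          # great lesion area proportion, whole leaf death

print(grade_rice_blast(6.08))   # sample No. 1 from Table 3 -> level 3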
189 collected rice blast images of levels 1-5 are numbered to test the algorithm, and the comparison between the obtained grading results and the manual determining results is shown in Table 2:
Table 2: Sample Grading Test Results
Level | Manual Grading Quantity | Algorithmic Grading Quantity | Difference | Error Rate
1 | 52 | 52 | 0 | 0%
2 | 47 | 45 | +2 | 2.33%
3 | 39 | 41 | -2 | 2.33%
4 | 22 | 16 | +6 | 11.76%
5 | 29 | 35 | -6 | 11.76%
Sum | 189 | 189 | 8 | 4.23%
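The overall figures of Table 2 can be cross-checked with a few lines; interpreting the reported error rates as being computed over the paired levels 2+3 and 4+5 is an assumption made for this check.

manual = {1: 52, 2: 47, 3: 39, 4: 22, 5: 29}
algorithmic = {1: 52, 2: 45, 3: 41, 4: 16, 5: 35}

total = sum(manual.values())                               # 189 samples
mismatches = 8                                             # differing samples listed in Table 3
print(f"overall error rate: {mismatches / total:.2%}")     # ~4.23%
print(f"match with manual grading: {1 - mismatches / total:.2%}")  # ~95.77%

# Error rates of the level 2/3 pair and the level 4/5 pair as reported in Table 2:
print(f"{2 / (manual[2] + manual[3]):.2%}")                # ~2.33%
print(f"{6 / (manual[4] + manual[5]):.2%}")                # ~11.76%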
It can be seen from the algorithmic test results in Table 2 that for samples graded to level 1, the results of algorithmic grading and manual grading are consistent, so algorithmic grading has a high accuracy. Among samples graded to levels 2 and 3, there are 2 samples whose manual and algorithmic grading results differ. Among samples graded to levels 4 and 5, there are 6 samples whose manual and algorithmic grading results differ. The test data of those image samples having different manual and algorithmic grading results are separately listed in Table 3. According to Table 3, the lesion area proportion of samples No. 1 and No. 2 is nearly 5%, close to the critical value between level 2 and level 3; manual determining is subjective and differs from the machine algorithm by about 1%. The lesion area proportion of samples No. 4-8 is more than 50% according to the model, but those samples are manually determined as level 4. The applicant found, by comparing to the original images, that the main body of the leaves in samples No. 4-8, excluding the lesion regions, is dry yellow but carries no lesions; the model identifies such dry yellow regions as lesion regions, thus resulting in different grading results. According to National Standard of the People's Republic of China GB/T1570-1995, rice blast of level 5 shows a great lesion area proportion and whole leaf death. Therefore, determining samples No. 4-8 as level 5 complies with the standard.
Table 3: Samples Having Different Algorithmic and Manual Grading Results
No. | Pixel Area of Whole Leaf | Pixel Area Having Lesions Removed | Lesion Area | Area Proportion | Algorithmic Grading | Manual Grading
1 | 41791 | 39252 | 2539 | 6.08% | 3 | 2
2 | 28054 | 26506 | 1548 | 5.52% | 3 | 2
3 | 34532 | 10358 | 24174 | 70.01% | 5 | 4
4 | 30525 | 9684 | 20841 | 68.28% | 5 | 4
5 | 37719 | 18639 | 19080 | 50.58% | 5 | 4
6 | 34811 | 16576 | 18235 | 52.38% | 5 | 4
7 | 40572 | 18268 | 22304 | 54.97% | 5 | 4
8 | 39818 | 9803 | 30015 | 75.38% | 5 | 4
To sum up, this embodiment provides a reverse threshold segmentation method in an HSV color space on the basis of computer vision technology, which can effectively achieve lesion separation and disease-course grading of rice blast. It is verified through tests that the results obtained by the algorithmic model have a 95.77% match with the results obtained by professional researchers through manual determining. Compared to other grading and determining methods, this technology has the following three advantages:
(1) High Grading Efficiency. The algorithmic model is implemented using the OpenCV algorithm library and the Python language, and a distributed system and GPU acceleration are relied on to carry out the operation in real time in the background. Rapid grading and determining are achieved, and the whole determining process is completed within 2 seconds if a high-speed Internet connection is provided.
(2) Convenient and Easy to Use. Using this method to identify a rice blast course does not require other specialized equipment or instruments. After the algorithm is embedded into a mobile phone APP, both a researcher and a common user can rapidly obtain a grading and determining result by using the mobile phone APP to take a photograph of an infected leaf. People using this method do not need professional knowledge or long-term working experience, which greatly reduces economic and learning costs.
(3) High Grading Accuracy. Manual determining and grading is usually highly subjective, and the same leaf may be graded differently by different researchers. Different from the roughness of manual determining, this method improves the accuracy of area calculation to the pixel scale by traversing and looping over the image, better quantifies and classifies parameters and indicators, and has higher accuracy and objectivity than manual determining.
Described above in this description is merely an example of embodiments of the present invention, and it is not intended to limit all the possible forms of the present invention. It should be understood that the embodiment in the description may be implemented in a variety of forms. The drawings are not necessarily drawn to scale, and some features may be enlarged or reduced to show the details of a particular element. The disclosed specific structural and functional details shall not be construed as limiting, but merely as representative bases for teaching a person skilled in the art to implement the present invention in a variety of forms. It should be understood by a person skilled in the art that multiple features specified and described in any one of the drawings could be combined with a feature specified in one or more other drawings to form an embodiment that is not explicitly specified or described. The specified feature combination provides a representative embodiment for a typical application. However, a plurality of combinations and modifications of features providing the same teaching as the present invention may be used for particular applications or implementations according to needs.
Described above is merely the preferred embodiment of the present invention, but the present invention is not limited thereto. For a person skilled in the art, the present invention may have various modifications and variations. Any changes, equivalent substitutions and improvements made within the spirit and principle of the present invention shall all fall within the protection scope of the present invention.

Claims (10)

  1. A rice blast grading system, characterized in that the grading system
    comprises:
    an image pre-processing unit, used for extracting a target region in a rice leaf
    image and pre-processing the target region, the pre-processed image being a first
    image;
    a lesion feature extracting unit, used for calculating a first pixel area Img_N of a
    whole leaf and a second pixel area Img_n having lesions removed, wherein the
    second pixel area Img_n having lesions removed is obtained by performing
    reverse threshold segmentation to obtain a second image having lesion regions
    removed and then performing OTSU binarization on the second image;
    an area proportion calculating unit, used for plugging the first pixel area Img_N
    and the second pixel area Img_n that are obtained by the lesion feature extracting
    unit into an area proportion formula to obtain an area proportion; and
    a grading unit, used for acquiring the area proportion and obtaining a grading
    result by comparing to the rice blast grading standard.
  2. The rice blast grading system according to claim 1, characterized in that the
    reverse threshold segmentation is performed in an HSV color space.
  3. The rice blast grading system according to claim 1, characterized in that the
    image pre-processing unit uses the GrabCut algorithm to extract the rice leaf image.
  4. The rice blast grading system according to claim 1, characterized in that the
    image pre-processing unit uses a Gaussian filter with a 5×5 convolution kernel to
    perform image pre-processing.
  5. The rice blast grading system according to claim 1, characterized in that the
    target region in the image pre-processing unit is obtained by: a user manually framing a reference region on a mobile terminal, and then the image
    pre-processing unit using the GrabCut algorithm to perform rice leaf image
    extraction.
  6. A rice blast grading method, characterized in that the grading method comprises the following steps that:
    S1, a mobile terminal is used to shoot a rice leaf image, and a user frames, according to the shot rice leaf image, a rice leaf image needing to be graded;
    S2, the grading system acquires the rice leaf image for pre-processing, extracts a target region in the rice leaf image and pre-processes the rice leaf image to obtain a first image to be processed;
    S3, the grading system performs lesion feature extraction on the first image and calculates to respectively obtain a first pixel area Img_N of the whole leaf and a second pixel area Img_n having lesions removed,
    wherein the first pixel area Img_N is obtained through the following steps of:
    S311, performing global OTSU binarization on the first image, and
    S312, traversing and looping over to count effective pixels in the first image to obtain the first pixel area Img_N of the whole leaf,
    and wherein the second pixel area Img_n is obtained through the following steps of:
    S321, performing HSV color space conversion on the first image,
    S322, performing color analysis according to the HSV color space to determine threshold ranges,
    S323, performing reverse threshold segmentation on the first image within the threshold ranges to obtain an image of regions outside the lesions,
    S324, performing OTSU binarization on the image of regions outside the lesions, and
    S325, traversing and looping over to count effective pixels to obtain the second pixel area Img_n;
    S4, plugging the calculated first pixel area Img_N and second pixel area Img_n into an area proportion formula to obtain an area proportion; and
    S5, obtaining a grading result by comparing the area proportion to the rice blast grading standard.
  7. The rice blast grading method according to claim 6, characterized in that the reverse threshold segmentation in S323 comprises the following steps: defining, according to the threshold ranges determined in S322, the minimum value of the segmentation ranges as Lower_green and the maximum value as Upper_green, making the image portion obtained through the reverse threshold segmentation into a mask, and combining the first image with the mask portion to obtain a leaf having lesion regions removed.
  8. The rice blast grading method according to claim 6, characterized in that the OTSU binarization comprises the following steps: separating the first image into a foreground image and a background image, denoting the proportion of foreground points as ω0 and the proportion of background points as ω1 respectively, denoting the foreground gray average as μ0 and the background gray average as μ1, and thus obtaining
    μ = ω0·μ0 + ω1·μ1,
    the variance between the foreground and the background being:
    g = ω0·(μ0 − μ)² + ω1·(μ1 − μ)²,
    and plugging μ into g to obtain:
    g = ω0·ω1·(μ0 − μ1)².
  9. The rice blast grading method according to claim 6, characterized in that the area proportion formula is:
    result = (Img_N − Img_n) / Img_N × 100%.
  10. The rice blast grading method according to claim 6, characterized in that the GrabCut algorithm is used to extract the rice blast leaf in S2 and comprises the following steps of: considering each pixel in the rice leaf image to be one node of a graph, adding two nodes, namely F and B, wherein F represents the foreground and B represents the background, using one edge to connect every two adjacent pixels, using one edge to connect each pixel to node F and using one edge to connect each pixel to node B; segmenting the image into two portions, and connecting pixels of the first portion to F as the foreground and pixels of the second portion to B as the background; expressing the Gibbs energy function of the whole rice leaf image as the formula:
    E(α, k, θ, z) = U(α, k, θ, z) + V(α, z),
    wherein the U function represents the regional data term of the energy function and the V function represents the boundary term of the energy function; and when E can no longer be decreased and approaches a constant value, completing the image segmentation.
AU2020103260A 2020-11-05 2020-11-05 Rice blast grading system and method Active AU2020103260A4 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
AU2020103260A AU2020103260A4 (en) 2020-11-05 2020-11-05 Rice blast grading system and method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
AU2020103260A AU2020103260A4 (en) 2020-11-05 2020-11-05 Rice blast grading system and method

Publications (1)

Publication Number Publication Date
AU2020103260A4 true AU2020103260A4 (en) 2021-01-14

Family

ID=74103494

Family Applications (1)

Application Number Title Priority Date Filing Date
AU2020103260A Active AU2020103260A4 (en) 2020-11-05 2020-11-05 Rice blast grading system and method

Country Status (1)

Country Link
AU (1) AU2020103260A4 (en)


Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112785571A (en) * 2021-01-20 2021-05-11 浙江理工大学 Famous tea tender leaf recognition and segmentation method based on improved watershed
CN112785571B (en) * 2021-01-20 2024-04-12 浙江理工大学 Famous tea tender leaf identification and segmentation method based on improved watershed
CN115953384A (en) * 2023-01-10 2023-04-11 杭州首域万物互联科技有限公司 On-line detection and prediction method for tobacco morphological parameters
CN115953384B (en) * 2023-01-10 2024-02-02 杭州首域万物互联科技有限公司 Online detection and prediction method for morphological parameters of tobacco leaves
CN116740378A (en) * 2023-07-03 2023-09-12 南通黄海药械有限公司 Garden plant diseases and insect pests evaluation system based on image processing
CN116740378B (en) * 2023-07-03 2024-04-02 南通黄海药械有限公司 Garden plant diseases and insect pests evaluation system based on image processing
CN117789201A (en) * 2024-02-27 2024-03-29 南京农业大学 Rice root system nondestructive acquisition method, device, storage medium and system

Similar Documents

Publication Publication Date Title
AU2020103260A4 (en) Rice blast grading system and method
Aquino et al. vitisBerry: An Android-smartphone application to early evaluate the number of grapevine berries by means of image analysis
Bakar et al. Rice leaf blast disease detection using multi-level colour image thresholding
Qing et al. Automated counting of rice planthoppers in paddy fields based on image processing
CN109961426B (en) Method for detecting skin of human face
CN108596102B (en) RGB-D-based indoor scene object segmentation classifier construction method
CN108563979B (en) Method for judging rice blast disease conditions based on aerial farmland images
CN107464249B (en) Sheep contactless body ruler measurement method
US10395091B2 (en) Image processing apparatus, image processing method, and storage medium identifying cell candidate area
Gujjar et al. A method for identification of basmati rice grain of india and its quality using pattern classification
CN110599507B (en) Tomato identification and positioning method and system
CN110310291A (en) A kind of rice blast hierarchy system and its method
CN105067532A (en) Method for identifying early-stage disease spots of sclerotinia sclerotiorum and botrytis of rape
Revathi et al. Homogenous segmentation based edge detection techniques for proficient identification of the cotton leaf spot diseases
Al-Tarawneh An empirical investigation of olive leave spot disease using auto-cropping segmentation and fuzzy C-means classification
Ji et al. In-field automatic detection of maize tassels using computer vision
CN114067207A (en) Vegetable seedling field weed detection method based on deep learning and image processing
CN111199192A (en) Method for detecting integral maturity of field red globe grapes by adopting parallel line sampling
Kurniawati et al. Texture analysis for diagnosing paddy disease
Patki et al. Cotton leaf disease detection & classification using multi SVM
CN111768455A (en) Image-based wood region and dominant color extraction method
CN111665199A (en) Wire and cable color detection and identification method based on machine vision
CN113129281B (en) Wheat stem section parameter detection method based on deep learning
Sibi Chakkaravarthy et al. Automatic leaf vein feature extraction for first degree veins
CN106096527A (en) A kind of recognition methods of real-time high-precision online bank note face amount

Legal Events

Date Code Title Description
FGI Letters patent sealed or granted (innovation patent)