CN113298777A - Cotton leaf blight detection method and system based on color features and super-pixel clustering - Google Patents

Cotton leaf blight detection method and system based on color features and super-pixel clustering

Info

Publication number
CN113298777A
CN113298777A (application CN202110558035.7A)
Authority
CN
China
Prior art keywords
leaf
cotton
area
clustering
super
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110558035.7A
Other languages
Chinese (zh)
Inventor
杨公平
肖桃
孙启玉
李广阵
宋成秀
褚德峰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shandong Fengshi Information Technology Co ltd
Shandong University
Original Assignee
Shandong Fengshi Information Technology Co ltd
Shandong University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shandong Fengshi Information Technology Co ltd, Shandong University
Priority to CN202110558035.7A
Publication of CN113298777A
Legal status: Pending (Current)

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/0002 Inspection of images, e.g. flaw detection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/10 Segmentation; Edge detection
    • G06T7/155 Segmentation; Edge detection involving morphological operators
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/60 Analysis of geometric attributes
    • G06T7/62 Analysis of geometric attributes of area, perimeter, diameter or volume
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/20 Image preprocessing
    • G06V10/26 Segmentation of patterns in the image field; Cutting or merging of image elements to establish the pattern region, e.g. clustering-based techniques; Detection of occlusion
    • G06V10/267 Segmentation of patterns in the image field; Cutting or merging of image elements to establish the pattern region, e.g. clustering-based techniques; Detection of occlusion by performing operations on regions, e.g. growing, shrinking or watersheds

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Geometry (AREA)
  • Quality & Reliability (AREA)
  • Image Analysis (AREA)

Abstract

The invention provides a cotton leaf blight detection method and system based on color features and super-pixel clustering, belonging to the technical field of crop disease detection and computer vision. The method clusters the original image with FCM, recombines the RGB components to remove non-green parts such as soil and impurities, and performs adaptive threshold segmentation based on the Otsu method. Morphological processing fills holes in the segmented image, removes burrs, petiole edges and isolated points, and smooths the outline of damaged leaves. Weed regions on the ground are handled by detecting the area of each connected region and removing those below a threshold. The superpixel algorithm then segments the extracted cotton leaf into dead-leaf and normal-leaf regions without having to consider edge segmentation. A threshold is set to detect the dead-leaf area of each superpixel block, and processing by area ratio greatly reduces the error caused by the threshold setting. Finally, the disease degree and detection result of the cotton leaf are given accurately, so the method has good practical application value.

Description

Cotton leaf blight detection method and system based on color features and super-pixel clustering
Technical Field
The invention belongs to the technical field of crop disease detection and computer vision, and particularly relates to a cotton leaf blight detection method and system based on color features and super-pixel clustering.
Background
The information in this background section is only for enhancement of understanding of the general background of the invention and is not necessarily to be construed as an admission or any form of suggestion that this information forms the prior art that is already known to a person of ordinary skill in the art.
In the agricultural field, plant diseases are numerous, and improper handling can seriously reduce the quantity and quality of agricultural products. Leaf blight is the most common disease during the growth of cotton; it harms cotton leaves and cotton bolls and causes huge economic losses every year, so detecting the disease early and preventing it effectively are very important for ensuring cotton yield. The traditional method of inspecting cotton plants manually consumes a large amount of time and money and is not suitable for the large-scale cotton planting industry.
In cotton leaf blight detection, automatically detecting the disease from plant leaf symptoms can eliminate a large amount of monitoring work in large planting fields and is a relatively inexpensive and more effective solution. However, the inventors found that a single traditional image segmentation method is strongly affected by factors such as illumination, shadow and weather, and is difficult to apply directly to cotton images captured in complex natural environments, which makes leaf blight detection more difficult.
Disclosure of Invention
Aiming at the defects of the prior art, the invention provides a cotton leaf blight detection method and system based on color features and super-pixel clustering. The disclosed method detects leaf blight by combining the color characteristics of cotton leaves with a super-pixel clustering algorithm, adapts well to cotton leaves in natural environments, and has good accuracy and robustness, so it has good practical application value.
The invention is realized by the following technical scheme:
in a first aspect of the invention, a cotton leaf blight detection method based on color features and super-pixel clustering is provided, which comprises the following steps:
clustering the acquired cotton leaf images so as to increase the discrimination of a foreground region and a background region to which the leaf parts belong in the images;
recombining and enhancing the green part area of the image by using the RGB components;
performing self-adaptive threshold segmentation processing on the image to obtain an initial segmentation image;
and performing morphological processing and connected region area processing on the initial segmentation image, performing superpixel segmentation, detecting each superpixel block according to a threshold value, and outputting a detection result.
In a second aspect of the present invention, there is provided a cotton leaf blight detection system based on color features and super-pixel clustering, the system comprising:
the cotton leaf image acquisition module is used for acquiring cotton leaf image data;
the clustering module is used for clustering the cotton images so as to increase the distinguishing degree of a foreground region and a background region to which the leaf parts in the images belong;
the RGB component recombination module is used for enhancing the green part area of the image;
the adaptive threshold segmentation processing module is used for obtaining an initial segmentation image;
a morphological treatment and connected area treatment module for eliminating elongated protrusions, removing burrs and petiole edges; filling holes on the blades and smoothing damaged blade edges; and removing the weed part;
the super-pixel segmentation module is used for segmenting and distinguishing a dead leaf region and a normal leaf region of the extracted cotton leaves without considering edge segmentation;
and the result output module is used for detecting the dead leaf area of each superpixel block by setting a threshold value, processing according to the area ratio and outputting the disease degree and the detection result of the cotton leaves.
In a third aspect of the invention, a non-transitory computer program product is provided, comprising computer executable code segments stored on a computer readable medium for performing the steps performed by the above-described color feature and super-pixel clustering based cotton leaf blight detection method.
In a fourth aspect of the present invention, a computer readable storage medium is provided for storing computer instructions, which when executed by a processor, perform the steps of the cotton leaf blight detection method based on color features and super-pixel clustering.
Meanwhile, it should be noted that the cotton leaf blight is only exemplified in the present invention, and other similar diseases of cotton and other plants are all within the protection scope of the present application based on the inventive concept of the present invention.
The beneficial technical effects of one or more technical schemes are as follows:
according to the technical scheme, the original images are clustered through the FCM, the discrimination of the foreground and the background of the images is enhanced, the dead leaf area is divided into the leaves, and meanwhile, most of factors influencing segmentation are removed. The RGB components are recombined to remove non-green parts such as soil, impurities and the like, self-adaptive threshold segmentation is carried out based on an Otsu method, and the cotton leaf area can be effectively segmented. Filling up the cavity of the segmented image by morphological processing, removing burrs, the edge of a petiole and isolated points, and smoothing the outline of the damaged blade. The area of the connected region is detected by the weed region part on the ground, and the problem can be effectively solved according to a threshold value. The superpixel algorithm distinguishes a dead leaf region and a normal leaf region for the extracted cotton leaf segmentation without considering edge segmentation. Setting a threshold value to detect a dead leaf area of each superpixel block, and processing according to the area ratio to greatly reduce errors caused by threshold value setting. And finally, accurately giving the disease degree and the detection result of the cotton leaves.
In conclusion, the technical scheme is suitable for cotton leaves collected in natural environment, can accurately and effectively detect the leaf blight and has high practical value.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, are included to provide a further understanding of the invention; they illustrate exemplary embodiments of the invention and together with the description serve to explain, and not to limit, the invention.
FIG. 1 is a flow chart of the cotton leaf blight detection process of the present invention.
Detailed Description
It is to be understood that the following detailed description is exemplary and is intended to provide further explanation of the invention as claimed. Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs.
It is noted that the terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of exemplary embodiments according to the invention. As used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that when the terms "comprises" and/or "comprising" are used in this specification, they specify the presence of the stated features, steps, operations, devices, components, and/or combinations thereof. It is to be understood that the scope of the invention is not limited to the specific embodiments described below, and that the terminology used in the examples is for describing particular embodiments only and is not intended to limit the scope of the present invention.
Components that can be used to perform the disclosed methods and systems are disclosed. These and other components are disclosed herein, and it is understood that when combinations, subsets, interactions, groups, etc. of these components are disclosed, even though specific reference to each individual and collective combination and permutation may not be made explicitly, each is specifically contemplated and described herein for all methods and systems. This applies to all aspects of the present application including, but not limited to, steps in the disclosed methods. Thus, if a variety of additional steps can be performed, it is understood that each of these additional steps can be performed with any particular embodiment or combination of embodiments of the disclosed methods.
As mentioned earlier, automatically detecting the disease from plant leaf symptoms in cotton leaf blight detection can eliminate a large amount of monitoring work in large planting fields and is a relatively inexpensive and more effective solution. The quality of the diseased-leaf segmentation directly affects the reliability and accuracy of plant disease detection; however, a single traditional image segmentation method is strongly affected by factors such as illumination, shadow and weather, and is difficult to apply directly to cotton images captured in complex natural environments, which makes leaf blight detection more difficult.
In view of the above, in one embodiment of the present invention, a cotton leaf blight detection method based on color features and super-pixel clustering is provided, which includes:
clustering the acquired cotton leaf images so as to increase the discrimination of a foreground region and a background region to which the leaf parts belong in the images;
recombining and enhancing the green part area of the image by using the RGB components;
performing self-adaptive threshold segmentation processing on the image to obtain an initial segmentation image;
and performing morphological processing and connected region area processing on the initial segmentation image, performing superpixel segmentation, detecting each superpixel block according to a threshold value, and outputting a detection result.
In one embodiment of the invention, the clustering employs a fuzzy C-means clustering algorithm (FCM), which is a partition-based algorithm, the basic idea being to maximize the similarity between objects partitioned into the same cluster and minimize the similarity between different clusters. The fuzzy C-means algorithm is an improvement of a common C-means algorithm, the probability that a sample belongs to a certain class is represented by using the membership degree, and the obtained clustering result is more flexible.
In a specific embodiment of the invention, the number of clusters is set to 5 and the parameter m to 2; the final clustering result distinguishes the cotton leaf well from the background, while the dead-leaf parts and the normal-leaf parts are grouped into one class, which is very beneficial for the subsequent leaf extraction.
The RGB component recombination specifically comprises: recombining the RGB components into the super-green color index ExG minus the super-red color index ExR, i.e. ExG - ExR. This works well for extracting green plant images: shadows, hay and soil are significantly suppressed, and the green parts of the image become more prominent.
The adaptive threshold segmentation is performed with the Otsu adaptive threshold algorithm. The algorithm divides the image into a background part and a target part according to the gray-level characteristics of the image and selects the segmentation threshold automatically; no other parameters need to be set manually, so it is simple to implement and stable in performance. It can segment the cotton leaf from the image accurately and effectively.
The morphological processing and connected-region area processing specifically comprise applying a morphological opening operation, a morphological closing operation and an area threshold;
wherein the opening operation eliminates elongated protrusions and removes burrs and petiole edges, and the closing operation effectively fills holes in the leaf while smoothing damaged leaf edges. To make the effect more obvious, the number of iterations selected in the invention is 3.
Ground weeds occupy a much smaller area in the segmented image than the leaf, so they are handled by setting an area threshold. Specifically, the connected regions in the entire segmented image are detected; a region whose area is smaller than the threshold is judged to be weed and is removed from the segmented image. The area of the final processing result is then calculated, which is the leaf area in the original image.
The superpixel segmentation adopts the simple linear iterative clustering (SLIC) method, which converts the color image into the CIELAB color space and, together with the XY coordinates, forms 5-dimensional feature vectors, then constructs a distance measure over these vectors to cluster the image pixels locally. Segmenting the extracted leaf with the SLIC algorithm produces compact, approximately uniform superpixels, and the algorithm is fast and preserves object contours well.
The specific method of detecting each superpixel block according to a threshold and outputting the detection result is as follows: the superpixel segmentation algorithm is executed with the size of each superpixel set in advance, so that the leaf extracted from the original image is segmented into a number of superpixel blocks whose pixels share the same or similar color, brightness and texture characteristics and which also follow object edges well. Each superpixel block is taken out and converted to the HSV color space; a dead-leaf threshold range is set, and a pixel that falls within it is detected as part of a dead leaf.
The proportion of dead-leaf area in each superpixel block is then examined: if it is below 20%, the block is discarded; if it is above 80%, the whole block is detected as dead leaf; for blocks in between, only the part detected as dead leaf is kept. This effectively avoids errors caused by the threshold setting. Finally, the dead-leaf regions of the whole cotton leaf are aggregated, the outline of the diseased region is drawn on the original image, and the disease degree of the leaf is given, calculated as the ratio of the dead-leaf area to the whole leaf (excluding the background).
In one embodiment of the present invention, the preset size of one superpixel is 500; the dead-leaf threshold range is set from (32, 25, 25) to (78, 255, 255) in the HSV color space.
In one embodiment of the invention, a cotton leaf blight detection system based on color features and super-pixel clustering is provided, the system comprising:
the cotton leaf image acquisition module is used for acquiring cotton leaf image data;
the clustering module is used for clustering the cotton images so as to increase the distinguishing degree of a foreground region and a background region to which the leaf parts in the images belong;
the RGB component recombination module is used for enhancing the green part area of the image;
the adaptive threshold segmentation processing module is used for obtaining an initial segmentation image;
a morphological treatment and connected area treatment module for eliminating elongated protrusions, removing burrs and petiole edges; filling holes on the blades and smoothing damaged blade edges; and removing the weed part;
the super-pixel segmentation module is used for segmenting and distinguishing a dead leaf region and a normal leaf region of the extracted cotton leaves without considering edge segmentation;
and the result output module is used for detecting the dead leaf area of each superpixel block by setting a threshold value, processing according to the area ratio and outputting the disease degree and the detection result of the cotton leaves.
In a specific embodiment of the present invention, a non-transitory computer program product is provided, comprising computer executable code segments stored on a computer readable medium for performing the steps performed by the color feature and super pixel clustering based cotton leaf blight detection method described above.
In one embodiment of the present invention, a computer-readable storage medium is provided for storing computer instructions, which when executed by a processor, perform the steps of the cotton leaf blight detection method based on color features and super-pixel clustering.
The invention is further illustrated by the following examples, which are not to be construed as limiting the invention thereto. It should be understood that these examples are for illustrative purposes only and are not intended to limit the scope of the present invention.
Example 1
A cotton leaf blight detection method based on color features and super-pixel clustering comprises the following steps:
1. and (3) clustering the cotton leaves by using a fuzzy C-means clustering algorithm, and increasing the distinguishing degree of the leaves and the background in the image.
Cotton images in natural environments have a lot of interference factors such as illumination, shielding and shadow, and results obtained by directly applying a segmentation algorithm are usually poor. The leaves to be detected are extracted in advance, and the irrelevant background is removed and then the leaves are segmented, so that the segmentation accuracy can be effectively improved.
The fuzzy C-means clustering algorithm (FCM) is a partition-based algorithm, and the basic idea is to maximize the similarity between objects partitioned into the same cluster, and minimize the similarity between different clusters. The fuzzy C-means algorithm is an improvement of a common C-means algorithm, the probability that a sample belongs to a certain class is represented by using the membership degree, and the obtained clustering result is more flexible. According to the method, the number of clusters is set to be 5, the parameter m is set to be 2, the cotton leaves can be well distinguished from the background by the final clustering result, and meanwhile, the dead leaf part and the normal leaf part are classified into one type, so that the method is very beneficial to the extraction of the following leaves.
FCM realizes clustering by minimizing the objective function J_m in equation (1) subject to the constraint condition in equation (2):
J_m = Σ_{i=1}^{c} Σ_{j=1}^{n} u_ij^m ||x_j - c_i||^2   (1)
Σ_{i=1}^{c} u_ij = 1, u_ij ∈ [0, 1], j = 1, 2, ..., n   (2)
where c is the number of cluster centers and n is the number of samples, m is the fuzzy weighting exponent that controls how strongly a sample is attached to a class, u_ij is the degree of membership of sample j to class i, x_j is the j-th sample, c_i is the center of cluster i, and ||·|| denotes the Euclidean distance.
The specific implementation flow of the FCM algorithm is as follows:
Step 1: Initialize the membership matrix U with random values between 0 and 1 so that it satisfies the constraint condition in equation (2), i.e. the memberships of each sample to all classes sum to 1; set the number of clusters c and the parameter m.
Step 2: Calculate the c cluster centers c_i from the membership matrix U:
c_i = Σ_{j=1}^{n} u_ij^m x_j / Σ_{j=1}^{n} u_ij^m   (3)
Step 3: Update the membership values u_ij according to the obtained cluster centers:
u_ij = 1 / Σ_{k=1}^{c} (||x_j - c_i|| / ||x_j - c_k||)^(2/(m-1))   (4)
Step 4: Compare the membership matrices of two successive iterations using a matrix norm; if the given threshold condition is met, stop the iteration, otherwise loop over Steps 2 and 3, iteratively recomputing the memberships u_ij and cluster centers c_i until the termination condition is satisfied and the optimum is reached.
Termination condition of the iteration:
||U^(k+1) - U^k|| < ε   (5)
where k is the iteration step and ε is the error threshold.
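For illustration only, the following Python sketch (not part of the original disclosure; the function name fcm, the tolerance eps and the iteration cap are assumptions) implements the FCM iteration of Steps 1-4 and equations (1)-(5) with NumPy:

```python
import numpy as np

def fcm(pixels, n_clusters=5, m=2.0, eps=1e-5, max_iter=100):
    """Fuzzy C-means on an (n_samples, n_features) array of pixel values."""
    n = pixels.shape[0]
    # Step 1: random membership matrix U whose columns sum to 1 (constraint (2))
    u = np.random.rand(n_clusters, n)
    u /= u.sum(axis=0, keepdims=True)
    for _ in range(max_iter):
        um = u ** m
        # Step 2: cluster centers, equation (3)
        centers = um @ pixels / um.sum(axis=1, keepdims=True)
        # Step 3: membership update, equation (4)
        dist = np.linalg.norm(pixels[None, :, :] - centers[:, None, :], axis=2)
        dist = np.fmax(dist, 1e-10)                 # avoid division by zero
        inv = dist ** (-2.0 / (m - 1.0))
        u_new = inv / inv.sum(axis=0, keepdims=True)
        # Termination condition, equation (5)
        if np.linalg.norm(u_new - u) < eps:
            u = u_new
            break
        u = u_new
    return centers, u

# Usage sketch: cluster an (H, W, 3) RGB image into 5 classes with m = 2
# pixels = img.reshape(-1, 3).astype(float)
# centers, u = fcm(pixels, n_clusters=5, m=2.0)
# labels = u.argmax(axis=0).reshape(img.shape[:2])
```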
2. Recombine the RGB components into the super-green index ExG minus the super-red index ExR to enhance the contrast between the green leaf parts and the background.
In the clustered image, the distinction between some green areas and the background is still not obvious and some irrelevant regions remain; this is resolved by RGB component recombination. An RGB color image is separated into three independent primary color planes R, G and B, and each pixel in the image is transformed through different combinations of these color features, which enhances the contrast between the target crop and the background. The RGB components adopted in the invention are recombined into the super-green color index ExG minus the super-red color index ExR, i.e. ExG - ExR. This works well for extracting green plant images: shadows, hay and soil are significantly suppressed, and the green parts of the image become more prominent.
ExG=2G-R-B (6)
ExR=1.4R-G-B (7)
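A minimal sketch of this recombination, assuming an OpenCV-style BGR uint8 input image (the function name exg_minus_exr and the rescaling to 0-255 are illustrative choices, not specified by the original text):

```python
import numpy as np

def exg_minus_exr(img_bgr):
    """Compute ExG - ExR (equations (6) and (7)) from a BGR uint8 image."""
    b, g, r = [img_bgr[:, :, i].astype(np.float32) for i in range(3)]
    exg = 2.0 * g - r - b          # super-green index, equation (6)
    exr = 1.4 * r - g - b          # super-red index, equation (7)
    diff = exg - exr
    # Rescale to 0-255 so the result can feed the Otsu step that follows
    diff = (diff - diff.min()) / max(float(diff.max() - diff.min()), 1e-6) * 255.0
    return diff.astype(np.uint8)
```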
3. Perform adaptive threshold segmentation on the image based on the Otsu method to obtain the initial segmented image.
To segment the ExG - ExR image obtained in the previous step, one could judge whether each pixel belongs to the leaf or the background by setting a fixed threshold. The drawback is that no fixed threshold suits all images, so the segmentation quality varies from image to image. The Otsu adaptive threshold algorithm is therefore used: it divides the image into a background part and a target part according to the gray-level characteristics of the image and selects the threshold automatically, without other manually set parameters, so it is simple to implement and stable in performance. It can segment the cotton leaf from the image accurately and effectively.
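As a sketch of this step, the Otsu threshold can be obtained with OpenCV (an assumed dependency; any equivalent implementation of the Otsu method applies):

```python
import cv2

def otsu_segment(gray):
    """Binarize the ExG - ExR image with an automatically selected Otsu threshold."""
    # THRESH_OTSU ignores the supplied threshold (0) and derives it from the histogram
    _, mask = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    return mask
```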
4. Process the holes and isolated points of the segmented image with morphological operations and remove connected regions with small area.
The background of a cotton image taken in the natural environment usually contains various weeds, which are irregularly distributed and occupy a certain portion of the image. Because the color of the weeds is very similar to that of the leaves, they cannot be removed by the clustering and RGB component recombination during preprocessing, so the Otsu algorithm mistakes them for leaves and they also occupy part of the initial segmentation image; they therefore need to be handled. The morphological opening operation eliminates long, thin protrusions and removes burrs and petiole edges. The closing operation effectively fills holes in the leaf while smoothing damaged leaf edges. To make the effect more pronounced, the number of iterations is chosen to be 3. Ground weeds occupy a much smaller area in the segmented image than the leaf, so they are handled by setting an area threshold: the connected regions in the whole segmented image are detected, and any region whose area is smaller than the threshold is removed as weed. The area of the final result is then calculated, which is the leaf area in the original image.
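A sketch of the morphological cleanup and connected-region filtering described above, assuming OpenCV; the kernel shape and the area threshold min_area are illustrative values not given in the original text, while the 3 iterations follow the description:

```python
import cv2
import numpy as np

def clean_mask(mask, min_area=5000):
    """Open/close the binary mask (3 iterations each) and drop small connected regions."""
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
    opened = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel, iterations=3)     # burrs, petiole edges
    closed = cv2.morphologyEx(opened, cv2.MORPH_CLOSE, kernel, iterations=3)  # fill holes, smooth edges
    # Remove weed regions whose connected-component area is below the threshold
    n, labels, stats, _ = cv2.connectedComponentsWithStats(closed, connectivity=8)
    cleaned = np.zeros_like(closed)
    for i in range(1, n):                        # label 0 is the background
        if stats[i, cv2.CC_STAT_AREA] >= min_area:
            cleaned[labels == i] = 255
    leaf_area = int(np.count_nonzero(cleaned))   # leaf area in pixels
    return cleaned, leaf_area
```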
5. Extract the leaf part according to the segmented image and execute the superpixel segmentation algorithm.
The leaf portion is extracted by applying the processed segmentation mask to the original image, while the background defaults to black. For convenience in the subsequent processing, the black background is replaced with green, so that the background is treated as a normal green part, which is more advantageous for further processing.
Superpixels reduce computational overhead by replacing the standard pixel grid: pixels are combined into primitive regions that are perceptually more meaningful than individual pixels, and removing irrelevant detail improves the performance of the segmentation algorithm. The invention selects the simple linear iterative clustering (SLIC) method, which is conceptually simple and easy to implement: the color image is converted into the CIELAB color space and, together with the XY coordinates, forms 5-dimensional feature vectors, and a distance measure is constructed over these vectors to cluster the image pixels locally. Segmenting the extracted leaf with the SLIC algorithm produces compact, approximately uniform superpixels, and the algorithm is fast and preserves object contours well.
The specific implementation flow of the SLIC algorithm is as follows:
Step 1: Initialize the algorithm: let (r, g, b) be the three color components of a pixel and (x, y) its two spatial coordinates, let n_tp be the total number of pixels in the image and n_sp the desired number of superpixels. The image is sampled on a regular grid with a spacing of s units, where s = [n_tp/n_sp]^(1/2).
The image is sampled with the grid step s and the initial superpixel cluster centers are computed:
m_i = [r_i, g_i, b_i, x_i, y_i]^T, i = 1, 2, ..., n_sp   (8)
Each cluster center is moved to the position of minimum gradient in its 3x3 neighborhood, and for each pixel position p in the image the label L(p) = -1 and the distance d(p) = ∞ are set.
Step 2: Assign samples to cluster centers: for each cluster center m_i, i = 1, 2, ..., n_sp, compute the distance D_i(p) between m_i and each pixel p in a 2sx2s neighborhood about m_i. Then, for each p and i = 1, 2, ..., n_sp, if D_i < d(p), set d(p) = D_i and L(p) = i.
Step 3: Update the cluster centers: let C_i denote the set of pixels in the image with label L(p) = i, and update m_i:
m_i = (1/|C_i|) Σ_{z ∈ C_i} z   (9)
where |C_i| is the number of pixels in set C_i and z = [r, g, b, x, y]^T.
Step 4: Test for convergence: compute the Euclidean norms of the differences between the mean vectors of the current and previous steps, and compute the residual E as the sum of the n_sp norms. If E < T, where T is a specified non-negative threshold, go to Step 5; otherwise return to Step 2.
Step 5: Post-process the superpixel regions: in each region C_i, replace all the pixels by their mean value m_i.
SLIC superpixels correspond to clusters in a space whose coordinates are color and spatial variables, so the spatial distance and the color distance have to be treated separately: the distances of the individual components are first normalized and then combined into a single measure. Let d_c and d_s be, respectively, the color distance and the spatial Euclidean distance between two points of a cluster:
d_c = [(r_j - r_i)^2 + (g_j - g_i)^2 + (b_j - b_i)^2]^(1/2)   (10)
d_s = [(x_j - x_i)^2 + (y_j - y_i)^2]^(1/2)   (11)
The composite distance D is then defined as:
D = [(d_c/d_cm)^2 + (d_s/d_sm)^2]^(1/2)   (12)
where d_cm and d_sm are the maximum expected values of d_c and d_s. The maximum spatial distance should correspond to the sampling interval, i.e. d_sm = s = [n_tp/n_sp]^(1/2). The maximum color distance can vary from cluster to cluster and from image to image, so d_cm is set to a constant c, and equation (12) becomes:
D = [(d_c/c)^2 + (d_s/s)^2]^(1/2)   (13)
In three dimensions superpixels become supervoxels, and d_s is defined as:
d_s = [(x_j - x_i)^2 + (y_j - y_i)^2 + (z_j - z_i)^2]^(1/2)   (14)
where z is the coordinate in the third spatial direction, and a third spatial variable is added to the vector in equation (9), i.e. z = [r, g, b, x, y, z]^T.
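For illustration, the superpixel step can be realized with scikit-image's SLIC implementation instead of re-implementing Steps 1-5 (an assumed dependency; the region_size parameter mirrors the size-500 superpixels used in step 6):

```python
from skimage.segmentation import slic

def superpixel_labels(img_rgb, region_size=500, compactness=10.0):
    """Run SLIC so that each superpixel covers roughly region_size pixels."""
    h, w = img_rgb.shape[:2]
    n_segments = max(1, (h * w) // region_size)
    # slic converts to CIELAB internally and clusters the 5-D (L, a, b, x, y) vectors
    return slic(img_rgb, n_segments=n_segments, compactness=compactness, start_label=0)
```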
6. Detect whether each superpixel block is a diseased region and give the degree of the leaf disease.
By executing the superpixel segmentation algorithm with the size of each superpixel set in advance (500 in the present invention), the leaf extracted from the original image is segmented into a number of superpixel blocks whose pixels share the same or similar color, brightness and texture characteristics and which also follow object edges well. Each superpixel block is taken out and converted to the HSV color space. The dead-leaf threshold range is set to (32, 25, 25)-(78, 255, 255); a pixel whose value falls within this range is detected as part of a dead leaf.
The proportion of dead-leaf area in each superpixel block is then examined: if it is below 20%, the block is discarded; if it is above 80%, the whole block is detected as dead leaf; for blocks in between, only the part detected as dead leaf is kept. This effectively avoids errors caused by the threshold setting. Finally, the dead-leaf regions of the whole cotton leaf are aggregated, the outline of the diseased region is drawn on the original image, and the disease degree of the leaf is given, calculated as the ratio of the dead-leaf area to the whole leaf (excluding the background).
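A sketch of this per-superpixel decision rule, assuming OpenCV HSV conventions (H in 0-179, S and V in 0-255) and the label image from the SLIC step; the helper names and the boolean mask representation are illustrative:

```python
import cv2
import numpy as np

DEAD_LEAF_LOW = np.array((32, 25, 25), dtype=np.uint8)
DEAD_LEAF_HIGH = np.array((78, 255, 255), dtype=np.uint8)

def dead_leaf_mask(img_rgb, labels, low_ratio=0.2, high_ratio=0.8):
    """Classify each superpixel block and aggregate the dead-leaf mask."""
    hsv = cv2.cvtColor(img_rgb, cv2.COLOR_RGB2HSV)
    in_range = cv2.inRange(hsv, DEAD_LEAF_LOW, DEAD_LEAF_HIGH) > 0
    result = np.zeros(labels.shape, dtype=bool)
    for lab in np.unique(labels):
        block = labels == lab
        ratio = in_range[block].mean()          # fraction of dead-leaf pixels in the block
        if ratio < low_ratio:
            continue                            # below 20%: discard the block
        elif ratio > high_ratio:
            result[block] = True                # above 80%: whole block is dead leaf
        else:
            result[block] = in_range[block]     # in between: keep only the detected pixels
    return result

# Disease degree sketch: ratio of dead-leaf area to the whole leaf area (background excluded)
# degree = dead_leaf_mask(img_rgb, labels).sum() / leaf_area
```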
Example 2
A cotton leaf blight detection system based on color features and superpixel clustering, the system comprising:
the cotton leaf image acquisition module is used for acquiring cotton leaf image data;
the clustering module is used for clustering the cotton images so as to increase the distinguishing degree of a foreground region and a background region to which the leaf parts in the images belong;
the RGB component recombination module is used for enhancing the green part area of the image;
the adaptive threshold segmentation processing module is used for obtaining an initial segmentation image;
a morphological treatment and connected area treatment module for eliminating elongated protrusions, removing burrs and petiole edges; filling holes on the blades and smoothing damaged blade edges; and removing the weed part;
the super-pixel segmentation module is used for segmenting and distinguishing a dead leaf region and a normal leaf region of the extracted cotton leaves without considering edge segmentation;
and the result output module is used for detecting the dead leaf area of each superpixel block by setting a threshold value, processing according to the area ratio and outputting the disease degree and the detection result of the cotton leaves.
Example 3
An electronic device includes a memory, a processor, and computer instructions stored in the memory and executable on the processor; when executed by the processor, the computer instructions complete the operations of the method in Example 1, which for brevity are not described again here.
The electronic device may be a mobile terminal and a non-mobile terminal, the non-mobile terminal includes a desktop computer, and the mobile terminal includes a Smart Phone (such as an Android Phone and an IOS Phone), Smart glasses, a Smart watch, a Smart bracelet, a tablet computer, a notebook computer, a personal digital assistant, and other mobile internet devices capable of performing wireless communication.
It is to be understood that in the present invention, the processor may be a central processing unit CPU, but may also be other general purpose processors, digital signal processors DSP, application specific integrated circuits ASIC, off-the-shelf programmable gate arrays FPGA or other programmable logic devices, discrete gate or transistor logic devices, discrete hardware components, etc. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
The memory may include both read-only memory and random access memory, and may provide instructions and data to the processor, and a portion of the memory may also include non-volatile random access memory. For example, the memory may also store device type information.
In implementation, the steps of the above method may be performed by integrated logic circuits of hardware in a processor or instructions in the form of software. The steps of a method disclosed in connection with the present invention may be embodied directly in a hardware processor, or in a combination of the hardware and software modules within the processor. The software modules may be located in ram, flash, rom, prom, or eprom, registers, among other storage media as is well known in the art. The storage medium is located in a memory, and a processor reads information in the memory and completes the steps of the method in combination with hardware of the processor. To avoid repetition, it is not described in detail here. Those of ordinary skill in the art will appreciate that the various illustrative elements, i.e., algorithm steps, described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described systems, apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the several embodiments provided in the present application, it should be understood that the disclosed system, apparatus and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the units is merely a division of one logic function, and there may be other divisions when actually implemented, for example, a plurality of units or components may be combined or may be integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection of devices or units through some interfaces, and may be in an electrical, mechanical or other form.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present application or portions thereof that substantially contribute to the prior art may be embodied in the form of a software product stored in a storage medium and including instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
It should be noted that the above examples are only used to illustrate the technical solutions of the present invention and not to limit them. Although the present invention has been described in detail with reference to the examples given, those skilled in the art can modify the technical solution of the present invention as needed or equivalent substitutions without departing from the spirit and scope of the technical solution of the present invention.

Claims (10)

1. A cotton leaf blight detection method based on color features and super-pixel clustering is characterized by comprising the following steps:
clustering the obtained cotton leaf images;
recombining and enhancing the green part area of the image by using the RGB components;
performing self-adaptive threshold segmentation processing on the image to obtain an initial segmentation image;
and performing morphological processing and connected region area processing on the initial segmentation image, performing superpixel segmentation, detecting each superpixel block according to a threshold value, and outputting a detection result.
2. The cotton leaf blight detection method according to claim 1, wherein the clustering employs a fuzzy C-means clustering algorithm; preferably, the clustering is performed by setting the number of clusters to 5 and the parameter m to 2.
3. The method for detecting cotton leaf blight according to claim 1, wherein the RGB component recombination specifically comprises: recombining the RGB components into the super-green color index ExG minus the super-red color index ExR, i.e. ExG-ExR.
4. The method for detecting cotton leaf blight according to claim 1, wherein the adaptive threshold segmentation process is performed using an Otsu adaptive threshold algorithm.
5. The method for detecting cotton leaf blight according to claim 1, wherein the morphological treatment and the area treatment of the connected region specifically comprise performing morphological opening and closing operations and setting an area threshold value;
preferably, the opening operation is used for eliminating slender protrusions, removing burrs and petiole edges;
preferably, the closing operation is used for filling holes in the blade and smoothing damaged blade edges;
preferably, an area threshold is set for removing ground weeds; specifically, the connected regions in the whole segmented image are detected, and any region whose area is smaller than the threshold is removed as a weed part.
6. The method for detecting cotton leaf blight according to claim 1, wherein the superpixel segmentation adopts a simple linear iterative clustering (SLIC) method.
7. The cotton leaf blight detection method according to claim 1, wherein the specific method of detecting each superpixel block according to a threshold and outputting a detection result comprises: executing a superpixel segmentation algorithm with the size of each superpixel set in advance, so that the leaf extracted from the original image is segmented into a plurality of superpixel blocks whose pixels have the same or similar color, brightness and texture characteristics and which follow object edges well; taking out each superpixel block and converting it into the HSV color space; setting a dead-leaf threshold range, and detecting a pixel as a dead-leaf part if it falls within the range;
meanwhile, detecting the proportion occupied by the dead-leaf area of each superpixel block: if the proportion is less than 20%, discarding the superpixel block; if the proportion is more than 80%, detecting the whole superpixel block as a dead-leaf part; for superpixel blocks in between, keeping only the part detected as dead leaf;
preferably, the preset size of one super pixel is 500;
preferably, the set dead leaf threshold range is (32,25,25), (78,255,255).
8. A cotton leaf blight detection system based on color features and super-pixel clustering, the system comprising:
the cotton leaf image acquisition module is used for acquiring cotton leaf image data;
the clustering module is used for clustering the cotton images so as to increase the distinguishing degree of a foreground region and a background region to which the leaf parts in the images belong;
the RGB component recombination module is used for enhancing the green part area of the image;
the adaptive threshold segmentation processing module is used for obtaining an initial segmentation image;
a morphological treatment and connected area treatment module for eliminating elongated protrusions, removing burrs and petiole edges; filling holes on the blades and smoothing damaged blade edges; and removing the weed part;
the super-pixel segmentation module is used for segmenting and distinguishing a dead leaf area and a normal leaf area for the extracted cotton leaves;
and the result output module is used for detecting the dead leaf area of each superpixel block by setting a threshold value, processing according to the area ratio and outputting the disease degree and the detection result of the cotton leaves.
9. A non-transitory computer program product comprising computer executable code segments stored on a computer readable medium for performing the steps of the color feature and super pixel clustering based cotton leaf blight detection method of any one of claims 1-7.
10. A computer readable storage medium for storing computer instructions which, when executed by a processor, perform the steps of the method for detecting cotton leaf blight based on color features and super-pixel clustering according to any one of claims 1 to 7.
CN202110558035.7A 2021-05-21 2021-05-21 Cotton leaf blight detection method and system based on color features and super-pixel clustering Pending CN113298777A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110558035.7A CN113298777A (en) 2021-05-21 2021-05-21 Cotton leaf blight detection method and system based on color features and super-pixel clustering

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110558035.7A CN113298777A (en) 2021-05-21 2021-05-21 Cotton leaf blight detection method and system based on color features and super-pixel clustering

Publications (1)

Publication Number Publication Date
CN113298777A true CN113298777A (en) 2021-08-24

Family

ID=77323677

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110558035.7A Pending CN113298777A (en) 2021-05-21 2021-05-21 Cotton leaf blight detection method and system based on color features and super-pixel clustering

Country Status (1)

Country Link
CN (1) CN113298777A (en)

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113744161A (en) * 2021-09-16 2021-12-03 北京顺势兄弟科技有限公司 Enhanced data acquisition method and device, data enhancement method and electronic equipment
CN114332199A (en) * 2021-12-23 2022-04-12 福州大学 Single-leaf area detection method and system suitable for healthy leaves and diseased leaves
CN115861988A (en) * 2023-03-01 2023-03-28 四川省农业机械研究设计院 Tea leaf picking method and system based on RGB discrimination
CN116049121A (en) * 2023-03-06 2023-05-02 睿至科技集团有限公司 Sharing method and system for energy data of Internet of things
CN117237298A (en) * 2023-09-15 2023-12-15 广州乾丰印花有限公司 Printed fabric defect inspection method, device and computing equipment
CN117876646A (en) * 2024-03-11 2024-04-12 陕西仙喜辣木茯茶有限公司 Fuzhuan tea flowering image acquisition method
CN117953491A (en) * 2024-03-27 2024-04-30 泰安市农业科学院(山东省农业科学院泰安市分院) Leaf vegetable disease diagnosis method and system

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106127735A (en) * 2016-06-14 2016-11-16 中国农业大学 A kind of facilities vegetable edge clear class blade face scab dividing method and device
CN110120042A (en) * 2019-05-13 2019-08-13 哈尔滨工业大学 A kind of crop map based on SLIC super-pixel and automatic threshold segmentation is as pest and disease damage method for extracting region
CN111681253A (en) * 2020-06-09 2020-09-18 山东大学 Leaf image segmentation method and system based on color and morphological characteristics
US20200320682A1 (en) * 2016-05-13 2020-10-08 Basf Se System and Method for Detecting Plant Diseases

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20200320682A1 (en) * 2016-05-13 2020-10-08 Basf Se System and Method for Detecting Plant Diseases
CN106127735A (en) * 2016-06-14 2016-11-16 中国农业大学 A kind of facilities vegetable edge clear class blade face scab dividing method and device
CN110120042A (en) * 2019-05-13 2019-08-13 哈尔滨工业大学 A kind of crop map based on SLIC super-pixel and automatic threshold segmentation is as pest and disease damage method for extracting region
CN111681253A (en) * 2020-06-09 2020-09-18 山东大学 Leaf image segmentation method and system based on color and morphological characteristics

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
XUEBING BAI: "A fuzzy clustering segmentation method based on neighborhood grayscale information for defining cucumber leaf spot disease images", 《COMPUTERS AND ELECTRONICS IN AGRICULTURE》 *
XUEBING BAI: "A three-dimensional threshold algorithm based on histogram reconstruction and dimensionality reduction for registering cucumber powdery mildew", 《COMPUTERS AND ELECTRONICS IN AGRICULTURE》 *
张建华 et al.: "Segmentation of adhering lesions on cotton leaves using an improved adaptive watershed method", 《农业工程学报》 (Transactions of the Chinese Society of Agricultural Engineering) *
许帅涛: "Segmentation and classification of early tobacco leaf lesions based on computer vision", 《中国优秀博硕士论文全文数据库(硕士) 农业科技辑》 (China Master's Theses Full-text Database, Agricultural Science and Technology) *

Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113744161B (en) * 2021-09-16 2024-03-29 北京顺势兄弟科技有限公司 Enhanced data acquisition method and device, data enhancement method and electronic equipment
CN113744161A (en) * 2021-09-16 2021-12-03 北京顺势兄弟科技有限公司 Enhanced data acquisition method and device, data enhancement method and electronic equipment
CN114332199A (en) * 2021-12-23 2022-04-12 福州大学 Single-leaf area detection method and system suitable for healthy leaves and diseased leaves
CN115861988A (en) * 2023-03-01 2023-03-28 四川省农业机械研究设计院 Tea leaf picking method and system based on RGB discrimination
CN115861988B (en) * 2023-03-01 2023-05-09 四川省农业机械研究设计院 Tea picking method and system based on RGB (red, green and blue) distinction degree
CN116049121A (en) * 2023-03-06 2023-05-02 睿至科技集团有限公司 Sharing method and system for energy data of Internet of things
CN116049121B (en) * 2023-03-06 2023-08-01 睿至科技集团有限公司 Sharing method and system for energy data of Internet of things
CN117237298A (en) * 2023-09-15 2023-12-15 广州乾丰印花有限公司 Printed fabric defect inspection method, device and computing equipment
CN117237298B (en) * 2023-09-15 2024-05-14 广州乾丰印花有限公司 Printed fabric defect inspection method, device and computing equipment
CN117876646A (en) * 2024-03-11 2024-04-12 陕西仙喜辣木茯茶有限公司 Fuzhuan tea flowering image acquisition method
CN117876646B (en) * 2024-03-11 2024-05-28 陕西仙喜辣木茯茶有限公司 Fuzhuan tea flowering image acquisition method
CN117953491A (en) * 2024-03-27 2024-04-30 泰安市农业科学院(山东省农业科学院泰安市分院) Leaf vegetable disease diagnosis method and system
CN117953491B (en) * 2024-03-27 2024-06-04 泰安市农业科学院(山东省农业科学院泰安市分院) Leaf vegetable disease diagnosis method and system

Similar Documents

Publication Publication Date Title
CN113298777A (en) Cotton leaf blight detection method and system based on color features and super-pixel clustering
Dias et al. Multispecies fruit flower detection using a refined semantic segmentation network
CN109596634B (en) Cable defect detection method and device, storage medium and processor
CN109636784B (en) Image saliency target detection method based on maximum neighborhood and super-pixel segmentation
CN110120042B (en) Crop image pest and disease damage area extraction method based on SLIC super-pixel and automatic threshold segmentation
CN109978848B (en) Method for detecting hard exudation in fundus image based on multi-light-source color constancy model
CN111259925B (en) K-means clustering and width mutation algorithm-based field wheat spike counting method
Masood et al. Plants disease segmentation using image processing
CN109871900A (en) The recognition positioning method of apple under a kind of complex background based on image procossing
US11880981B2 (en) Method and system for leaf age estimation based on morphological features extracted from segmented leaves
CN114677525B (en) Edge detection method based on binary image processing
CN115578660B (en) Land block segmentation method based on remote sensing image
CN110175650A (en) A kind of power equipment automatic identifying method and device
CN115731257A (en) Leaf form information extraction method based on image
Devi et al. Analysis of segmentation scheme for diseased rice leaves
Septiarini et al. Image processing techniques for tomato segmentation applying k-means clustering and edge detection approach
CN110544262A (en) cervical cell image segmentation method based on machine vision
Sibi Chakkaravarthy et al. Automatic leaf vein feature extraction for first degree veins
CN112419335B (en) Shape loss calculation method of cell nucleus segmentation network
CN113723314A (en) Sugarcane stem node identification method based on YOLOv3 algorithm
Shire et al. A review paper on: agricultural plant leaf disease detection using image processing
Zeng et al. Detecting and measuring fine roots in minirhizotron images using matched filtering and local entropy thresholding
CN114581660A (en) Plant leaf segmentation identification method and system
CN114119634A (en) Automatic building extraction method and system combining vegetation elimination and image feature consistency constraint
Patil et al. An advanced method for chilli plant disease detection using image processing

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication
RJ01 Rejection of invention patent application after publication

Application publication date: 20210824