CN112101265B - Robust crop disease diagnosis system - Google Patents

Robust crop disease diagnosis system

Info

Publication number
CN112101265B
CN112101265B CN202011005483.6A
Authority
CN
China
Prior art keywords
characteristic
convolution
crops
crop disease
local
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202011005483.6A
Other languages
Chinese (zh)
Other versions
CN112101265A (en
Inventor
雷印杰
陈浩楠
王浩
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sichuan University
Original Assignee
Sichuan University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sichuan University filed Critical Sichuan University
Priority to CN202011005483.6A priority Critical patent/CN112101265B/en
Publication of CN112101265A publication Critical patent/CN112101265A/en
Application granted granted Critical
Publication of CN112101265B publication Critical patent/CN112101265B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/20Scenes; Scene-specific elements in augmented reality scenes
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/25Fusion techniques
    • G06F18/253Fusion techniques of extracted features
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/045Combinations of networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/40Extraction of image or video features
    • G06V10/44Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02ATECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A40/00Adaptation technologies in agriculture, forestry, livestock or agroalimentary production
    • Y02A40/10Adaptation technologies in agriculture, forestry, livestock or agroalimentary production in agriculture

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • General Engineering & Computer Science (AREA)
  • Evolutionary Computation (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Molecular Biology (AREA)
  • Multimedia (AREA)
  • Software Systems (AREA)
  • Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Computational Linguistics (AREA)
  • General Health & Medical Sciences (AREA)
  • Mathematical Physics (AREA)
  • Computing Systems (AREA)
  • Evolutionary Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)
  • Investigating Or Analysing Biological Materials (AREA)

Abstract

The invention provides a robust crop disease diagnosis system comprising the following steps: S1, a user photographs the leaves of the crop to be examined with a smartphone to obtain leaf pictures of the crop; S2, the picture obtained in step S1 is processed with the Canny edge detection algorithm to obtain a binary image of the leaf edges; and S3, the picture obtained in step S1 is fed into the non-local spatial attention convolution branch of the pest and disease detection module for feature extraction to obtain a first feature. The system judges the health state of a plant from stem and leaf pictures captured by a camera using a computer algorithm and, if disease is present, diagnoses the disease type. The invention belongs to the technical field of computer vision and aims to realize a deep neural pest and disease detection model that runs on mobile devices with real-time performance and accuracy, replacing the existing manual identification approach.

Description

Robust crop disease diagnosis system
Technical Field
The invention relates to the technical field of computer vision, in particular to a robust crop disease diagnosis system.
Background
Agriculture is an important supporting industry. For a long time, yield losses in grain and cash crops caused by pests and diseases have seriously affected the sustainable development of regional agricultural economies. Crop pests and diseases are among the main agricultural disasters in China; timely diagnosis and treatment help prevent and control them, raise crop yields, and reduce economic losses. However, the variety of crop pests and diseases is large, and it is difficult for a layperson to diagnose a diseased plant accurately from its appearance alone.
Traditionally, growers diagnose crop pests and diseases directly by inspection. This is time-consuming and labor-intensive, and because the disease types are numerous and complex, direct diagnosis has a high error rate, so diseased crops may not be treated properly. By contrast, capturing pictures of diseased crops with a mobile phone camera and processing them with a computer algorithm can yield an accurate diagnosis quickly.
In recent years, with the continuous development of artificial intelligence, automatic detection and recognition technology has been applied increasingly to industrial production, public security, agricultural monitoring, and other fields. Because mobile devices do not tire, consume little power, and compute accurately, mobile-based automatic detection and recognition brings great convenience to people's lives. For crop identification, however, no high-accuracy automated mobile solution has yet been proposed internationally.
Disclosure of Invention
The main object of the invention is to provide a robust crop disease diagnosis system that effectively solves the problems identified in the background art.
To achieve the above object, the invention adopts the following technical solution:
a robust crop disease diagnostic system comprising the steps of:
S1, a user photographs the leaves of the crop to be examined with a smartphone to obtain leaf pictures of the crop;
S2, the picture obtained in step S1 is processed with the Canny edge detection algorithm to obtain a binary image of the leaf edges;
S3, the picture obtained in step S1 is fed into the non-local spatial attention convolution branch of the pest and disease detection module for feature extraction to obtain a first feature;
S4, the binary image obtained in step S2 is fed into the fast downsampling branch for feature extraction to obtain a second feature;
S5, the first feature obtained in step S3 and the second feature obtained in step S4 are fused with a 1×1 convolution to obtain a third feature;
S6, the third feature obtained in step S5 is processed with a bilinear pooling layer to obtain a fourth feature;
and S7, the fourth feature obtained in step S6 is processed with a fully connected layer to obtain the final classification vector.
Preferably, in step S1, the leaf surface occupies no fewer than 100×100 pixels in the photo, and the angle between the camera's shooting direction and the plane of the leaf surface is 90±10 degrees.
Preferably, the non-local spatial attention convolution branch in step S3 consists of a ResNet18 with the fully connected layer removed and a non-local spatial attention convolution layer; letting the input feature be X and the output feature be Y, the designed non-local spatial attention convolution layer is given by the following formulas:
nonlocal(X)=W_d·Softmax((W_a·X)^T·(W_b·X))·(W_c·X) (1)
spatial(X)=Sigmoid(W_e·concat(maxpool(X),avgpool(X))) (2)
Y=X+spatial(X)*X+nonlocal(X) (3)
where nonlocal(X) denotes the non-local attention, spatial(X) denotes the local spatial attention, W_a, W_b, W_c and W_d denote the weights of 1×1 convolutions, W_e denotes the weight of a 7×7 convolution, concat denotes the concatenation operation, maxpool denotes the max pooling operation, and avgpool denotes the average pooling operation.
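For illustration only, the following is a minimal PyTorch sketch of one way equations (1)-(3) could be realized; the inner channel reduction of the non-local term, the channel-wise pooling applied before the 7×7 convolution, and all class, function and variable names are assumptions that the text does not fix.

```python
import torch
import torch.nn as nn


class NonLocalSpatialAttentionConv(nn.Module):
    """Sketch of the non-local spatial attention convolution layer of
    equations (1)-(3). The inner channel reduction and the channel-wise
    pooling in the spatial term are assumptions, not stated in the text."""

    def __init__(self, channels, reduction=2):
        super().__init__()
        inner = channels // reduction
        # W_a, W_b, W_c, W_d: 1x1 convolutions used in the non-local term (eq. 1)
        self.w_a = nn.Conv2d(channels, inner, kernel_size=1)
        self.w_b = nn.Conv2d(channels, inner, kernel_size=1)
        self.w_c = nn.Conv2d(channels, inner, kernel_size=1)
        self.w_d = nn.Conv2d(inner, channels, kernel_size=1)
        # W_e: 7x7 convolution over the two concatenated pooled maps (eq. 2)
        self.w_e = nn.Conv2d(2, 1, kernel_size=7, padding=3)

    def non_local(self, x):
        b, c, h, w = x.shape
        q = self.w_a(x).flatten(2)                            # B x C' x HW
        k = self.w_b(x).flatten(2)                            # B x C' x HW
        v = self.w_c(x).flatten(2)                            # B x C' x HW
        attn = torch.softmax(q.transpose(1, 2) @ k, dim=-1)   # B x HW x HW
        out = (v @ attn.transpose(1, 2)).view(b, -1, h, w)    # B x C' x H x W
        return self.w_d(out)

    def spatial(self, x):
        # max- and average-pooling along the channel dimension -> 2 maps
        max_map, _ = x.max(dim=1, keepdim=True)
        avg_map = x.mean(dim=1, keepdim=True)
        return torch.sigmoid(self.w_e(torch.cat([max_map, avg_map], dim=1)))

    def forward(self, x):
        # eq. (3): Y = X + spatial(X) * X + nonlocal(X)
        return x + self.spatial(x) * x + self.non_local(x)
```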
Preferably, the fast downsampling branch in step S4 consists of 5 fast downsampling modules, each composed of two convolution layers and one max pooling layer.
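For illustration, a hedged sketch of such a fast downsampling branch in the same PyTorch style is given below; the 3×3 kernels, ReLU activations and channel widths are assumptions, since the text specifies only two convolution layers and one max pooling layer per module, repeated five times.

```python
import torch.nn as nn


def fast_downsampling_module(in_ch, out_ch):
    # One fast downsampling module: two convolution layers + one max pooling layer.
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=1),
        nn.ReLU(inplace=True),
        nn.Conv2d(out_ch, out_ch, kernel_size=3, padding=1),
        nn.ReLU(inplace=True),
        nn.MaxPool2d(kernel_size=2),
    )


# The branch stacks five such modules; the single-channel input is the
# binary Canny edge map, and the channel schedule is illustrative.
fast_downsampling_branch = nn.Sequential(
    fast_downsampling_module(1, 16),
    fast_downsampling_module(16, 32),
    fast_downsampling_module(32, 64),
    fast_downsampling_module(64, 128),
    fast_downsampling_module(128, 256),
)
```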
Compared with the prior art, the invention has the following beneficial effects:
1. A single mobile phone camera performs the entire image acquisition and adapts well to the illumination and resolution of the scene.
2. High accuracy: the identification accuracy reaches 98% at a false detection rate of 0.6%.
3. High speed: the total time from acquisition of a single picture to recording of the final result is within 5 milliseconds, the disease type affecting the crop can be judged accurately, and a reliable estimate of the crop's condition is finally obtained.
Drawings
FIG. 1 is a general flow diagram of the present invention;
FIG. 2 is a schematic diagram of the main program modules of the present invention;
FIG. 3 is a schematic flow chart of the spatial attention module of the present invention;
FIG. 4 is a flow chart of a non-local attention module according to the present invention;
FIG. 5 is a schematic flow diagram of a non-local spatial attention module according to the present invention;
FIG. 6 is a schematic diagram of an effective acquisition area of a smart phone according to the present invention;
FIG. 7 is a diagram of the crop pest identification network model of the present invention.
Detailed Description
The invention is further described below with reference to specific embodiments, so that its technical means, inventive features, objects, and effects are easy to understand.
A robust crop disease diagnostic system comprising the steps of:
S1, a user photographs the leaves of the crop to be examined with a smartphone to obtain leaf pictures of the crop;
S2, the picture obtained in step S1 is processed with the Canny edge detection algorithm to obtain a binary image of the leaf edges;
S3, the picture obtained in step S1 is fed into the non-local spatial attention convolution branch of the pest and disease detection module for feature extraction to obtain a first feature;
S4, the binary image obtained in step S2 is fed into the fast downsampling branch for feature extraction to obtain a second feature;
S5, the first feature obtained in step S3 and the second feature obtained in step S4 are fused with a 1×1 convolution to obtain a third feature;
S6, the third feature obtained in step S5 is processed with a bilinear pooling layer to obtain a fourth feature;
and S7, the fourth feature obtained in step S6 is processed with a fully connected layer to obtain the final classification vector.
In this embodiment, in step S1, to ensure that the leaf surface occupies no fewer than 10,000 (100×100) pixels in the photo, the shooting position of the smartphone is adjusted according to parameters such as the distance between the leaf surface and the camera and the camera's resolution, and the angle between the camera's shooting direction and the plane of the leaf surface is kept at 90±10 degrees.
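A hedged helper for checking this 100×100-pixel constraint is sketched below; the contour-based estimate of the leaf region and the Canny thresholds are assumptions made for illustration, since the embodiment states only the minimum leaf area, not how it is verified.

```python
import cv2


def leaf_area_ok(gray_image, min_side=100, low=100, high=200):
    """Return True if the largest edge-bounded region covers at least
    min_side x min_side pixels (thresholds low/high are assumptions)."""
    edges = cv2.Canny(gray_image, low, high)
    contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return False
    x, y, w, h = cv2.boundingRect(max(contours, key=cv2.contourArea))
    return w >= min_side and h >= min_side
```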
In this embodiment, the non-local spatial attention convolution branch in step S3 is composed of a ResNet18 with the fully connected layer removed and a non-local spatial attention convolution layer. The ResNet18 architecture was first proposed at CVPR 2016; compared with conventional convolutional networks, it introduces residual structures, which improve the classification performance of the model. Letting the input feature be X and the output feature be Y, the designed non-local spatial attention convolution layer is given by the following formulas:
nonlocal(X)=W_d·Softmax((W_a·X)^T·(W_b·X))·(W_c·X) (1)
spatial(X)=Sigmoid(W_e·concat(maxpool(X),avgpool(X))) (2)
Y=X+spatial(X)*X+nonlocal(X) (3)
where nonlocal(X) denotes the non-local attention, spatial(X) denotes the local spatial attention, W_a, W_b, W_c and W_d denote the weights of 1×1 convolutions, W_e denotes the weight of a 7×7 convolution, concat denotes the concatenation operation, maxpool denotes the max pooling operation, and avgpool denotes the average pooling operation.
In this embodiment, the fast downsampling branch in step S4 is composed of 5 fast downsampling modules, each of which consists of two convolution layers and one max pooling layer.
It should be noted that, in the robust crop disease diagnosis system, a mobile phone camera aimed at the crop photographs the leaf surfaces that may be affected by disease, and machine learning is then used to identify the different kinds of diseases from the pictures. The method fully considers the pathological relationship between diseased and healthy regions on the plant leaf surface, designs a novel convolution layer, learns more spatially discriminative features through non-local spatial attention, and realizes a high-precision crop pest and disease recognition model.
In use, the user photographs the crop to be identified with the smartphone camera. After the photo is obtained, the Canny edge detection algorithm quickly produces a binary image of the edges of the plant's stems and leaves. The original image is then fed into the non-local spatial attention convolution branch to extract features, while the edge detection result is fed into the fast downsampling branch to extract auxiliary features; the two features are fused by a 1×1 convolution, the fused feature is passed through a bilinear pooling layer to produce a more expressive feature, and finally a fully connected layer outputs the crop disease classification result.
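The flow described in the preceding paragraph can be summarized by the following hedged end-to-end sketch, which reuses the NonLocalSpatialAttentionConv and fast_downsampling_branch sketches given earlier; the Canny thresholds, the fused channel width, the signed square-root normalization after bilinear pooling, and the number of disease classes (38 here) are illustrative assumptions, not values specified by the invention.

```python
import cv2
import torch
import torch.nn as nn
import torch.nn.functional as F
import torchvision.models as models

# Assumes NonLocalSpatialAttentionConv and fast_downsampling_branch from the
# earlier sketches are already defined in scope.


class CropDiseaseNet(nn.Module):
    def __init__(self, num_classes=38, fused_ch=256):
        super().__init__()
        backbone = models.resnet18()
        # ResNet18 with the fully connected layer (and global pooling) removed
        self.resnet = nn.Sequential(*list(backbone.children())[:-2])  # B x 512 x H/32 x W/32
        self.nl_attn = NonLocalSpatialAttentionConv(512)
        self.edge_branch = fast_downsampling_branch                   # five fast downsampling modules
        self.fuse = nn.Conv2d(512 + 256, fused_ch, kernel_size=1)     # 1x1 fusion convolution
        self.fc = nn.Linear(fused_ch * fused_ch, num_classes)         # final classification vector

    def forward(self, image, edge_map):
        f1 = self.nl_attn(self.resnet(image))                         # first feature
        f2 = self.edge_branch(edge_map)                               # second feature
        f2 = F.interpolate(f2, size=f1.shape[-2:])                    # align spatial sizes
        f3 = self.fuse(torch.cat([f1, f2], dim=1))                    # third feature
        b, c, h, w = f3.shape                                         # bilinear pooling
        flat = f3.flatten(2)                                          # B x C x HW
        f4 = (flat @ flat.transpose(1, 2)) / (h * w)                  # B x C x C, fourth feature
        f4 = torch.sign(f4) * torch.sqrt(f4.abs() + 1e-8)             # signed sqrt (assumed)
        f4 = F.normalize(f4.flatten(1), dim=1)
        return self.fc(f4)


# Canny preprocessing of the captured leaf photo (thresholds are assumptions).
bgr = cv2.imread("leaf.jpg")
edges = cv2.Canny(cv2.cvtColor(bgr, cv2.COLOR_BGR2GRAY), 100, 200)

img_t = torch.from_numpy(cv2.cvtColor(bgr, cv2.COLOR_BGR2RGB)).permute(2, 0, 1).float().unsqueeze(0) / 255.0
edge_t = torch.from_numpy(edges).float().unsqueeze(0).unsqueeze(0) / 255.0
logits = CropDiseaseNet()(img_t, edge_t)                              # disease classification scores
```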
The foregoing has shown and described the basic principles, main features, and advantages of the present invention. Those skilled in the art will understand that the invention is not limited to the embodiments described above; the embodiments and descriptions merely illustrate its principles, and various changes and modifications may be made without departing from its spirit and scope. The scope of the invention is defined by the appended claims and their equivalents.

Claims (4)

1. A robust crop disease diagnostic system, characterized by comprising the following steps:
S1, a user photographs the leaves of the crop to be examined with a smartphone to obtain leaf pictures of the crop;
S2, the picture obtained in step S1 is processed with the Canny edge detection algorithm to obtain a binary image of the leaf edges;
S3, the picture obtained in step S1 is fed into the non-local spatial attention convolution branch of the pest and disease detection module for feature extraction to obtain a first feature;
S4, the binary image obtained in step S2 is fed into the fast downsampling branch for feature extraction to obtain a second feature;
S5, the first feature obtained in step S3 and the second feature obtained in step S4 are fused with a 1×1 convolution to obtain a third feature;
S6, the third feature obtained in step S5 is processed with a bilinear pooling layer to obtain a fourth feature;
and S7, the fourth feature obtained in step S6 is processed with a fully connected layer to obtain the final classification vector.
2. The robust crop disease diagnostic system according to claim 1, characterized in that: in step S1, the leaf surface occupies no fewer than 100×100 pixels in the photo, and the angle between the camera's shooting direction and the plane of the leaf surface is 90±10 degrees.
3. The robust crop disease diagnostic system according to claim 1, characterized in that: the non-local spatial attention convolution branch in step S3 consists of a ResNet18 with the fully connected layer removed and a non-local spatial attention convolution layer; letting the input feature be X and the output feature be Y, the designed non-local spatial attention convolution layer is given by the following formulas:
nonlocal(X)=W_d·Softmax((W_a·X)^T·(W_b·X))·(W_c·X) (1)
spatial(X)=Sigmoid(W_e·concat(maxpool(X),avgpool(X))) (2)
Y=X+spatial(X)*X+nonlocal(X) (3)
where nonlocal(X) denotes the non-local attention, spatial(X) denotes the local spatial attention, W_a, W_b, W_c and W_d denote the weights of 1×1 convolutions, W_e denotes the weight of a 7×7 convolution, concat denotes the concatenation operation, maxpool denotes the max pooling operation, and avgpool denotes the average pooling operation.
4. The robust crop disease diagnostic system according to claim 1, characterized in that: the fast downsampling branch in step S4 consists of 5 fast downsampling modules, each composed of two convolution layers and one max pooling layer.
CN202011005483.6A 2020-09-22 2020-09-22 Robust crop disease diagnosis system Active CN112101265B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011005483.6A CN112101265B (en) 2020-09-22 2020-09-22 Robust crop disease diagnosis system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011005483.6A CN112101265B (en) 2020-09-22 2020-09-22 Robust crop disease diagnosis system

Publications (2)

Publication Number Publication Date
CN112101265A CN112101265A (en) 2020-12-18
CN112101265B (en) 2023-04-25

Family

ID=73754935

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011005483.6A Active CN112101265B (en) 2020-09-22 2020-09-22 Robust crop disease diagnosis system

Country Status (1)

Country Link
CN (1) CN112101265B (en)

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108921814A (en) * 2018-05-16 2018-11-30 华南农业大学 A kind of Citrus Huanglongbing pathogen on-line quick detection system and method based on deep learning
WO2020047739A1 (en) * 2018-09-04 2020-03-12 安徽中科智能感知大数据产业技术研究院有限责任公司 Method for predicting severe wheat disease on the basis of multiple time-series attribute element depth features
CN110020681A (en) * 2019-03-27 2019-07-16 南开大学 Point cloud feature extracting method based on spatial attention mechanism
CN110009043A (en) * 2019-04-09 2019-07-12 广东省智能制造研究所 A kind of pest and disease damage detection method based on depth convolutional neural networks
CN110188635A (en) * 2019-05-16 2019-08-30 南开大学 A kind of plant pest recognition methods based on attention mechanism and multi-level convolution feature
CN110502987A (en) * 2019-07-12 2019-11-26 山东农业大学 A kind of plant pest recognition methods and system based on deep learning
CN110738146A (en) * 2019-09-27 2020-01-31 华中科技大学 target re-recognition neural network and construction method and application thereof
CN111259982A (en) * 2020-02-13 2020-06-09 苏州大学 Premature infant retina image classification method and device based on attention mechanism
CN111507271A (en) * 2020-04-20 2020-08-07 北京理工大学 Airborne photoelectric video target intelligent detection and identification method

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
Edna Chebet Too et al. A comparative study of fine-tuning deep learning models for plant disease identification. Computers and Electronics in Agriculture, 2018, Vol. 161, 272-279. *
孙鹏 et al. Soybean pest image recognition based on an attention convolutional neural network. 中国农机化学报 (Journal of Chinese Agricultural Mechanization), 2020, Vol. 41, No. 02, 171-176. *
李晓振 et al. Tomato leaf disease recognition system based on an attention neural network. 江苏农业学报 (Jiangsu Journal of Agricultural Sciences), 2020, Vol. 36, No. 03, 561-568. *
项圣凯 et al. Image saliency detection using a dense weak attention mechanism. 中国图象图形学报 (Journal of Image and Graphics), 2020, Vol. 25, No. 01, 136-147. *

Also Published As

Publication number Publication date
CN112101265A (en) 2020-12-18

Similar Documents

Publication Publication Date Title
US10789455B2 (en) Liveness test method and apparatus
CN112949565A (en) Single-sample partially-shielded face recognition method and system based on attention mechanism
CN109271884A (en) Face character recognition methods, device, terminal device and storage medium
CN108765278A (en) A kind of image processing method, mobile terminal and computer readable storage medium
CN112801015B (en) Multi-mode face recognition method based on attention mechanism
TW200834459A (en) Video object segmentation method applied for rainy situations
CN103116749A (en) Near-infrared face identification method based on self-built image library
CN106991380A (en) A kind of preprocess method based on vena metacarpea image
CN110263768A (en) A kind of face identification method based on depth residual error network
CN109993103A (en) A kind of Human bodys' response method based on point cloud data
CN112257702A (en) Crop disease identification method based on incremental learning
CN110033487A (en) Vegetables and fruits collecting method is blocked based on depth association perception algorithm
CN115689960A (en) Illumination self-adaptive infrared and visible light image fusion method in night scene
CN116030498A (en) Virtual garment running and showing oriented three-dimensional human body posture estimation method
CN110222647B (en) Face in-vivo detection method based on convolutional neural network
CN112101265B (en) Robust crop disease diagnosis system
CN107959756B (en) System and method for automatically turning off electronic equipment during sleeping
CN112700409A (en) Automatic retinal microaneurysm detection method and imaging method
CN116403004A (en) Cow face fusion feature extraction method based on cow face correction
Balakrishna et al. Tomato Leaf Disease Detection Using Deep Learning: A CNN Approach
CN206363347U (en) Based on Corner Detection and the medicine identifying system that matches
CN102156879A (en) Human target matching method based on weighted terrestrial motion distance
CN115358961A (en) Multi-focus image fusion method based on deep learning
CN109214222A (en) Based on Embedded 32 cigarette laser code identifying systems and its recognition methods
CN114821239A (en) Method for detecting plant diseases and insect pests in foggy environment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant