CN114332849A - Crop growth state combined monitoring method and device and storage medium - Google Patents

Crop growth state combined monitoring method and device and storage medium

Info

Publication number
CN114332849A
Authority
CN
China
Prior art keywords
monitoring
growth
global
network
module
Prior art date
Legal status
Granted
Application number
CN202210255333.3A
Other languages
Chinese (zh)
Other versions
CN114332849B (en)
Inventor
万亚东
钱浩
张超
Current Assignee
Innotitan Intelligent Equipment Technology Tianjin Co Ltd
Original Assignee
Innotitan Intelligent Equipment Technology Tianjin Co Ltd
Priority date
Filing date
Publication date
Application filed by Innotitan Intelligent Equipment Technology Tianjin Co Ltd filed Critical Innotitan Intelligent Equipment Technology Tianjin Co Ltd
Priority to CN202210255333.3A priority Critical patent/CN114332849B/en
Publication of CN114332849A publication Critical patent/CN114332849A/en
Application granted granted Critical
Publication of CN114332849B publication Critical patent/CN114332849B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02A TECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A40/00Adaptation technologies in agriculture, forestry, livestock or agroalimentary production
    • Y02A40/10Adaptation technologies in agriculture, forestry, livestock or agroalimentary production in agriculture
    • Y02A40/25Greenhouse technology, e.g. cooling systems therefor

Landscapes

  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)
  • Image Processing (AREA)

Abstract

The application relates to a crop growth state combined monitoring method, device and storage medium. The method comprises: constructing a growth monitoring module, and using the growth monitoring module to detect the growth stage category and coordinate information of each monitoring target in an original image; the growth monitoring module adopts YOLOv5 as its base network, and a global perception network for extracting global context information of the monitoring targets is embedded in the YOLOv5 network structure. A growth monitoring module constructed by the method can significantly improve the accuracy of crop growth state monitoring.

Description

Crop growth state combined monitoring method and device and storage medium
Technical Field
The application relates to the technical field of crop monitoring, in particular to a crop growth state combined monitoring method and device and a storage medium.
Background
With the continuous development of modern agriculture and the growing demand for agricultural products, greenhouse planting accounts for an ever larger share of agricultural production. In current greenhouse planting, the growth condition of crops is mainly analyzed and judged manually, based on personal experience, so working efficiency is low, scientific guidance is lacking, and crop quality is difficult to guarantee. An intelligent means of monitoring crop growth state is therefore urgently needed to provide accurate and reliable monitoring data for the scientific management of greenhouse crops, thereby effectively improving crop yield and quality and promoting the rapid development of the agricultural economy.
Disclosure of Invention
In order to solve, or at least partially solve, the technical problems mentioned in the background art, the present application provides a crop growth state joint monitoring method, apparatus and storage medium that can significantly improve the accuracy of crop growth state monitoring.
In a first aspect, the present application provides a method for jointly monitoring the growth status of crops, comprising:
constructing a growth monitoring module, and detecting the growth stage category and coordinate information of each monitoring target in an original image by using the growth monitoring module; the growth monitoring module adopts YOLOv5 as a base network, and a global perception network for extracting global context information of the monitoring target is embedded in the network structure of YOLOv5.
Preferably, adopting YOLOv5 as a base network and embedding, in the network structure of YOLOv5, a global perception network for extracting global context information of the monitoring target specifically comprises:
and taking the first output feature map of the backbone network of the YOLOv5 as the input of a global perception network, wherein the global perception network multiplies a feature map output after sequentially performing convolution and global average pooling on the first output feature map by a second output feature map at the prediction end of the YOLOv5 to obtain a global feature map, the backbone network is used for extracting the image features of the original image in a layer-by-layer convolution mode, and the first output feature map contains global context information of the monitoring target.
Preferably, the global perception network comprises a first convolution module layer, a second convolution module layer and a global average pooling layer connected in sequence, and the first output feature map, after feature extraction by the first convolution module layer and the second convolution module layer in sequence, is input into the global average pooling layer for global average pooling.
Preferably, the crop growth state joint monitoring method further comprises: the prediction end of YOLOv5 classifies and locates the monitoring target based on the global feature map, and outputs the growth stage category of the monitoring target and the coordinate information of the monitoring target in the original image.
Preferably, the method for jointly monitoring the growth state of the crops further comprises the following steps:
constructing a nutrition monitoring module, a disease monitoring module and a pest monitoring module that adopt the same network structure;
cropping the global feature map according to a first size of the original image, a second size of the global feature map and the coordinate information of the monitoring target in the original image, to obtain a target feature map of the monitoring target;
and inputting the target feature map into the nutrition monitoring module, the disease monitoring module and the pest monitoring module respectively, and detecting the nutrition state category, the disease state category and the pest state category of the monitoring target respectively.
Preferably, the network structures of the nutrition monitoring module, the disease monitoring module and the pest monitoring module each comprise a first convolution module layer, a second convolution module layer, a first-dimension fully-connected layer and a second-dimension fully-connected layer connected in sequence;
the first convolution module layer and the second convolution module layer use convolution kernels of different sizes to extract different types of feature representations from the target feature map, obtaining an intermediate feature map;
and the first-dimension fully-connected layer and the second-dimension fully-connected layer classify the intermediate feature map to obtain a state category result of the monitoring target, wherein the category result is a nutrition state category, a disease state category or a pest state category.
Preferably, the method for jointly monitoring the growth state of the crops further comprises the following steps:
acquiring a plurality of original images of a monitoring target over the whole growth cycle, wherein the original images contain a plurality of monitoring targets of the same type;
and performing multi-label annotation on the original images to obtain a multi-label image training set, wherein the multi-label image training set comprises training images annotated with a plurality of sample labels.
Preferably, the method for jointly monitoring the growth state of the crops further comprises the following steps:
selecting training images annotated with a first sample label from the multi-label image training set to train the growth monitoring module, wherein the first sample label comprises growth stage category information and coordinate information of the monitoring target;
selecting training images annotated with a second sample label from the multi-label image training set to train the nutrition monitoring module, wherein the second sample label comprises nutrition state category information of the monitoring target;
selecting training images annotated with a third sample label from the multi-label image training set to train the disease monitoring module, wherein the third sample label comprises disease state category information of the monitoring target;
selecting training images annotated with a fourth sample label from the multi-label image training set to train the pest monitoring module, wherein the fourth sample label comprises pest state category information of the monitoring target.
In a second aspect, the present application further provides a crop growth state joint monitoring device, including:
a memory for storing program instructions;
a processor for calling the program instructions stored in the memory to implement the method for jointly monitoring the growth status of crops according to any one of the above aspects.
In a third aspect, the present application further provides a computer-readable storage medium, which stores program codes for implementing the method for jointly monitoring the growth status of crops according to any one of the first aspect.
Compared with the prior art, the technical scheme provided by the embodiment of the application has the following advantages: on the first hand, the method has the advantages that the global perception network is embedded into the YOLOv5 network and serves as a crop growth monitoring module, the growth monitoring module with the network structure can extract global context information of target crops, growth state characteristics of the surrounding crops are obtained to assist growth state judgment of the target crops, and accordingly monitoring accuracy of the network on the crop growth state can be remarkably improved;
in the second aspect, based on the output result of the growth monitoring module, the application further provides a combined monitoring method consisting of the growth monitoring module, the nutrition monitoring module, the disease monitoring module and the insect pest monitoring module, compared with the current greenhouse planting method which mainly relies on artificial experience to analyze the growth condition of crops, the combined monitoring method completely relies on an industrial camera and a deep learning algorithm to carry out automatic growth state monitoring, not only can save a large amount of labor cost, but also can give out an accurate and real-time monitoring result in the whole growth period, thereby guiding the precise management of greenhouse crops and realizing the yield increase of the crops.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the invention and together with the description, serve to explain the principles of the invention.
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, and it is obvious for those skilled in the art that other drawings can be obtained according to the drawings without inventive exercise.
Fig. 1 is a schematic flow chart of a method for jointly monitoring the growth status of crops according to an embodiment of the present disclosure;
fig. 2 is a schematic network structure diagram of a growth monitoring module according to an embodiment of the present disclosure;
fig. 3 is a schematic structural diagram of an AI joint monitoring network according to an embodiment of the present disclosure.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present application clearer, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some embodiments of the present application, but not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
For ease of understanding, a crop growth state joint monitoring method provided in an embodiment of the present application is described in detail below; the method comprises the following steps:
constructing a growth monitoring module, and detecting the growth stage category and coordinate information of each monitoring target in an original image by using the growth monitoring module; the growth monitoring module adopts YOLOv5 as a base network, and a global perception network for extracting global context information of the monitoring target is embedded in the network structure of YOLOv5.
In some embodiments of the present application, the target detection network YOLOv5 leads comparable detectors in both detection accuracy and inference speed and thus meets the requirements of the accurate, real-time monitoring task of crop growth state monitoring, so YOLOv5 is adopted as the base network of the growth monitoring module. The network structure of YOLOv5 is further improved by embedding a global perception network (GAS) in it.
Embedding the global perception network into the YOLOv5 network yields the crop growth monitoring module. A growth monitoring module with this network structure can extract the global context information of a target crop (i.e., a monitoring target) and obtain the growth state characteristics of the crops around it to assist in judging the growth state of the target crop, so the monitoring accuracy of the network for the crop growth state can be significantly improved.
In some specific embodiments of the present application, adopting YOLOv5 as the base network and embedding, in the network structure of YOLOv5, a global perception network for extracting global context information of the monitoring target specifically comprises:
and taking the first output feature map of the backbone network of the YOLOv5 as the input of a global perception network, wherein the global perception network multiplies a feature map output after sequentially performing convolution and global average pooling on the first output feature map by a second output feature map at the prediction end of the YOLOv5 to obtain a global feature map, the backbone network is used for extracting the image features of the original image in a layer-by-layer convolution mode, and the first output feature map contains global context information of the monitoring target.
In some embodiments of the present application, the YOLOv5 network consists of four parts: an input end, a backbone network, a feature fusion network and a prediction end. The backbone network consists of three parts, B1, B2 and B3, whose output feature maps serve as the inputs of the feature fusion network.
In a deep neural network, the backbone network extracts feature representations from the image by layer-by-layer convolution. The feature map of the highest layer of the backbone network (i.e., the first output feature map) is used to design the global perception network: feature maps at higher layers contain more and richer global context information than those at lower layers, so the global feature information contained in the resulting global feature map is more comprehensive, which helps improve prediction accuracy.
The prediction end of the YOLOv5 network produces three output feature maps of different sizes, and the global feature is matrix-multiplied with each output feature map of the prediction end, thereby outputting a group of global feature maps containing global feature information.
In some embodiments of the present application, the global perception network comprises a first convolution module layer, a second convolution module layer and a global average pooling layer connected in sequence, and the first output feature map, after feature extraction by the first convolution module layer and the second convolution module layer in sequence, is input into the global average pooling layer for global average pooling.
In some embodiments of the present application, the output feature map of B3 is input into a convolution module for further feature abstraction; this module comprises a convolution with 512 kernels of size 1 × 1 and a Leaky ReLU activation function. The feature map is then input into a second convolution module, which comprises a convolution with 255 kernels of size 1 × 1 and a Leaky ReLU activation function and adjusts the feature map dimensions from 32 × 32 × 512 to 32 × 32 × 255, so that the global feature embedding operation can be performed at the prediction end in a subsequent step. Finally, to extract a global-level feature representation, a global average pooling operation is applied to the output feature map of the previous step, yielding a global feature of dimension 1 × 1 × 255.
In some embodiments of the present application, the crop growth state joint monitoring method further comprises: the prediction end of YOLOv5 classifies and locates the monitoring target based on the global feature map, and outputs the growth stage category of the monitoring target and the coordinate information of the monitoring target in the original image.
The prediction end of the YOLOv5 network contains three output feature maps of different sizes (128 × 128 × 255, 64 × 64 × 255 and 32 × 32 × 255), and the global feature is matrix-multiplied with each output feature map, thereby outputting a group of global feature maps containing global feature information. The prediction end then classifies and locates the Chinese cabbage plants based on this group of global feature maps, and outputs the category of each Chinese cabbage plant and its coordinate information in the original image.
In some embodiments of the present application, the method for jointly monitoring the growth status of crops further comprises:
constructing a nutrition monitoring module, a disease monitoring module and a pest monitoring module that adopt the same network structure;
cropping the global feature map according to a first size of the original image, a second size of the global feature map and the coordinate information of the monitoring target in the original image, to obtain a target feature map of the monitoring target;
and inputting the target feature map into the nutrition monitoring module, the disease monitoring module and the pest monitoring module respectively, and detecting the nutrition state category, the disease state category and the pest state category of the monitoring target respectively.
The size of the global feature map is smaller than that of the original image, so the coordinate position of each monitoring target in the global feature map can be obtained from its coordinate information in the original image and the size ratio between the global feature map and the original image. The global feature map is then cropped at the calculated coordinate positions, i.e., the feature region of each monitoring target is cut out of the global feature map, and the target feature maps of the monitoring targets are output in sequence.
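For illustration only, the following is a minimal sketch of this cropping step in PyTorch. It assumes pixel boxes in (x1, y1, x2, y2) form, a square original image and feature map, and that each cropped region is later resized to the fixed input size of the classification heads; all names are illustrative and not taken from the patent.

import torch

def crop_target_features(global_map, boxes, orig_size, feat_size):
    # global_map: torch.Tensor of shape (C, feat_size, feat_size), the global feature map
    # boxes: iterable of (x1, y1, x2, y2) pixel coordinates in the original image
    scale = feat_size / orig_size                      # size ratio between feature map and image
    crops = []
    for x1, y1, x2, y2 in boxes:
        fx1, fy1 = int(x1 * scale), int(y1 * scale)
        fx2 = max(int(x2 * scale), fx1 + 1)            # keep at least one feature cell
        fy2 = max(int(y2 * scale), fy1 + 1)
        crops.append(global_map[:, fy1:fy2, fx1:fx2])  # target feature map of one plant
    return crops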
In some embodiments of the present application, the nutrition monitoring module processes the input target feature map to obtain the nutrition state category of the monitoring target; the disease monitoring module and the pest monitoring module work on the same principle.
To achieve more comprehensive and more accurate monitoring of the growth state of crop plants, the nutrition monitoring module, the disease monitoring module and the pest monitoring module are designed on the basis of the target feature map. The three monitoring modules adopt the same network structure to facilitate subsequent network training and parameter optimization.
In some embodiments of the present application, the network structures of the nutrition monitoring module, the disease monitoring module and the pest monitoring module each include a first convolution module layer, a second convolution module layer, a first-dimension fully-connected layer and a second-dimension fully-connected layer connected in sequence;
the first convolution module layer and the second convolution module layer use convolution kernels of different sizes to extract different types of feature representations from the target feature map, obtaining an intermediate feature map;
and the first-dimension fully-connected layer and the second-dimension fully-connected layer classify the intermediate feature map to obtain a state category result of the monitoring target, wherein the category result is a nutrition state category, a disease state category or a pest state category.
In some embodiments of the present application, the method for jointly monitoring the growth status of crops further comprises:
acquiring a plurality of original images of a monitoring target over the whole growth cycle, wherein the original images contain a plurality of monitoring targets of the same type;
and performing multi-label annotation on the original images to obtain a multi-label image training set, wherein the multi-label image training set comprises training images annotated with a plurality of sample labels.
In some embodiments of the present application, a plurality of images of the monitoring target (i.e., a crop) over the whole growth cycle are collected, and multi-label annotation is performed on the original images using annotation software to obtain the corresponding annotation files; the obtained images and their corresponding annotation files are then divided into a training set (i.e., the multi-label image training set) and a test set in a given ratio, yielding a multi-label monitoring dataset of the monitoring target.
In some embodiments of the present application, after raw image data of sufficient scale has been obtained, annotation software is used to perform multi-label annotation on the raw images. In each raw image, the coordinate information of each monitoring target, its growth stage category (including the germination stage, seedling stage, rosette stage and heading stage), nutrition state category (including healthy, nitrogen-deficient, phosphorus-deficient, potassium-deficient and calcium-deficient), disease state category (including healthy, black spot, white spot, black rot and downy mildew) and pest state category (including healthy, insect holes, aphid, cabbage caterpillar and borer) are labeled, so as to obtain the corresponding annotation files.
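As a purely hypothetical illustration of what one multi-label annotation record might contain (the patent does not prescribe a file format, so every field name below is an assumption):

annotation = {
    "image": "greenhouse_row03_day35.jpg",            # hypothetical file name
    "targets": [
        {
            "bbox": [412, 188, 596, 371],             # x1, y1, x2, y2 in the original image
            "growth_stage": "rosette",                # germination / seedling / rosette / heading
            "nutrition_state": "nitrogen_deficient",  # healthy / N / P / K / Ca deficient
            "disease_state": "healthy",               # healthy / black spot / white spot / black rot / downy mildew
            "pest_state": "aphid",                    # healthy / insect holes / aphid / cabbage caterpillar / borer
        },
    ],
}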
In some embodiments of the present application, the method for jointly monitoring the growth status of crops further comprises:
selecting training images annotated with a first sample label from the multi-label image training set to train the growth monitoring module, wherein the first sample label comprises growth stage category information and coordinate information of the monitoring target;
selecting training images annotated with a second sample label from the multi-label image training set to train the nutrition monitoring module, wherein the second sample label comprises nutrition state category information of the monitoring target;
selecting training images annotated with a third sample label from the multi-label image training set to train the disease monitoring module, wherein the third sample label comprises disease state category information of the monitoring target;
selecting training images annotated with a fourth sample label from the multi-label image training set to train the pest monitoring module, wherein the fourth sample label comprises pest state category information of the monitoring target.
The growth stage category information, nutrition state category information, disease state category information and pest state category information of the monitoring target have been described in the above embodiments and are not repeated here.
In still other embodiments of the present application, there is provided a crop growth state joint monitoring device, comprising:
a memory for storing program instructions;
a processor for calling the program instructions stored in the memory to implement the crop growth state joint monitoring method according to any of the above embodiments.
In further embodiments of the present application, there is also provided a computer-readable storage medium storing program codes for implementing the crop growth state joint monitoring method according to any one of the above embodiments.
For ease of understanding, the crop growth state joint monitoring method is further described in detail below, taking Chinese cabbage as the monitoring target. Referring to fig. 1, the specific steps are as follows:
step one, establishing a vegetable plant multi-label monitoring data set. Firstly, dividing a cabbage nursery for image acquisition into a plurality of groups, and adopting different cultivation modes to carry out cabbage planting management so as to ensure that cabbage samples in different nutrition states and disease and pest states are obtained;
meanwhile, an AGV trolley equipped with a robotic arm carrying an industrial camera acquires images row by row in the greenhouse. A preferred acquisition scheme is: start the first acquisition on the 5th day after sowing and acquire once a week until the end of the whole growth cycle of the Chinese cabbage, keeping the shooting angle perpendicular to the ground and the distance between the lens and the ground unchanged throughout the acquisition;
after image data of sufficient scale has been obtained, multi-label annotation is performed on the Chinese cabbage images using annotation software. In each image, the coordinate information of each cabbage plant, its growth stage category (including the germination stage, seedling stage, rosette stage and heading stage), nutrition state category (including healthy, nitrogen-deficient, phosphorus-deficient, potassium-deficient and calcium-deficient), disease state category (including healthy, black spot, white spot, black rot and downy mildew) and pest state category (including healthy, insect holes, aphid, cabbage caterpillar and borer) are labeled, so as to obtain the corresponding annotation files;
and finally, dividing the obtained image and the corresponding label file into a training set and a testing set according to a proportion, thereby obtaining a vegetable plant multi-label monitoring data set.
Step two, designing the growth monitoring module. Since the target detection network YOLOv5 leads comparable detectors in both detection accuracy and inference speed and meets the requirements of the accurate, real-time monitoring task of crop growth state monitoring, YOLOv5 is adopted as the base network of the growth monitoring module.
Crops of the same kind in the same greenhouse are generally sown in the same period, so their growth cycles are synchronized; moreover, the nutrition state and the disease and pest state of crops in the same soil area are relatively close. Therefore, acquiring the growth state of the surrounding crops, that is, extracting the global context information of the monitoring target, can assist in judging the growth state of the target crop.
To improve crop monitoring accuracy, a global perception network (GAS) is proposed and embedded into the YOLOv5 network to serve as the crop growth monitoring module; given the characteristics of this network structure, the corresponding network may also be called a YOLOv5-with-GAS network. As shown in fig. 2, the design process of the growth monitoring module is illustrated by taking a cabbage image to be detected (with a size of 1024 × 1024 × 3) as input:
the YOLOv5 network consists of four parts, namely an input end, a backbone network, a feature fusion network and a prediction end, wherein the backbone network consists of three parts, namely B1, B2 and B3, and output feature graphs of the three parts are used as the input of the feature fusion network.
In the deep neural network, the backbone network extracts feature representations in the image in a layer-by-layer convolution mode, so that feature maps positioned at a high layer contain more abundant global context information than feature maps positioned at a low layer. Therefore, the design of the global sensing network is performed by using the characteristic diagram of the highest layer of the backbone network, namely the output characteristic diagram (with the size of 32 × 512).
First, the output feature map of B3 is input into a convolution module for further feature abstraction; this module comprises a convolution with 512 kernels of size 1 × 1 and a Leaky ReLU activation function;
then, the feature map is input into a second convolution module, which comprises a convolution with 255 kernels of size 1 × 1 and a Leaky ReLU activation function and adjusts the feature map dimensions from 32 × 32 × 512 to 32 × 32 × 255, so that the global feature embedding operation can be performed at the prediction end in a subsequent step;
finally, to extract a global-level feature representation, a global average pooling operation is applied to the output feature map of the previous step, yielding a global feature of dimension 1 × 1 × 255.
The prediction end of the YOLOv5 network contains three output feature maps of different sizes (128 × 128 × 255, 64 × 64 × 255 and 32 × 32 × 255), and the global feature is matrix-multiplied with each output feature map, thereby outputting a group of global feature maps containing global feature information.
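For illustration, a minimal PyTorch sketch of the global perception network and its embedding at the prediction end follows. The class name is invented, and the "matrix multiplication" of the 1 × 1 × 255 global feature with each prediction-end map is interpreted here as a channel-wise (broadcast) multiplication, which is an assumption rather than a statement of the patented implementation.

import torch
import torch.nn as nn

class GlobalPerceptionNetwork(nn.Module):
    """Sketch of the GAS branch: two 1x1 convolution modules, global average
    pooling, and embedding of the global feature into the prediction-end maps."""
    def __init__(self, in_channels=512, embed_channels=255):
        super().__init__()
        self.conv1 = nn.Sequential(                        # 1x1 conv, 512 kernels + Leaky ReLU
            nn.Conv2d(in_channels, 512, kernel_size=1), nn.LeakyReLU(0.1))
        self.conv2 = nn.Sequential(                        # 1x1 conv, 255 kernels + Leaky ReLU
            nn.Conv2d(512, embed_channels, kernel_size=1), nn.LeakyReLU(0.1))
        self.gap = nn.AdaptiveAvgPool2d(1)                 # global average pooling -> 1 x 1 x 255

    def forward(self, b3_feature, prediction_maps):
        # b3_feature: (B, 512, 32, 32) highest-layer backbone output
        # prediction_maps: [(B, 255, 128, 128), (B, 255, 64, 64), (B, 255, 32, 32)]
        g = self.gap(self.conv2(self.conv1(b3_feature)))   # (B, 255, 1, 1) global feature
        return [p * g for p in prediction_maps]            # group of global feature maps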
Then, the prediction end classifies and locates the Chinese cabbage plants based on this group of global feature maps, and outputs the category of each Chinese cabbage plant and its coordinate information in the original image.
At this point, the construction of the growth monitoring module and the classification of the cabbage image to be detected using the growth monitoring module are complete.
Step three, the feature regions of the Chinese cabbage plants are cropped out of the global feature map according to the size ratio between the original image and the global feature map and the coordinate information of the Chinese cabbage plants in the original image, and are output in sequence; the nutrition monitoring module, the disease monitoring module and the pest monitoring module are then designed, and an artificial intelligence (AI) joint monitoring network is obtained, referred to below simply as the AI joint monitoring network.
The growth monitoring module of the previous step outputs the feature region of each crop plant in the image; for ease of description, this feature region is called the plant feature map (equivalent to the target feature map in the above embodiments). To monitor the growth state of crop plants accurately, the nutrition monitoring module, the disease monitoring module and the pest monitoring module are designed on the basis of the plant feature map. The three monitoring modules adopt the same network structure to facilitate subsequent network training and parameter optimization.
As shown in fig. 3, the design process of the three monitoring modules is illustrated by taking a cabbage plant feature map output by the growth monitoring module (with a size of 12 × 12 × 255) as an example:
First, the Chinese cabbage plant feature map is input into a 3 × 3 convolution module, which comprises a convolution with a 3 × 3 kernel and a Leaky ReLU activation function; the size of the output feature map is 10 × 10 × 255;
next, the output feature map is input into a 1 × 1 convolution module, which comprises a convolution with a 1 × 1 kernel and a Leaky ReLU activation function, and the size of the output feature map remains unchanged; these two steps use convolution kernels of different sizes to extract different types of feature representations from the feature map;
then, the output feature map is fed in sequence into two fully-connected layers with dimensions of 255 and 5 (each monitoring module corresponds to five different growth state categories) for classification, finally obtaining the corresponding state category.
Next, the AI joint monitoring network is constructed from the four monitoring modules. As shown in fig. 3, the construction process of the AI joint monitoring network is illustrated by taking a cabbage image to be detected (with a size of 1024 × 1024 × 3) as input:
First, the image to be detected is input into the growth monitoring module, which outputs the coordinate information of a group of Chinese cabbage plants, their growth stage categories and their plant feature maps;
the group of Chinese cabbage plant feature maps is then input in parallel into the nutrition monitoring module, the disease monitoring module and the pest monitoring module, and the three monitoring modules respectively output the current nutrition state category, disease state category and pest state category of each Chinese cabbage plant.
Step four, the AI joint monitoring network is trained in stages to obtain the AI joint monitoring model.
In the first stage, the growth monitoring module is trained using the training set of the vegetable plant multi-label monitoring dataset. The labels used are the coordinate information and the growth stage categories (including the germination, seedling, rosette and heading stages), and the loss function is that of YOLOv5.
In the second stage, the AI joint monitoring network is trained using the training set of the vegetable plant multi-label monitoring dataset, with the network parameters of the growth monitoring module frozen and not updated during this stage. The labels used are the nutrition state categories (including healthy, nitrogen-deficient, phosphorus-deficient, potassium-deficient and calcium-deficient), the disease state categories (including healthy, black spot, white spot, black rot and downy mildew) and the pest state categories (including healthy, insect holes, aphid, cabbage caterpillar and borer), which correspond respectively to the output categories of the nutrition monitoring module, the disease monitoring module and the pest monitoring module. All three monitoring modules use a binary cross-entropy loss function, and the loss function of the whole network is the sum of the three.
In the third stage, the parameters of the growth monitoring module are unfrozen, and the whole AI joint monitoring network is trained on the training set of the vegetable plant multi-label monitoring dataset until the loss converges. All label information is used in this stage, and the loss function of the whole network is the sum of the first-stage and second-stage loss functions.
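A minimal sketch of this staged schedule, assuming PyTorch, is given below. The YOLOv5 detection loss is abstracted as a callable det_loss, the dataloader and target dictionary keys are invented for illustration, and the model is assumed to bundle the growth monitoring module and the three heads and to expose the cropped plant feature maps alongside its detections.

import torch
import torch.nn as nn

bce = nn.BCEWithLogitsLoss()                 # binary cross-entropy used by the three heads

def set_requires_grad(module, flag):
    for p in module.parameters():
        p.requires_grad = flag

def train_stage(stage, model, loader, det_loss, optimizer):
    set_requires_grad(model.growth_module, stage != 2)       # stage 2 freezes the growth module
    for image, targets in loader:
        detections, plant_feats = model.growth_module(image)
        loss = torch.zeros(())
        if stage in (1, 3):                                   # detection labels: boxes + growth stage
            loss = loss + det_loss(detections, targets["boxes_and_stages"])
        if stage in (2, 3):                                   # state labels for the three heads
            loss = loss + bce(model.nutrition_head(plant_feats), targets["nutrition"])
            loss = loss + bce(model.disease_head(plant_feats), targets["disease"])
            loss = loss + bce(model.pest_head(plant_feats), targets["pest"])
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()

# The three stages are run in order: train_stage(1, ...), train_stage(2, ...), then
# train_stage(3, ...) repeatedly until the loss converges.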
Step five, the growth state of the greenhouse crops is monitored based on the trained AI joint monitoring model. An AGV trolley equipped with a robotic arm carrying an industrial camera acquires images row by row in the greenhouse, and the images to be detected are input into the AI joint monitoring model to obtain the joint detection results of all the Chinese cabbage plants. If the nutrition, disease and pest monitoring results of all crops in an image are healthy, the crops are reported to be in a healthy state; otherwise, an alarm is issued, and the coordinate information and the monitoring results of the unhealthy crops are output.
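The decision rule at the end of step five can be sketched as follows; the result format matches the joint_monitoring sketch above, and the index of the healthy label is an assumption.

def summarize(results, healthy_label=0):
    # results: per-plant dictionaries produced by the joint monitoring model
    unhealthy = [r for r in results
                 if not (r["nutrition_state"] == healthy_label
                         and r["disease_state"] == healthy_label
                         and r["pest_state"] == healthy_label)]
    if not unhealthy:
        return {"status": "healthy"}
    return {"status": "alarm", "plants": unhealthy}   # boxes and monitoring results of unhealthy plants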
It is noted that, in this document, relational terms such as "first" and "second" may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises", "comprising", or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
The foregoing are merely exemplary embodiments of the present invention, which enable those skilled in the art to understand or practice the present invention. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the invention. Thus, the present invention is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.

Claims (10)

1. A method for jointly monitoring the growth state of crops is characterized by comprising the following steps:
constructing a growth monitoring module, and detecting the growth stage category and coordinate information of each monitoring target in an original image by using the growth monitoring module; the growth monitoring module adopts YOLOv5 as a base network, and a global perception network for extracting global context information of the monitoring target is embedded in the network structure of YOLOv5.
2. The crop growth state joint monitoring method according to claim 1, wherein adopting YOLOv5 as a base network and embedding, in the network structure of YOLOv5, a global perception network for extracting global context information of the monitoring target specifically comprises:
taking a first output feature map of the backbone network of YOLOv5 as the input of the global perception network, wherein the global perception network applies convolution and global average pooling to the first output feature map in sequence and multiplies the resulting feature map by a second output feature map at the prediction end of YOLOv5 to obtain a global feature map; the backbone network extracts image features of the original image by layer-by-layer convolution, and the first output feature map contains global context information of the monitoring target.
3. The crop growth state joint monitoring method according to claim 2, wherein the global perception network comprises a first convolution module layer, a second convolution module layer and a global average pooling layer connected in sequence, and the first output feature map, after feature extraction by the first convolution module layer and the second convolution module layer in sequence, is input into the global average pooling layer for global average pooling.
4. The method for jointly monitoring the growth state of crops as claimed in claim 2, further comprising: the prediction end of YOLOv5 classifies and locates the monitoring target based on the global feature map, and outputs the growth stage category of the monitoring target and the coordinate information of the monitoring target in the original image.
5. The method for jointly monitoring the growth status of crops as claimed in claim 3, further comprising:
constructing a nutrition monitoring module, a disease monitoring module and a pest monitoring module that adopt the same network structure;
cropping the global feature map according to a first size of the original image, a second size of the global feature map and the coordinate information of the monitoring target in the original image, to obtain a target feature map of the monitoring target;
and inputting the target feature map into the nutrition monitoring module, the disease monitoring module and the pest monitoring module respectively, and detecting the nutrition state category, the disease state category and the pest state category of the monitoring target respectively.
6. The crop growth state joint monitoring method according to claim 5, wherein the network structures of the nutrition monitoring module, the disease monitoring module and the pest monitoring module each comprise a first convolution module layer, a second convolution module layer, a first-dimension fully-connected layer and a second-dimension fully-connected layer connected in sequence;
the first convolution module layer and the second convolution module layer use convolution kernels of different sizes to extract different types of feature representations from the target feature map, obtaining an intermediate feature map;
and the first-dimension fully-connected layer and the second-dimension fully-connected layer classify the intermediate feature map to obtain a state category result of the monitoring target, wherein the category result is a nutrition state category, a disease state category or a pest state category.
7. The method for jointly monitoring the growth status of crops as claimed in claim 6, further comprising:
acquiring a plurality of original images of a monitoring target over the whole growth cycle, wherein the original images contain a plurality of monitoring targets of the same type;
and performing multi-label annotation on the original images to obtain a multi-label image training set, wherein the multi-label image training set comprises training images annotated with a plurality of sample labels.
8. The method for jointly monitoring the growth status of crops as claimed in claim 7, further comprising:
selecting training images annotated with a first sample label from the multi-label image training set to train the growth monitoring module, wherein the first sample label comprises growth stage category information and coordinate information of the monitoring target;
selecting training images annotated with a second sample label from the multi-label image training set to train the nutrition monitoring module, wherein the second sample label comprises nutrition state category information of the monitoring target;
selecting training images annotated with a third sample label from the multi-label image training set to train the disease monitoring module, wherein the third sample label comprises disease state category information of the monitoring target;
selecting training images annotated with a fourth sample label from the multi-label image training set to train the pest monitoring module, wherein the fourth sample label comprises pest state category information of the monitoring target.
9. A combined monitoring device for crop growth status, comprising:
a memory for storing program instructions;
a processor for invoking the program instructions stored in the memory to implement a crop growth state joint monitoring method according to any one of claims 1 to 8.
10. A computer-readable storage medium, characterized in that the computer-readable storage medium stores program code for implementing the crop growth status joint monitoring method according to any one of claims 1 to 8.
CN202210255333.3A 2022-03-16 2022-03-16 Crop growth state combined monitoring method and device and storage medium Active CN114332849B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210255333.3A CN114332849B (en) 2022-03-16 2022-03-16 Crop growth state combined monitoring method and device and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210255333.3A CN114332849B (en) 2022-03-16 2022-03-16 Crop growth state combined monitoring method and device and storage medium

Publications (2)

Publication Number Publication Date
CN114332849A (en) 2022-04-12
CN114332849B CN114332849B (en) 2022-08-16

Family

ID=81033771

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210255333.3A Active CN114332849B (en) 2022-03-16 2022-03-16 Crop growth state combined monitoring method and device and storage medium

Country Status (1)

Country Link
CN (1) CN114332849B (en)

Patent Citations (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109829888A (en) * 2018-12-27 2019-05-31 北京林业大学 Wild animal monitoring analysis system and method based on depth convolutional neural networks
CN109858569A (en) * 2019-03-07 2019-06-07 中国科学院自动化研究所 Multi-tag object detecting method, system, device based on target detection network
CN110717903A (en) * 2019-09-30 2020-01-21 天津大学 Method for detecting crop diseases by using computer vision technology
CN111178177A (en) * 2019-12-16 2020-05-19 西京学院 Cucumber disease identification method based on convolutional neural network
CN113761976A (en) * 2020-06-04 2021-12-07 华为技术有限公司 Scene semantic analysis method based on global guide selective context network
CN111723736A (en) * 2020-06-19 2020-09-29 中国农业科学院农业信息研究所 Fruit tree flowering phase monitoring method and device, computer equipment and storage medium
CN112069868A (en) * 2020-06-28 2020-12-11 南京信息工程大学 Unmanned aerial vehicle real-time vehicle detection method based on convolutional neural network
CN111814726A (en) * 2020-07-20 2020-10-23 南京工程学院 Detection method for visual target of detection robot
CN112381764A (en) * 2020-10-23 2021-02-19 西安科锐盛创新科技有限公司 Crop disease and insect pest detection method
CN112465790A (en) * 2020-12-03 2021-03-09 天津大学 Surface defect detection method based on multi-scale convolution and trilinear global attention
CN112446388A (en) * 2020-12-05 2021-03-05 天津职业技术师范大学(中国职业培训指导教师进修中心) Multi-category vegetable seedling identification method and system based on lightweight two-stage detection model
CN112651438A (en) * 2020-12-24 2021-04-13 世纪龙信息网络有限责任公司 Multi-class image classification method and device, terminal equipment and storage medium
CN112668445A (en) * 2020-12-24 2021-04-16 南京泓图人工智能技术研究院有限公司 Vegetable type detection and identification method based on yolov5
CN112906718A (en) * 2021-03-09 2021-06-04 西安电子科技大学 Multi-target detection method based on convolutional neural network
CN113255589A (en) * 2021-06-25 2021-08-13 北京电信易通信息技术股份有限公司 Target detection method and system based on multi-convolution fusion network
CN113378976A (en) * 2021-07-01 2021-09-10 深圳市华汉伟业科技有限公司 Target detection method based on characteristic vertex combination and readable storage medium
CN113627472A (en) * 2021-07-05 2021-11-09 南京邮电大学 Intelligent garden defoliating pest identification method based on layered deep learning model
CN113920474A (en) * 2021-10-28 2022-01-11 成都信息工程大学 Internet of things system and method for intelligently monitoring citrus planting situation
CN113920107A (en) * 2021-10-29 2022-01-11 西安工程大学 Insulator damage detection method based on improved yolov5 algorithm
CN114037678A (en) * 2021-11-05 2022-02-11 浙江工业大学 Urine visible component detection method and device based on deep learning
CN114092808A (en) * 2021-11-17 2022-02-25 南京工程学院 Crop disease and insect pest detection and prevention device and method based on image and deep learning
CN114140665A (en) * 2021-12-06 2022-03-04 广西师范大学 Dense small target detection method based on improved YOLOv5

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
WEI LIU et al.: "ParseNet: Looking Wider to See Better", arXiv *
杨其晟 et al.: "Detection Method for Apple Flower Growth State Based on Improved YOLOv5", Computer Engineering and Applications *

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115131670A (en) * 2022-09-02 2022-09-30 广州艾米生态人工智能农业有限公司 Intelligent auditing method, system, device and equipment for rice pictures
CN115131670B (en) * 2022-09-02 2022-12-20 广州艾米生态人工智能农业有限公司 Intelligent auditing method, system, device and equipment for rice pictures
CN117152620A (en) * 2023-10-30 2023-12-01 江西立盾光电科技有限公司 Plant growth control method and system following plant state change
CN117152620B (en) * 2023-10-30 2024-02-13 江西立盾光电科技有限公司 Plant growth control method and system following plant state change

Also Published As

Publication number Publication date
CN114332849B (en) 2022-08-16

Similar Documents

Publication Publication Date Title
CN114332849B (en) Crop growth state combined monitoring method and device and storage medium
De Luna et al. Automated image capturing system for deep learning-based tomato plant leaf disease detection and recognition
CN110110595B (en) Farmland image and medicine hypertrophy data analysis method based on satellite remote sensing image
CN109886155B (en) Single-plant rice detection and positioning method, system, equipment and medium based on deep learning
CN114239756B (en) Insect pest detection method and system
CN114818909B (en) Weed detection method and device based on crop growth characteristics
Barreto et al. Automatic UAV-based counting of seedlings in sugar-beet field and extension to maize and strawberry
CN111476149A (en) Plant cultivation control method and system
CN112461828A (en) Intelligent pest and disease damage forecasting and early warning system based on convolutional neural network
CN111967441A (en) Crop disease analysis method based on deep learning
Kwaghtyo et al. Smart farming prediction models for precision agriculture: a comprehensive survey
CN116129260A (en) Forage grass image recognition method based on deep learning
Anitha Mary et al. Scope and recent trends of artificial intelligence in Indian agriculture
Bashier et al. Sesame Seed Disease Detection Using Image Classification
Hati et al. AI-driven pheno-parenting: a deep learning based plant phenotyping trait analysis model on a novel soilless farming dataset
Monica et al. Soil NPK prediction using enhanced genetic algorithm
FAISAL A pest monitoring system for agriculture using deep learning
Jovanovska et al. Integrated iot system for prediction of diseases in the vineyards
CN116453003B (en) Method and system for intelligently identifying rice growth vigor based on unmanned aerial vehicle monitoring
Abdulghani et al. Cyber-Physical System Based Data Mining and Processing Toward Autonomous Agricultural Systems
Kushwaha et al. Identification of Tomato Leaf Disease Prediction Using CNN
Mishra et al. Advancing Agriculture Predictive Models for Farming Suitability Using Machine Learning
Krishna et al. Design and Development of an Agricultural Mobile Application using Machine Learning
Day Computer applications in agriculture and horticulture: a view
Adarsh et al. Deep Learning: A Futuristic Approach to Agriculture

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant