CN111369494A - Winter wheat ear density detection method and device


Info

Publication number
CN111369494A
Authority
CN
China
Prior art keywords: density, sub, winter wheat, ear, neural network
Prior art date
Legal status: Granted
Application number
CN202010082618.2A
Other languages
Chinese (zh)
Other versions
CN111369494B (en)
Inventor
马浚诚 (Ma Juncheng)
杜克明 (Du Keming)
郑飞翔 (Zheng Feixiang)
孙忠富 (Sun Zhongfu)
Current Assignee
Institute of Environment and Sustainable Development in Agriculture of CAAS
Original Assignee
Institute of Environment and Sustainable Development in Agriculture of CAAS
Priority date
Filing date
Publication date
Application filed by Institute of Environment and Sustainable Development in Agriculture of CAAS
Priority to CN202010082618.2A
Publication of CN111369494A
Application granted
Publication of CN111369494B
Legal status: Active
Anticipated expiration

Classifications

    • G06T7/0002 Image analysis; Inspection of images, e.g. flaw detection
    • G06N3/045 Neural networks; Combinations of networks
    • G06N3/08 Neural networks; Learning methods
    • G06T7/73 Determining position or orientation of objects or cameras using feature-based methods
    • G06T2207/20081 Training; Learning
    • G06T2207/20084 Artificial neural networks [ANN]
    • G06T2207/30188 Vegetation; Agriculture


Abstract

The embodiment of the invention provides a method and a device for detecting winter wheat ear density, wherein the method comprises the following steps: acquiring a canopy image to be detected after the flowering phase of winter wheat, and dividing the canopy image to be detected into a plurality of sub-images to be detected; inputting the sub-images to be detected into a convolutional neural network model, and outputting sub-density maps; and determining the winter wheat ear density from the sub-density maps. The convolutional neural network model is trained on sample canopy images and pre-determined ear density maps corresponding to the sample canopy images. The method and the device for detecting winter wheat ear density provided by the embodiment of the invention use artificial intelligence technology for detection, are highly automated, can effectively reduce the manual intervention needed for ear density estimation, lower the application cost and complexity, and effectively improve the accuracy and real-time performance of winter wheat ear density estimation.

Description

Winter wheat ear density detection method and device
Technical Field
The embodiments of the invention relate to the technical field of systems engineering and information technology, and in particular to a method and a device for detecting winter wheat ear density.
Background
Winter wheat ear density is an important index for winter wheat yield estimation and phenotype analysis, and is of significant practical importance. The traditional ear density calculation method relies mainly on manual counting, which consumes a large amount of labor and material resources, is inefficient, and cannot meet the requirements of large-area yield estimation and phenotype analysis. Ear density estimation based on computer vision is low-cost and easy to use, and is currently the most important means of ear density estimation. It mainly comprises the following steps: wheat ear segmentation, binary morphological optimization, and connected-component counting, from which the number of ears in the image is obtained. Ear image segmentation is the core of computer-vision-based ear density estimation, and generally extracts ears at the pixel or super-pixel level based on low-level features such as color and texture.
Because winter wheat canopy images collected in the field environment are easily affected by illumination conditions and complex backgrounds, ear segmentation based on low-level features is susceptible to noise, has limited generalization ability, and is difficult to apply effectively in practice. In addition, the ear density in canopy images acquired in the field environment is high, so the ears occlude each other severely and vary greatly in size and shape, which further challenges ear density estimation. How to overcome the influence of illumination conditions and complex backgrounds, address the noise sensitivity of current ear density estimation methods and the large variation in ear overlap and appearance, and achieve accurate estimation of ear density is therefore an urgent problem to be solved.
Disclosure of Invention
The embodiment of the invention provides a method and a device for detecting winter wheat ear density, which are used for solving the above technical problems in the prior art.
To solve the technical problems, in one aspect, an embodiment of the present invention provides a method for detecting winter wheat ear density, comprising:
acquiring a canopy image to be detected after the flowering phase of winter wheat, and dividing the canopy image to be detected into a plurality of sub-images to be detected;
inputting the sub-images to be detected into a convolutional neural network model, and outputting sub-density maps;
determining the winter wheat ear density from the sub-density maps;
wherein the convolutional neural network model is trained on sample canopy images and pre-determined ear density maps corresponding to the sample canopy images.
Further, determining the winter wheat ear density from the sub-density maps specifically comprises:
calculating the position index of each sub-density map according to the number of pooling layers in the convolutional neural network model;
fusing the output sub-density maps according to their position indexes to obtain an ear density map;
and determining the winter wheat ear density from the ear density map.
Further, the training process of the convolutional neural network model comprises the following steps:
acquiring sample canopy images after the flowering phase of winter wheat;
recording the position information of the wheat ears with point annotations;
generating an ear density map for each sample from the ear position information;
dividing each sample canopy image and its corresponding ear density map into a plurality of sample sub-images and corresponding sub-density maps using a sliding window;
and training the convolutional neural network with the sample sub-images and the corresponding sub-density maps to obtain the convolutional neural network model.
Further, the position information of an ear is the coordinate of the ear center point in the image.
Further, generating the ear density map of a sample from the ear position information specifically comprises:
generating the ear density map of the sample with a geometry-adaptive method according to the ear position information.
Further, training the convolutional neural network with the sample sub-images and the corresponding sub-density maps to obtain the convolutional neural network model specifically comprises:
taking the sample sub-images and the corresponding sub-density maps as the input layer of the convolutional neural network model;
constructing the feature extractor of the convolutional neural network model;
and sequentially connecting the input layer, the feature extractor, a superposition layer for fusing the outputs of the feature extractor, and an output layer for fusing the channels of the superposition layer, to complete the construction of the convolutional neural network model.
Further, the feature extractor comprises four parallel processing modules, each comprising three sequentially connected processing units.
Further, the first processing unit of each processing module comprises a convolutional layer, a rectified linear unit (ReLU) layer and a pooling layer connected in sequence; the second processing unit comprises at least three sequentially connected pairs of convolutional and ReLU layers, with a pooling layer connected only after the first ReLU layer; and the third processing unit comprises at least three sequentially connected pairs of convolutional and ReLU layers.
Within each processing unit, all convolutional layers use convolution kernels of the same size; the numbers of convolution kernels in the successive convolutional layers of the second processing unit increase by a factor of 2, while those of the third processing unit decrease by a factor of 0.5; the second and third processing units contain the same number of convolutional layers; the convolution kernels of the convolutional layers in the first processing unit are larger than those in the second processing unit, which are in turn larger than those in the third processing unit.
Further, the sizes of the convolution kernels in the first processing units decrease from one processing module to the next.
In another aspect, an embodiment of the present invention provides a device for detecting winter wheat ear density, comprising:
an acquisition module for acquiring a canopy image to be detected after the flowering phase of winter wheat and dividing the canopy image to be detected into a plurality of sub-images to be detected;
an output module for inputting the sub-images to be detected into the convolutional neural network model and outputting sub-density maps;
a detection module for determining the winter wheat ear density from the sub-density maps;
wherein the convolutional neural network model is trained on sample canopy images and pre-determined ear density maps corresponding to the sample canopy images.
The method and the device for detecting winter wheat ear density provided by the embodiment of the invention use artificial intelligence technology for detection, are highly automated, can effectively reduce the manual intervention needed for ear density estimation, lower the application cost and complexity, and effectively improve the accuracy and real-time performance of winter wheat ear density estimation.
Drawings
FIG. 1 is a schematic diagram of a winter wheat head density detection method provided by an embodiment of the present invention;
FIG. 2 is a flow chart of the winter wheat head density detection logic provided by the embodiment of the present invention;
FIG. 3 is a schematic view of a winter wheat head density detection device provided by the embodiment of the present invention;
fig. 4 is a schematic structural diagram of an electronic device according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, but not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments of the present invention without making any creative effort, shall fall within the protection scope of the present invention.
Fig. 1 is a schematic view of a winter wheat ear density detection method according to an embodiment of the present invention. As shown in fig. 1, an embodiment of the present invention provides a winter wheat ear density detection method, which is executed by a winter wheat ear density detection device. The method comprises the following steps:
s101, acquiring a canopy image to be detected after the flowering phase of winter wheat, and dividing the canopy image to be detected into a plurality of sub-images to be detected.
Specifically, fig. 2 is a logic flow chart of winter wheat ear density detection provided in the embodiment of the present invention. As shown in fig. 2, a canopy image to be detected after the flowering phase of winter wheat is first obtained and divided into a plurality of sub-images to be detected.
For example, the canopy image to be detected can be acquired at a size of 2500 × 2500 × 3 pixels, and divided into several sub-images to be detected with a sliding window of 600 × 600 pixels and a step of 300 pixels.
The canopy image to be detected is divided into a plurality of sub-images to be detected, and the position index of each sub-image to be detected is recorded.
The position index of a sub-image to be detected is the coordinate of the top-left pixel of the sub-image within the image to be detected, and can be extracted with a first formula;
wherein the first formula is:
Index_(i,j) = I_(i,j)
where Index_(i,j) is the index of the j-th sub-image of the i-th image to be detected, and I_(i,j) is the coordinate of the top-left pixel of the j-th sub-image of the i-th image to be detected within the image to be detected.
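For illustration, a minimal sketch of this sliding-window division and top-left index recording is given below, assuming the canopy image is handled as a NumPy array; the function name, window size and step size follow the example above and are illustrative rather than part of the patented implementation.

```python
import numpy as np

def split_into_sub_images(image, window=600, step=300):
    """Divide an H x W x 3 canopy image into sub-images with a sliding window,
    recording each sub-image's position index Index_(i,j) = I_(i,j), i.e. the
    coordinates of its top-left pixel in the full image (illustrative sketch)."""
    sub_images, indices = [], []
    h, w = image.shape[:2]
    for top in range(0, h - window + 1, step):
        for left in range(0, w - window + 1, step):
            sub_images.append(image[top:top + window, left:left + window])
            indices.append((top, left))  # top-left corner of this sub-image
    return sub_images, indices

# Example: a 2500 x 2500 x 3 canopy image, 600 x 600 window, 300-pixel step
canopy = np.zeros((2500, 2500, 3), dtype=np.uint8)
subs, idx = split_into_sub_images(canopy)
print(len(subs), idx[:3])  # 49 sub-images; indices (0, 0), (0, 300), (0, 600), ...
```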
S102, inputting the sub-images to be detected into a convolutional neural network model, and outputting sub-density maps.
Specifically, after the plurality of sub-images to be detected are obtained, they are input into the convolutional neural network model, and the corresponding sub-density maps are output.
The convolutional neural network model is trained on sample canopy images and pre-determined ear density maps corresponding to the sample canopy images.
S103, determining the winter wheat ear density from the sub-density maps.
Specifically, after the sub-density maps are obtained, the winter wheat ear density is determined from them.
The winter wheat ear density detection method provided by the embodiment of the invention uses artificial intelligence technology for detection, is highly automated, can effectively reduce the manual intervention needed for ear density estimation, lowers the application cost and complexity, and effectively improves the accuracy and real-time performance of winter wheat ear density estimation.
Based on any one of the above embodiments, further, determining the winter wheat ear density from the sub-density maps specifically comprises:
calculating the position index of each sub-density map according to the number of pooling layers in the convolutional neural network model;
fusing the output sub-density maps according to their position indexes to obtain an ear density map;
and determining the winter wheat ear density from the ear density map.
Specifically, in the embodiment of the present invention, determining the winter wheat ear density from the sub-density maps comprises the following steps:
First, the position index of each sub-density map is calculated according to the number of pooling layers in the convolutional neural network model.
To calculate the position index of a sub-density map to be detected, a correction factor is first obtained with a second formula, and the position index of the sub-density map is then obtained with a third formula;
wherein the second formula is:
f = (1/2)^N
where f is the correction factor and N is the number of pooling layers in the convolutional neural network (each 2 × 2 pooling layer halves the spatial resolution).
The third formula is:
Index_m_(i,j) = f × Index_(i,j)
where Index_(i,j) is the index of the j-th sub-image of the i-th image to be detected, and Index_m_(i,j) is the index of the j-th sub-density map of the i-th image to be detected.
Then, the output sub-density maps are fused according to their position indexes to obtain the ear density map.
Finally, the winter wheat ear density is determined from the ear density map.
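A minimal sketch of this index correction and fusion step is given below, assuming 2 × 2 pooling layers and NumPy arrays; how overlapping sub-density maps are combined is not spelled out in the patent, so averaging the overlaps is an assumption, as is obtaining the final ear count by summing the fused density map.

```python
import numpy as np

def fuse_sub_density_maps(sub_density_maps, indices, image_shape, n_pooling=2):
    """Fuse predicted sub-density maps into a full ear density map.
    f = (1/2)**n_pooling converts a sub-image's top-left index to the
    density-map grid (Index_m = f * Index); overlaps are averaged (assumption)."""
    f = 0.5 ** n_pooling
    out_h, out_w = int(image_shape[0] * f), int(image_shape[1] * f)
    acc = np.zeros((out_h, out_w), dtype=np.float64)
    cnt = np.zeros((out_h, out_w), dtype=np.float64)
    for dmap, (top, left) in zip(sub_density_maps, indices):
        r, c = int(top * f), int(left * f)
        h, w = dmap.shape
        acc[r:r + h, c:c + w] += dmap
        cnt[r:r + h, c:c + w] += 1.0
    return acc / np.maximum(cnt, 1.0)

# The ear density (count) then follows from integrating the fused density map:
# ear_count = fuse_sub_density_maps(maps, idx, (2500, 2500)).sum()
```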
The winter wheat ear density detection method provided by the embodiment of the invention uses artificial intelligence technology for detection, is highly automated, can effectively reduce the manual intervention needed for ear density estimation, lowers the application cost and complexity, and effectively improves the accuracy and real-time performance of winter wheat ear density estimation.
Based on any one of the above embodiments, further, the training process of the convolutional neural network model comprises the following steps:
acquiring sample canopy images after the flowering phase of winter wheat;
recording the position information of the wheat ears with point annotations;
generating an ear density map for each sample from the ear position information;
dividing each sample canopy image and its corresponding ear density map into a plurality of sample sub-images and corresponding sub-density maps using a sliding window;
and training the convolutional neural network with the sample sub-images and the corresponding sub-density maps to obtain the convolutional neural network model.
Specifically, a canopy image of winter wheat after flowering collected in a field environment is first acquired as a sample.
After the sample canopy image is acquired, a pre-process may also be performed to resize the sample canopy image, for example, to 2500 × 2500 × 3 pixels.
Then, the position information of the wheat ears is recorded using point annotations.
In a specific application, preferably, the position information of the ear of wheat includes: coordinates of the center point of the ear in the image.
Then, an ear density map of the sample is generated based on the position information of the ear.
The ear density map of the sample is generated with a geometry-adaptive method.
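As an illustration, a common geometry-adaptive construction from the density-map counting literature is sketched below, where each annotated ear point is smoothed with a Gaussian kernel whose width scales with the average distance to its k nearest neighboring ears; the values of k, beta and the single-point fallback sigma are conventional choices and are not specified in the patent.

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.ndimage import gaussian_filter

def geometry_adaptive_density_map(points, shape, k=3, beta=0.3):
    """Build an ear density map from point annotations.
    points: list of (x, y) ear-center coordinates; shape: (H, W) of the image.
    Each point contributes a Gaussian whose sigma is beta times the mean
    distance to its k nearest neighbors (k, beta and the fallback sigma are
    assumptions, not values from the patent)."""
    density = np.zeros(shape, dtype=np.float32)
    if len(points) == 0:
        return density
    tree = cKDTree(points)
    dists, _ = tree.query(points, k=min(k + 1, len(points)))  # includes self at distance 0
    for (x, y), d in zip(points, dists):
        impulse = np.zeros(shape, dtype=np.float32)
        impulse[int(y), int(x)] = 1.0
        sigma = beta * d[1:].mean() if len(points) > 1 else 15.0
        density += gaussian_filter(impulse, sigma)  # each ear integrates to ~1
    return density
```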
Then, the sample canopy image and its corresponding ear density map are divided into a plurality of sample sub-images and corresponding sub-density maps using a sliding window.
For example, the sample image and its corresponding ear density map can be divided into sub-images and corresponding sub-density maps using a sliding window of 600 × 600 pixels with a step of 300 pixels.
And finally, training the convolutional neural network by using the plurality of sample subgraphs and the corresponding sub-density graphs to obtain the convolutional neural network model.
In a specific application, preferably, the convolutional neural network comprises 1 input layer, 1 feature extractor, 1 superposition layer and 1 output layer. The feature extractor comprises 4 parallel processing modules, each comprising 3 sequentially connected processing units.
In a specific application, preferably, in the 1st processing module, the 1st processing unit comprises 1 convolutional layer with 8 kernels of 11 × 11, 1 rectified linear unit (ReLU) layer and 1 pooling layer using a 2 × 2 max-pooling function; the 2nd processing unit comprises 3 convolutional layers with 9 × 9 kernels numbering 16, 32 and 64 respectively, 3 ReLU layers, and 1 pooling layer placed after the first ReLU layer using a 2 × 2 max-pooling function; the 3rd processing unit comprises 3 convolutional layers with 1 × 1 kernels numbering 32, 16 and 8 respectively.
In a specific application, preferably, in the 2nd processing module, the 1st processing unit comprises 1 convolutional layer with 16 kernels of 9 × 9, 1 ReLU layer and 1 pooling layer using a 2 × 2 max-pooling function; the 2nd processing unit comprises 3 convolutional layers with 7 × 7 kernels numbering 32, 64 and 128 respectively, 3 ReLU layers, and 1 pooling layer placed after the first ReLU layer using a 2 × 2 max-pooling function; the 3rd processing unit comprises 3 convolutional layers with 1 × 1 kernels numbering 64, 32 and 16 respectively.
In a specific application, preferably, in the 3rd processing module, the 1st processing unit comprises 1 convolutional layer with 32 kernels of 7 × 7, 1 ReLU layer and 1 pooling layer using a 2 × 2 max-pooling function; the 2nd processing unit comprises 3 convolutional layers with 5 × 5 kernels numbering 64, 128 and 256 respectively, 3 ReLU layers, and 1 pooling layer placed after the first ReLU layer using a 2 × 2 max-pooling function; the 3rd processing unit comprises 3 convolutional layers with 1 × 1 kernels numbering 128, 64 and 32 respectively.
In a specific application, preferably, in the 4th processing module, the 1st processing unit comprises 1 convolutional layer with 64 kernels of 5 × 5, 1 ReLU layer and 1 pooling layer using a 2 × 2 max-pooling function; the 2nd processing unit comprises 3 convolutional layers with 3 × 3 kernels numbering 128, 256 and 512 respectively, 3 ReLU layers, and 1 pooling layer placed after the first ReLU layer using a 2 × 2 max-pooling function; the 3rd processing unit comprises 3 convolutional layers with 1 × 1 kernels numbering 256, 128 and 64 respectively.
In a specific application, preferably, the superposition layer stacks the feature maps output by the 4 processing modules, and the output layer comprises 1 convolutional layer with 1 kernel of 1 × 1 that fuses the feature maps output by the superposition layer to output the ear density map.
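To make the structure above concrete, the following PyTorch sketch assembles the four parallel processing modules, the superposition layer (implemented here as channel concatenation) and the 1 × 1 output convolution; the class and helper names and the "same" padding are assumptions, while the kernel sizes and counts follow the description above.

```python
import torch
import torch.nn as nn

def processing_module(k1, c1, k2, c2_channels, c3_channels):
    """One processing module: unit 1 (conv-ReLU-2x2 max pool), unit 2 (three
    conv-ReLU pairs with a 2x2 max pool after the first ReLU), and unit 3
    (three 1x1 conv-ReLU pairs). 'Same' padding is an assumption."""
    layers = [nn.Conv2d(3, c1, k1, padding=k1 // 2), nn.ReLU(inplace=True), nn.MaxPool2d(2)]
    in_c = c1
    for i, c in enumerate(c2_channels):
        layers += [nn.Conv2d(in_c, c, k2, padding=k2 // 2), nn.ReLU(inplace=True)]
        if i == 0:
            layers.append(nn.MaxPool2d(2))
        in_c = c
    for c in c3_channels:
        layers += [nn.Conv2d(in_c, c, 1), nn.ReLU(inplace=True)]
        in_c = c
    return nn.Sequential(*layers)

class EarDensityNet(nn.Module):
    """Four-branch feature extractor, superposition by channel concatenation,
    and a single 1x1 convolution fusing the channels into a density map."""
    def __init__(self):
        super().__init__()
        self.m1 = processing_module(11, 8, 9, [16, 32, 64], [32, 16, 8])
        self.m2 = processing_module(9, 16, 7, [32, 64, 128], [64, 32, 16])
        self.m3 = processing_module(7, 32, 5, [64, 128, 256], [128, 64, 32])
        self.m4 = processing_module(5, 64, 3, [128, 256, 512], [256, 128, 64])
        self.output = nn.Conv2d(8 + 16 + 32 + 64, 1, kernel_size=1)

    def forward(self, x):
        feats = torch.cat([self.m1(x), self.m2(x), self.m3(x), self.m4(x)], dim=1)
        return self.output(feats)  # sub-density map at 1/4 of the input resolution
```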
In the method for detecting winter wheat ear density provided by the embodiment of the invention, canopy images of winter wheat after flowering collected in a field environment are preprocessed and divided into a sample set and a set to be detected, and the positions of the wheat ears are recorded with point annotations; an ear density map is generated for each sample-set image from the ear position information; each sample canopy image and its corresponding ear density map are divided with a sliding window into sub-images and corresponding sub-density maps, the images of the set to be detected are divided into sub-images, and the position indexes of these sub-images are recorded; the sample sub-images and the corresponding sub-density map data are taken as the input of the model, a convolutional neural network model for ear density estimation is constructed, and model training and validation are performed with the adaptive moment estimation (Adam) algorithm; the sub-images of the set to be detected are then fed to the model to obtain their sub-density maps; the position indexes of these sub-density maps are calculated according to the number of pooling layers in the convolutional neural network model; and the generated sub-density maps are fused according to their position indexes to obtain the ear density map of each image to be detected, from which the corresponding ear density is obtained. The method can effectively reduce the manual intervention needed for ear density estimation, lower the application cost and complexity, and effectively improve the accuracy and real-time performance of winter wheat ear density estimation. By combining image processing, deep learning and other techniques, the method can contribute greatly to the estimation of winter wheat ear density.
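A hypothetical training loop with the adaptive moment estimation (Adam) optimizer is sketched below; the loss function, learning rate, epoch count and the dummy data standing in for the sample sub-images and their 1/4-resolution sub-density maps are all assumptions, not details from the patent.

```python
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

# Dummy stand-ins for sample sub-images (600 x 600) and their sub-density maps,
# which are at 1/4 resolution (150 x 150) because each branch of the network
# contains two 2 x 2 pooling layers.
images = torch.randn(4, 3, 600, 600)
density_maps = torch.rand(4, 1, 150, 150)
train_loader = DataLoader(TensorDataset(images, density_maps), batch_size=2)

model = EarDensityNet()                                    # sketch defined above
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)  # adaptive moment estimation
criterion = nn.MSELoss()                                   # assumed pixel-wise loss

for epoch in range(10):
    for sub_img, sub_density in train_loader:
        pred = model(sub_img)                 # predicted sub-density map
        loss = criterion(pred, sub_density)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
```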
Based on any of the above embodiments, fig. 3 is a schematic diagram of a winter wheat ear density detection device provided in an embodiment of the present invention, and as shown in fig. 3, an embodiment of the present invention provides a winter wheat ear density detection device, which includes an acquisition module 301, an output module 302, and a detection module 303, where:
the acquisition module 301 is used for acquiring a canopy image to be detected after the flowering phase of winter wheat and dividing the canopy image to be detected into a plurality of sub-images to be detected; the output module 302 is used for inputting the sub-images to be detected into the convolutional neural network model and outputting sub-density maps; the detection module 303 is used for determining the winter wheat ear density from the sub-density maps; the convolutional neural network model is trained on sample canopy images and pre-determined ear density maps corresponding to the sample canopy images.
Embodiments of the present invention provide a device for detecting winter wheat ear density, which is used to perform the method described in any one of the above embodiments, and the specific steps of performing the method described in one of the above embodiments by using the device provided in this embodiment are the same as those in the corresponding embodiments, and are not described herein again.
The winter wheat ear density detection device provided by the embodiment of the invention uses artificial intelligence technology for detection, is highly automated, can effectively reduce the manual intervention needed for ear density estimation, lowers the application cost and complexity, and effectively improves the accuracy and real-time performance of winter wheat ear density estimation.
Fig. 4 is a schematic structural diagram of an electronic device according to an embodiment of the present invention. As shown in fig. 4, the electronic device includes: a processor 401, a communication interface 402, a memory 403 and a communication bus 404, wherein the processor 401, the communication interface 402 and the memory 403 communicate with each other through the communication bus 404. The processor 401 may call logic instructions in the memory 403 to perform the following method:
acquiring a canopy image to be detected after the flowering phase of winter wheat, and dividing the canopy image to be detected into a plurality of sub-images to be detected;
inputting the sub-images to be detected into a convolutional neural network model, and outputting sub-density maps;
determining the winter wheat ear density from the sub-density maps;
wherein the convolutional neural network model is trained on sample canopy images and pre-determined ear density maps corresponding to the sample canopy images.
In addition, the logic instructions in the memory may be implemented in the form of software functional units and may be stored in a computer readable storage medium when sold or used as a stand-alone product. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
Further, embodiments of the present invention provide a computer program product comprising a computer program stored on a non-transitory computer-readable storage medium, the computer program comprising program instructions which, when executed by a computer, enable the computer to perform the steps of the above-described method embodiments, for example, including:
acquiring a canopy image to be detected after the flowering phase of winter wheat, and dividing the canopy image to be detected into a plurality of sub-images to be detected;
inputting the sub-images to be detected into a convolutional neural network model, and outputting sub-density maps;
determining the winter wheat ear density from the sub-density maps;
wherein the convolutional neural network model is trained on sample canopy images and pre-determined ear density maps corresponding to the sample canopy images.
Further, an embodiment of the present invention provides a non-transitory computer-readable storage medium, on which a computer program is stored, and when the computer program is executed by a processor, the computer program implements the steps in the above method embodiments, for example, including:
acquiring a canopy image to be detected after the flowering phase of winter wheat, and dividing the canopy image to be detected into a plurality of sub-images to be detected;
inputting the sub-images to be detected into a convolutional neural network model, and outputting sub-density maps;
determining the winter wheat ear density from the sub-density maps;
wherein the convolutional neural network model is trained on sample canopy images and pre-determined ear density maps corresponding to the sample canopy images.
The above-described embodiments of the apparatus are merely illustrative, and the units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the present embodiment. One of ordinary skill in the art can understand and implement it without inventive effort.
Through the above description of the embodiments, those skilled in the art will clearly understand that each embodiment can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware. With this understanding in mind, the above-described technical solutions may be embodied in the form of a software product, which can be stored in a computer-readable storage medium such as ROM/RAM, magnetic disk, optical disk, etc., and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to execute the methods described in the embodiments or some parts of the embodiments.
Finally, it should be noted that: the above examples are only intended to illustrate the technical solution of the present invention, but not to limit it; although the present invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; and such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions of the embodiments of the present invention.

Claims (10)

1. A method for detecting winter wheat ear density, characterized by comprising the following steps:
acquiring a canopy image to be detected after the flowering phase of winter wheat, and dividing the canopy image to be detected into a plurality of sub-images to be detected;
inputting the sub-images to be detected into a convolutional neural network model, and outputting sub-density maps;
determining the winter wheat ear density from the sub-density maps;
wherein the convolutional neural network model is trained on sample canopy images and pre-determined ear density maps corresponding to the sample canopy images.
2. The method for detecting winter wheat ear density according to claim 1, wherein determining the winter wheat ear density from the sub-density maps specifically comprises:
calculating the position index of each sub-density map according to the number of pooling layers in the convolutional neural network model;
fusing the output sub-density maps according to their position indexes to obtain an ear density map;
and determining the winter wheat ear density from the ear density map.
3. The method for detecting winter wheat ear density according to claim 1, wherein the training process of the convolutional neural network model comprises the following steps:
acquiring sample canopy images after the flowering phase of winter wheat;
recording the position information of the wheat ears with point annotations;
generating an ear density map for each sample from the ear position information;
dividing each sample canopy image and its corresponding ear density map into a plurality of sample sub-images and corresponding sub-density maps using a sliding window;
and training the convolutional neural network with the sample sub-images and the corresponding sub-density maps to obtain the convolutional neural network model.
4. The method of claim 3, wherein the position information of the ear is coordinates of an ear center point in the image.
5. The method for detecting winter wheat ear density according to claim 3, wherein generating the ear density map of the sample according to the position information of the ears comprises:
generating the ear density map of the sample with a geometry-adaptive method according to the ear position information.
6. The method for detecting winter wheat ear density according to claim 3, wherein training the convolutional neural network with the sample sub-images and the corresponding sub-density maps to obtain the convolutional neural network model specifically comprises:
taking the sample sub-images and the corresponding sub-density maps as the input layer of the convolutional neural network model;
constructing the feature extractor of the convolutional neural network model;
and sequentially connecting the input layer, the feature extractor, a superposition layer for fusing the outputs of the feature extractor, and an output layer for fusing the channels of the superposition layer, to complete the construction of the convolutional neural network model.
7. The method of claim 6, wherein the feature extractor comprises four parallel processing modules, and each processing module comprises three processing units connected in sequence.
8. The method for detecting winter wheat ear density according to claim 7, wherein the first processing unit of each processing module comprises a convolutional layer, a rectified linear unit (ReLU) layer and a pooling layer connected in sequence; the second processing unit comprises at least three sequentially connected pairs of convolutional and ReLU layers, with a pooling layer connected only after the first ReLU layer; and the third processing unit comprises at least three sequentially connected pairs of convolutional and ReLU layers;
wherein, within each processing unit, all convolutional layers use convolution kernels of the same size; the numbers of convolution kernels in the successive convolutional layers of the second processing unit increase by a factor of 2, while those of the third processing unit decrease by a factor of 0.5; the second and third processing units contain the same number of convolutional layers; the convolution kernels of the convolutional layers in the first processing unit are larger than those in the second processing unit, which are in turn larger than those in the third processing unit.
9. The method of claim 8, wherein the convolution kernels in the convolution layer in the first processing unit of each processing module decrease in size sequentially.
10. A device for detecting winter wheat ear density, characterized by comprising:
an acquisition module for acquiring a canopy image to be detected after the flowering phase of winter wheat and dividing the canopy image to be detected into a plurality of sub-images to be detected;
an output module for inputting the sub-images to be detected into the convolutional neural network model and outputting sub-density maps;
a detection module for determining the winter wheat ear density from the sub-density maps;
wherein the convolutional neural network model is trained on sample canopy images and pre-determined ear density maps corresponding to the sample canopy images.
CN202010082618.2A, priority date 2020-02-07, filing date 2020-02-07: Winter wheat spike density detection method and device (Active; granted as CN111369494B)

Priority Applications (1)

CN202010082618.2A (CN111369494B), priority date 2020-02-07, filing date 2020-02-07: Winter wheat spike density detection method and device

Applications Claiming Priority (1)

CN202010082618.2A (CN111369494B), priority date 2020-02-07, filing date 2020-02-07: Winter wheat spike density detection method and device

Publications (2)

Publication Number Publication Date
CN111369494A (en) 2020-07-03
CN111369494B CN111369494B (en) 2023-05-02

Family

ID=71211420

Family Applications (1)

CN202010082618.2A (Active, granted as CN111369494B): Winter wheat spike density detection method and device

Country Status (1)

Country Link
CN (1) CN111369494B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112017986A (en) * 2020-10-21 2020-12-01 季华实验室 Semiconductor product defect detection method and device, electronic equipment and storage medium
CN112115988A (en) * 2020-09-03 2020-12-22 中国农业大学 Wheat ear counting method and device and self-walking trolley

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105842164A (en) * 2016-05-06 2016-08-10 北京农业信息技术研究中心 Wheat spike number measuring device and use method thereof
WO2018234733A1 (en) * 2017-06-19 2018-12-27 Earlham Institute Data processing of images of a crop
CN109145848A (en) * 2018-08-30 2019-01-04 西京学院 A kind of wheat head method of counting
CN109740721A (en) * 2018-12-19 2019-05-10 中国农业大学 Wheat head method of counting and device

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105842164A (en) * 2016-05-06 2016-08-10 北京农业信息技术研究中心 Wheat spike number measuring device and use method thereof
WO2018234733A1 (en) * 2017-06-19 2018-12-27 Earlham Institute Data processing of images of a crop
CN109145848A (en) * 2018-08-30 2019-01-04 西京学院 A kind of wheat head method of counting
CN109740721A (en) * 2018-12-19 2019-05-10 中国农业大学 Wheat head method of counting and device

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
DAOYONG WANG et al.: "Combined Use of FCN and Harris Corner Detection for Counting Wheat Ears in Field Conditions" *
JUNCHENG MA et al.: "Segmenting ears of winter wheat at flowering stage using digital images and deep learning" *
ZHANG Lingxian et al.: "Detection and counting system for winter wheat ears based on convolutional neural network" (基于卷积神经网络的冬小麦麦穗检测计数系统) *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112115988A (en) * 2020-09-03 2020-12-22 中国农业大学 Wheat ear counting method and device and self-walking trolley
CN112115988B (en) * 2020-09-03 2024-02-02 中国农业大学 Wheat ear counting method and device and self-walking trolley
CN112017986A (en) * 2020-10-21 2020-12-01 季华实验室 Semiconductor product defect detection method and device, electronic equipment and storage medium

Also Published As

Publication number Publication date
CN111369494B (en) 2023-05-02

Similar Documents

Publication Publication Date Title
Zhang et al. Saliency detection based on self-adaptive multiple feature fusion for remote sensing images
CN106327507B (en) A kind of color image conspicuousness detection method based on background and foreground information
CN107909015A (en) Hyperspectral image classification method based on convolutional neural networks and empty spectrum information fusion
CN112861690B (en) Multi-method fused remote sensing image change detection method and system
CN106530271B (en) A kind of infrared image conspicuousness detection method
CN111738344A (en) Rapid target detection method based on multi-scale fusion
CN113627257B (en) Detection method, detection system, device and storage medium
CN110910408A (en) Image segmentation method and device, electronic equipment and readable storage medium
CN108830856B (en) GA automatic segmentation method based on time series SD-OCT retina image
CN111080670A (en) Image extraction method, device, equipment and storage medium
WO2023151237A1 (en) Face pose estimation method and apparatus, electronic device, and storage medium
CN111369494B (en) Winter wheat spike density detection method and device
CN110298817A (en) Object statistical method, device, equipment and storage medium based on image procossing
Yu et al. Segmentation and density statistics of mariculture cages from remote sensing images using mask R-CNN
CN110007764B (en) Gesture skeleton recognition method, device and system and storage medium
Sun et al. Martial arts routine difficulty action technology VR image target real-time extraction simulation
CN118096584A (en) Ultrasonic image enhancement method based on context awareness and multi-scale diffusion network
CN108597589B (en) Model generation method, target detection method and medical imaging system
Yu et al. An automatic detection and counting method for fish lateral line scales of underwater fish based on improved YOLOv5
CN112183366A (en) High-voltage power line bird nest detection method, system and machine readable medium
CN117351414A (en) Crowd density estimation method based on deep neural network
CN107292340A (en) Lateral line scales recognition methods based on convolutional neural networks
CN111723688A (en) Human body action recognition result evaluation method and device and electronic equipment
CN116823983A (en) One-to-many style handwriting picture generation method based on style collection mechanism
CN103955936B (en) Significant object detection method based on stack-typed denoising self-coding machine

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant