CN112597855A - Crop lodging degree identification method and device

Crop lodging degree identification method and device

Info

Publication number
CN112597855A
Authority
CN
China
Prior art keywords
area
unmanned aerial vehicle
remote sensing
image
Prior art date
Legal status
Granted
Application number
CN202011484271.0A
Other languages
Chinese (zh)
Other versions
CN112597855B (en)
Inventor
苏伟
陶万成
王新盛
黄健熙
谢茈萱
张颖
Current Assignee
China Agricultural University
Original Assignee
China Agricultural University
Priority date
Filing date
Publication date
Application filed by China Agricultural University
Priority to CN202011484271.0A
Publication of CN112597855A
Application granted
Publication of CN112597855B
Status: Active
Anticipated expiration

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 - Scenes; Scene-specific elements
    • G06V20/10 - Terrestrial scenes
    • G06V20/188 - Vegetation
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 - Pattern recognition
    • G06F18/20 - Analysing
    • G06F18/21 - Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/214 - Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 - Pattern recognition
    • G06F18/20 - Analysing
    • G06F18/24 - Classification techniques
    • G06F18/241 - Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 - Arrangements for image or video recognition or understanding
    • G06V10/20 - Image preprocessing
    • G06V10/25 - Determination of region of interest [ROI] or a volume of interest [VOI]

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • General Physics & Mathematics (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • General Engineering & Computer Science (AREA)
  • Artificial Intelligence (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Multimedia (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Image Processing (AREA)
  • Image Analysis (AREA)

Abstract

The invention provides a method and a device for identifying the lodging degree of crops. The method comprises: acquiring remote sensing image features of an area to be identified from a remote sensing image of the area to be identified; and inputting the remote sensing image features of the area to be identified into a remote sensing image recognition model to obtain the crop lodging degree of the area to be identified. The remote sensing image recognition model is trained on the remote sensing image features of a sample area and the crop lodging degree of the sample area, and the crop lodging degree of the sample area is obtained in advance from an unmanned aerial vehicle image of the sample area. The method and the device can quickly and conveniently acquire remote sensing images of a large area through remote sensing technology and identify them with the trained remote sensing image recognition model, so the crop lodging degree of a large area can be identified more efficiently and the time cost of identifying it can be reduced.

Description

Crop lodging degree identification method and device
Technical Field
The invention relates to the technical field of image processing, in particular to a crop lodging degree identification method and device.
Background
Lodging is the phenomenon in which vertically growing crops tilt over in patches, or even lie flat on the ground, and it reduces both the yield and the quality of the crop. The lodging degree can be divided into several grades according to the inclination angle of the crop stems. Timely and accurate acquisition of the crop lodging degree is very important for estimating crop yield loss, making production decisions, and so on.
Conventional methods for obtaining the crop lodging degree fall into two categories: first, manual measurement in the area to be identified; second, acquiring images of the area to be identified with an unmanned aerial vehicle and then obtaining the crop lodging degree through image recognition. When the area to be identified is large, the former requires a high time cost for manual measurement. In the latter, because the number of images the unmanned aerial vehicle must acquire grows with the size of the area to be identified, a large area also requires a large number of unmanned aerial vehicle images and therefore a high time cost. In summary, both methods for obtaining the crop lodging degree are inefficient for large areas.
Disclosure of Invention
The invention provides a crop lodging degree identification method and device, which are used for overcoming the prior-art defect that obtaining the crop lodging degree is inefficient when the area to be identified is large, and for achieving efficient identification of the crop lodging degree.
The invention provides a crop lodging degree identification method, which comprises the following steps:
acquiring the remote sensing image characteristics of the area to be identified according to the remote sensing image of the area to be identified;
inputting the remote sensing image characteristics of the area to be identified into a remote sensing image identification model to obtain the lodging degree of crops in the area to be identified;
the remote sensing image recognition model is obtained by training based on the remote sensing image characteristics of the sample area and the crop lodging degree of the sample area; the crop lodging degree of the sample area is obtained in advance according to the unmanned aerial vehicle image of the sample area.
According to the crop lodging degree identification method provided by the invention, before the remote sensing image characteristics of the area to be identified are input into the remote sensing image identification model to obtain the crop lodging degree of the area to be identified, the method further comprises the following steps:
acquiring unmanned aerial vehicle image characteristics of the sample region according to the unmanned aerial vehicle image of the sample region;
inputting the unmanned aerial vehicle image characteristics of the sample area into an unmanned aerial vehicle image identification model, and acquiring a crop lodging spatial distribution map of the sample area;
acquiring the lodging degree of the crops in the sample area according to the lodging space distribution map of the crops;
the unmanned aerial vehicle image recognition model is obtained by training unmanned aerial vehicle image features and corresponding unmanned aerial vehicle image recognition labels in a modeling area; the unmanned aerial vehicle image identification tag is predetermined according to the unmanned aerial vehicle image in the modeling area and corresponds to the unmanned aerial vehicle image in the modeling area one to one.
According to the crop lodging degree identification method provided by the invention, the crop lodging degree of the sample area is obtained according to the crop lodging spatial distribution map, and the method specifically comprises the following steps:
acquiring the size of a window according to the remote sensing image of the sample region and the resolution of the unmanned aerial vehicle image of the sample region;
and sliding the window with the size in the crop lodging space distribution map according to a preset step length, and acquiring the crop lodging degree of the sample area according to the lodging degree of crops in the window after each sliding.
According to the crop lodging degree identification method provided by the invention, the unmanned aerial vehicle image characteristics of the sample area are obtained according to the unmanned aerial vehicle image of the sample area, and the method specifically comprises the following steps:
acquiring unmanned aerial vehicle image characteristics of the original sample region according to the unmanned aerial vehicle image of the original sample region;
screening the original sample regions through a boxplot based on unmanned aerial vehicle image characteristics of the original sample regions, and screening out a plurality of regions in the original sample regions as the sample regions;
and acquiring the unmanned aerial vehicle image characteristics of the sample region according to the unmanned aerial vehicle image characteristics of the original sample region.
According to the crop lodging degree identification method provided by the invention, the unmanned aerial vehicle image characteristics comprise:
any number of vegetation index features, texture features, and raw band reflectance values.
According to the crop lodging degree identification method provided by the invention, the texture features of the original sample region are obtained according to the unmanned aerial vehicle image of the original sample region, and the method specifically comprises the following steps:
acquiring an unmanned aerial vehicle principal component image of the original sample region according to the unmanned aerial vehicle image and the transformation matrix of the original sample region;
acquiring a gray level co-occurrence matrix and a pixel shape index according to the unmanned aerial vehicle main component image of the original sample region, and acquiring texture characteristics of the original sample region according to the gray level co-occurrence matrix and the pixel shape index;
wherein the transformation matrix is obtained based on drone imagery of the modeled region.
According to the crop lodging degree identification method provided by the invention, before the unmanned aerial vehicle image features of the sample area are input into the unmanned aerial vehicle image identification model and the crop lodging spatial distribution map of the sample area is obtained, the method further comprises the following steps:
acquiring unmanned aerial vehicle image characteristics of an original modeling area according to the unmanned aerial vehicle image of the original modeling area;
screening the original modeling area through a box plot based on the unmanned aerial vehicle image characteristics of the original modeling area, and screening out a plurality of areas in the original modeling area as the modeling area;
acquiring the unmanned aerial vehicle image characteristics of the modeling area according to the unmanned aerial vehicle image characteristics of the original modeling area;
training according to the unmanned aerial vehicle image features of the modeling area and the corresponding unmanned aerial vehicle image recognition labels based on an Xgboost classification method to obtain the unmanned aerial vehicle image recognition model.
The invention also provides a device for identifying the lodging degree of crops, which comprises:
the image acquisition module is used for acquiring the remote sensing image characteristics of the area to be identified according to the remote sensing image of the area to be identified;
the image identification module is used for inputting the remote sensing image characteristics of the area to be identified into a remote sensing image identification model and acquiring the lodging degree of crops in the area to be identified;
the remote sensing image recognition model is obtained by training based on the remote sensing image characteristics of the sample area and the crop lodging degree of the sample area; the crop lodging degree of the sample area is obtained in advance according to the unmanned aerial vehicle image of the sample area.
The invention also provides an electronic device, which comprises a memory, a processor and a computer program stored on the memory and capable of running on the processor, wherein the processor executes the program to realize the steps of the crop lodging degree identification method.
The present invention also provides a non-transitory computer readable storage medium having stored thereon a computer program which, when executed by a processor, performs the steps of the method for identifying a degree of crop lodging as in any of the above-mentioned methods.
The invention provides a crop lodging degree identification method and device. An unmanned aerial vehicle image of a sample area is obtained by an unmanned aerial vehicle, and the crop lodging degree of the sample area is obtained from that image. A remote sensing image recognition model is then trained on the crop lodging degree of the sample area and the remote sensing image features of the sample area, and the crop lodging degree of the area to be identified is obtained by inputting the remote sensing image features of the area to be identified into the trained model. Because remote sensing images of a large area can be acquired quickly and conveniently through remote sensing technology and recognized with the trained remote sensing image recognition model, the crop lodging degree of a large area can be identified more efficiently and the time cost of identifying it can be reduced.
Drawings
In order to more clearly illustrate the technical solutions of the present invention or the prior art, the drawings needed for the description of the embodiments or the prior art will be briefly described below, and it is obvious that the drawings in the following description are some embodiments of the present invention, and those skilled in the art can also obtain other drawings according to the drawings without creative efforts.
FIG. 1 is a schematic flow chart of a crop lodging degree identification method provided by the invention;
FIG. 2 is a schematic structural view of a crop lodging degree recognition device provided by the invention;
fig. 3 is a schematic structural diagram of an electronic device provided in the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention clearer, the technical solutions of the present invention will be clearly and completely described below with reference to the accompanying drawings, and it is obvious that the described embodiments are some, but not all embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
In the description of the present invention, it should be noted that, unless otherwise explicitly specified or limited, the terms "mounted," "connected," and "connected" are to be construed broadly, e.g., as meaning either a fixed connection, a removable connection, or an integral connection; can be mechanically or electrically connected; they may be connected directly or indirectly through intervening media, or they may be interconnected between two elements. The specific meanings of the above terms in the present invention can be understood in specific cases to those skilled in the art.
In order to overcome the problems in the prior art, the invention provides a crop lodging degree identification method and a crop lodging degree identification device.
Fig. 1 is a schematic flow chart of a crop lodging degree identification method provided by the invention. The crop lodging degree identification method of the present invention is described below with reference to fig. 1. As shown in fig. 1, the method includes: step 101, obtaining the remote sensing image characteristics of the area to be identified according to the remote sensing image of the area to be identified.
The remote sensing technology has the advantages of wide observation range, large information quantity, quick information acquisition, short updating period, manpower and material resource saving, few man-made interference factors and the like. The remote sensing image of the area to be identified can be conveniently, quickly and accurately obtained through the remote sensing technology.
The high-resolution remote sensing image is a remote sensing image with high resolution. The high-resolution remote sensing image also has the characteristics of rich ground feature texture information, multiple imaging spectrum wave bands, short revisit time and the like. In agricultural remote sensing application, the high-resolution remote sensing image can be used for crop growth analysis, monitoring of surface change of a crop growth area and the like.
Preferably, the remote sensing image of the area to be identified is a high-resolution remote sensing image.
Specifically, a high-resolution remote sensing image of the area to be identified can be acquired through a high-resolution satellite. The high-resolution satellite may be, for example: an IKONOS satellite, a GeoEye satellite, a QuickBird satellite, a WorldView series satellite, a Chinese Gaofen (GF) series satellite, and the like.
In the embodiment of the invention, the remote sensing image can be obtained through the Gaofen-1B satellite (GF-1B).
It should be noted that the area to be identified is a relatively large area, and it may contain crops whose growth state is either non-lodged or lodged.
It should also be noted that the crop in the practice of the present invention may be maize, and may also be rice, wheat, or another crop.
Specifically, after the original multispectral remote sensing image is acquired by the PMS sensor of a Gaofen-1 series satellite, image preprocessing is performed on the original multispectral remote sensing image to obtain the multispectral remote sensing image of the area to be identified.
The method for image preprocessing of the original multispectral remote sensing image can comprise the following steps: image fusion, image framing, linear stretching, image filtering processing and the like.
The original remote sensing images with different spatial resolutions and spectral resolutions can be converted into remote sensing images with high spatial resolutions and high spectral resolutions through image fusion, and spatial information of high-resolution images and spectral features of low-resolution multispectral images of the original remote sensing images can be reserved. The original remote sensing image can be divided into a plurality of remote sensing images with fixed sizes through image framing. The original remote sensing image can be subjected to image enhancement through linear stretching, so that the characteristics of the remote sensing image can be acquired more accurately according to the remote sensing image. Through image filtering processing, the original remote sensing image can be subjected to noise suppression, and meanwhile, the detail characteristics in the original remote sensing image can be well kept.
The embodiment of the invention can perform image fusion on the original multispectral remote sensing image through the Gram-Schmidt Pan Sharpening algorithm and resample the multispectral remote sensing image into the remote sensing image of the area to be identified with the resolution of 2 m.
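As an illustration of the preprocessing steps described above, the following sketch applies a 2% linear stretch and a simple filtering step to a single band with NumPy and SciPy; the percentile values, filter size, and function names are illustrative assumptions rather than parameters specified by the invention.

```python
import numpy as np
from scipy.ndimage import median_filter

def linear_stretch(band, low_pct=2, high_pct=98):
    """Illustrative 2% linear stretch of one band to the 0-255 range."""
    lo, hi = np.percentile(band, [low_pct, high_pct])
    stretched = np.clip((band - lo) / (hi - lo), 0.0, 1.0)
    return (stretched * 255).astype(np.uint8)

def preprocess_band(band):
    """Image enhancement (linear stretch) followed by noise suppression (median filter)."""
    enhanced = linear_stretch(band)
    return median_filter(enhanced, size=3)
```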
After the multispectral remote sensing image of the area to be identified is subjected to image preprocessing, the remote sensing image of the area to be identified can be obtained. According to the remote sensing image of the area to be identified, the remote sensing image characteristics of the area to be identified can be obtained.
Different characteristic parameters can be selected as the remote sensing image characteristics of the area to be identified according to actual conditions. The specific type of the remote sensing image feature of the region to be identified is not particularly limited in the embodiment of the present invention.
The method for obtaining the remote sensing image characteristics of the to-be-identified region according to the remote sensing image of the to-be-identified region can be determined according to the specific type of the remote sensing image characteristics of the to-be-identified region, and is not particularly limited in the embodiment of the present invention. For example, if the remote-sensing image feature of the area to be identified includes a vegetation index, the normalized vegetation index, the green-band vegetation index, the expanded enhanced vegetation index, the difference vegetation index and the normalized greenness vegetation index of the area to be identified may be calculated from the remote-sensing image of the area to be identified, and the various vegetation indexes may be used as the remote-sensing image feature of the area to be identified.
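For instance, the normalized vegetation index and the difference vegetation index could be computed from the red and near-infrared bands of the preprocessed remote sensing image roughly as in the following sketch; the array names and the random stand-in data are illustrative assumptions.

```python
import numpy as np

def ndvi(nir, red, eps=1e-6):
    """Normalized vegetation index: (NIR - Red) / (NIR + Red)."""
    return (nir - red) / (nir + red + eps)

def dvi(nir, red):
    """Difference vegetation index: NIR - Red."""
    return nir - red

# Illustrative stand-ins for the red and near-infrared reflectance bands of the
# remote sensing image of the area to be identified (real data would be loaded
# from the preprocessed imagery).
red_band = np.random.rand(512, 512).astype(np.float32)
nir_band = np.random.rand(512, 512).astype(np.float32)

# Per-pixel index maps; their statistics can serve as remote sensing image features.
ndvi_map, dvi_map = ndvi(nir_band, red_band), dvi(nir_band, red_band)
print(float(ndvi_map.mean()), float(dvi_map.mean()))
```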
And 102, inputting the remote sensing image characteristics of the area to be identified into a remote sensing image identification model, and acquiring the lodging degree of crops in the area to be identified.
The remote sensing image recognition model is obtained by training based on the remote sensing image characteristics of the sample area and the lodging degree of crops in the sample area; the crop lodging degree of the sample area is obtained in advance according to the unmanned aerial vehicle image of the sample area.
When the area to be identified is large, its remote sensing image can be acquired quickly and conveniently through remote sensing technology. If, instead, unmanned aerial vehicle images of a large area to be identified were obtained by unmanned aerial vehicle photography, a large number of unmanned aerial vehicle images would have to be captured, which requires more time and equipment cost than acquiring a remote sensing image of the area to be identified through remote sensing technology.
On the other hand, compared with remote sensing images, unmanned aerial vehicle images have advantages such as high definition, high spatial and temporal resolution, large scale, small coverage area, high timeliness, and strong operability during shooting. For an area of smaller size, the crop lodging degree of that area can be obtained more accurately from unmanned aerial vehicle images.
According to the embodiment of the invention, after the unmanned aerial vehicle shoots the unmanned aerial vehicle image of the sample area with a smaller area, the crop lodging degree of the sample area is more accurately obtained according to the unmanned aerial vehicle image of the sample area, and the remote sensing image identification model is obtained based on the crop lodging degree of the sample area and the remote sensing image characteristic training of the sample area. Based on the trained remote sensing image recognition model, a large number of remote sensing images acquired through a remote sensing technology can be recognized, and therefore the lodging degree of crops in a large area to be recognized can be acquired more efficiently.
It should be noted that, in the embodiment of the present invention, the drone image of the sample area is a multispectral drone image. By image preprocessing of the multispectral unmanned aerial vehicle image of the sample region, the unmanned aerial vehicle image of the sample region can be obtained.
It should be noted that the sample area is a relatively small area, and the sample area may include a crop whose growth state is not lodging or lodging, and the crop growing in the sample area is of the same variety as the crop growing in the area to be identified.
The crop lodging degree of the sample area can be obtained in advance according to the unmanned aerial vehicle image of the sample area based on a deep learning algorithm. It can also be pre-acquired by drone image visual interpretation of the sample area.
It should be noted that the degree of crop lodging in the sample region can be described by the ratio of the area of crop lodging in the sample region to the area of the sample region.
Based on the correspondence between the remote sensing images and the unmanned aerial vehicle images of the different areas, after the crop lodging degree of the sample area is obtained, it can be associated with the remote sensing image features of the sample area to obtain the recognition labels corresponding to the remote sensing image features of the sample area.
And training according to the remote sensing image characteristics of the sample region and the identification labels corresponding to the remote sensing image characteristics of the sample region based on a deep learning algorithm to obtain the remote sensing image identification model.
It should be noted that the remote sensing image recognition model can be obtained by training according to the remote sensing image characteristics of the sample region and the recognition labels corresponding to the remote sensing image characteristics of the sample region based on the Xgboost classification method. It can be understood that the remote sensing image recognition model can also be obtained based on training of other deep learning algorithms.
The remote sensing image recognition model describes the relation between the remote sensing image characteristics and the crop lodging degree, the remote sensing image characteristics of the area to be recognized are input into the trained remote sensing image recognition model, and the crop lodging degree of the area to be recognized can be obtained.
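A minimal sketch of this training and prediction step is given below, using the xgboost Python package and treating the crop lodging degree (an area ratio) as a regression target with XGBRegressor; the invention describes training with an Xgboost classification method, so this regression formulation, the hyperparameters, and the variable names are simplifying assumptions for illustration only.

```python
import numpy as np
import xgboost as xgb

# X_sample: remote sensing image features of the sample area (one row per pixel/block);
# y_sample: crop lodging degrees derived from the drone imagery (ratios in [0, 1]).
# Random stand-ins are used here in place of real data.
X_sample = np.random.rand(200, 10)
y_sample = np.random.rand(200)

model = xgb.XGBRegressor(n_estimators=200, max_depth=4, learning_rate=0.1)
model.fit(X_sample, y_sample)

# X_target: remote sensing image features of the area to be identified.
X_target = np.random.rand(50, 10)
lodging_degree = model.predict(X_target)  # estimated lodging degree per pixel/block
```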
In the embodiment of the invention, the unmanned aerial vehicle image of the sample area is obtained by an unmanned aerial vehicle, and the crop lodging degree of the sample area is obtained from that image. The remote sensing image recognition model is trained on the crop lodging degree of the sample area and the remote sensing image features of the sample area, and the crop lodging degree of the area to be identified is obtained by inputting the remote sensing image features of the area to be identified into the trained model. Remote sensing images of a large area can thus be acquired quickly and conveniently through remote sensing technology and recognized with the trained remote sensing image recognition model, so the crop lodging degree of a large area can be identified more efficiently and the time cost of identifying it can be reduced.
Based on the content of each embodiment, the remote sensing image characteristics of the region to be identified are input into the remote sensing image identification model, and before the crop lodging degree of the region to be identified is obtained, the method further comprises the following steps: and acquiring the unmanned aerial vehicle image characteristics of the sample region according to the unmanned aerial vehicle image of the sample region.
In the embodiment of the invention, the original multispectral unmanned aerial vehicle image is spliced and resampled into the unmanned aerial vehicle image of the sample area with the resolution of 0.05 m.
Different parameters can be selected as the unmanned aerial vehicle image characteristics of the sample area according to actual conditions. The specific type of the drone image feature of the sample region is not particularly limited in the embodiments of the present invention.
The method for obtaining the image feature of the unmanned aerial vehicle in the sample region according to the unmanned aerial vehicle image in the sample region can be determined according to the specific type of the image feature of the unmanned aerial vehicle in the sample region, and is not particularly limited in the embodiment of the present invention. For example, if the drone image feature of the sample area includes a vegetation index, the normalized vegetation index, the green band vegetation index, the extended enhanced vegetation index, the difference vegetation index, and the normalized greenness vegetation index of the sample area may be calculated from the drone image of the sample area, and the various vegetation indexes may be used as the drone image feature of the sample area.
And inputting the unmanned aerial vehicle image characteristics of the sample area into the unmanned aerial vehicle image recognition model, and acquiring the crop lodging spatial distribution map of the sample area.
The unmanned aerial vehicle image recognition model is obtained by training unmanned aerial vehicle image features and corresponding unmanned aerial vehicle image recognition labels in a modeling area; the unmanned aerial vehicle image identification tag is predetermined according to the unmanned aerial vehicle image in the modeling area and corresponds to the unmanned aerial vehicle image in the modeling area one by one.
It should be noted that the unmanned aerial vehicle image in the modeling area in the embodiment of the present invention is a multispectral unmanned aerial vehicle image captured by an unmanned aerial vehicle. By image preprocessing of the multispectral unmanned aerial vehicle image in the modeling area, the unmanned aerial vehicle image in the modeling area can be obtained.
In the embodiment of the invention, the original multispectral unmanned aerial vehicle image is spliced and resampled into the unmanned aerial vehicle image of the modeling area with the resolution of 0.05 m.
It should be noted that the modeling region is a region having a relatively small area, and the modeling region may be the same as the sample region. The crop with the growth state of no lodging or lodging can be included in the modeling area, and the varieties of the crops growing in the modeling area, the sample area and the area to be identified are the same.
Note that, according to the drone image of the modeling area, the type of the acquired drone image feature of the modeling area may be the same as the type of the drone image feature of the sample area. The method for obtaining the image features of the unmanned aerial vehicle in the modeling region may also be the same as the method for obtaining the image features of the unmanned aerial vehicle in the sample region, and details are not repeated in the embodiment of the present invention.
Based on the unmanned aerial vehicle image of the modeling area, the crop lodging condition of the modeling area can be obtained through visual interpretation, and that condition can then be associated with the unmanned aerial vehicle image features of the modeling area to obtain the unmanned aerial vehicle image recognition labels corresponding to the unmanned aerial vehicle image features of the modeling area.
Based on the deep learning algorithm, the unmanned aerial vehicle image recognition model can be obtained through training according to the unmanned aerial vehicle image features of the modeling area and the corresponding unmanned aerial vehicle image recognition labels.
The unmanned aerial vehicle image recognition model describes the relation between the unmanned aerial vehicle image characteristics and the crop lodging degree, the unmanned aerial vehicle image characteristics of the sample area are input into the trained unmanned aerial vehicle image recognition model, and the crop lodging space distribution map of the sample area can be obtained.
It should be noted that the spatial distribution map of the lodging of the crop in the sample area can represent the spatial distribution of the lodging crop in the drone image of the sample area.
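The step of turning per-pixel predictions into such a spatial distribution map might look like the following sketch, in which a previously trained classifier labels each pixel of the unmanned aerial vehicle image as lodged (1) or non-lodged (0) and the flat predictions are reshaped back onto the image grid; the function and variable names are illustrative assumptions.

```python
import numpy as np

def lodging_distribution_map(clf, feature_stack):
    """feature_stack: (height, width, n_features) array of drone image features.
    Returns a (height, width) map with 1 for lodged pixels and 0 otherwise."""
    h, w, n_feat = feature_stack.shape
    flat = feature_stack.reshape(-1, n_feat)  # one row of features per pixel
    labels = clf.predict(flat)                # 0 = non-lodged, 1 = lodged
    return labels.reshape(h, w).astype(np.uint8)
```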
And acquiring the lodging degree of the crops in the sample area according to the lodging space distribution map of the crops.
Specifically, the ratio of the area occupied by lodged crops in the crop lodging spatial distribution map of the sample area to the area of the sample area is calculated, and this area ratio is taken as the crop lodging degree of the sample area.
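In code, this area ratio reduces to a single computation on the spatial distribution map, as in the following sketch (the array name and example values are illustrative assumptions):

```python
import numpy as np

# lodging_map: (height, width) array with 1 where crops are lodged, 0 elsewhere.
lodging_map = np.zeros((400, 400), dtype=np.uint8)
lodging_map[:100, :200] = 1

lodging_degree = np.count_nonzero(lodging_map) / lodging_map.size  # 0.125 here
```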
In the embodiment of the invention, the unmanned aerial vehicle image of the modeling area is acquired by the unmanned aerial vehicle and used to train the unmanned aerial vehicle image recognition model. The crop lodging degree of the sample area is then obtained from the unmanned aerial vehicle image features of the sample area and the trained unmanned aerial vehicle image recognition model, and the remote sensing image recognition model can be trained on the crop lodging degree of the sample area and the remote sensing image features of the sample area. As a result, the crop lodging degree of a large area to be identified can be recognized more efficiently with the trained remote sensing image recognition model, and the time cost of identifying the crop lodging degree of a large area is reduced.
Based on the content of each embodiment, the obtaining of the crop lodging degree of the sample area according to the crop lodging spatial distribution map specifically includes: and acquiring the size of the window according to the remote sensing image of the sample region and the resolution of the unmanned aerial vehicle image of the sample region.
Specifically, the spatial resolution of the remote sensing image of the sample area is coarser than that of the unmanned aerial vehicle image of the sample area, and the size of a square window can be obtained from the ratio between the two resolutions. The side length of the window may be that ratio in pixels, or an integer multiple of it, so that one window in the unmanned aerial vehicle image corresponds to one pixel or one small block in the remote sensing image.
For example, the resolution of the remote sensing image of the sample area is 2 m and the resolution of the unmanned aerial vehicle image of the sample area is 0.05 m, so the pixel size of the remote sensing image is 40 times that of the unmanned aerial vehicle image, and a length of 40 pixels in the unmanned aerial vehicle image corresponds to 1 pixel in the remote sensing image. A square window of 40 × 40 pixels can therefore be used, or a square window whose side is an integer multiple of 40 pixels, for example 80 × 80, 120 × 120, or 160 × 160 pixels.
And sliding the window with the size in the crop lodging space distribution map according to a preset step length, and acquiring the crop lodging degree of the sample area according to the lodging degree of crops in the window after each sliding.
It should be noted that the preset step size is the same as the size of the window.
And sliding the window with the determined size in a crop lodging space distribution map obtained according to the unmanned aerial vehicle image of the sample area according to a preset step length, and calculating the lodging degree of crops in the window after each sliding. And mapping the calculated lodging degree of the crops in the window after each sliding to corresponding pixel points or small areas in the remote sensing image of the sample area, so as to obtain the lodging degree of the crops in each pixel point or small area in the remote sensing image of the sample area. According to the crop lodging degree of each pixel point or small area in the remote sensing image of the sample area, the crop lodging degree of the sample area can be obtained.
It should be noted that the lodging degree of the crop in the window after each sliding is obtained by calculating the area ratio of the lodging area of the crop in the window after each sliding to the area of the window.
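A sketch of this windowing step is given below, assuming the step length equals the window size (non-overlapping windows) and that the map dimensions are exact multiples of the window size; the reshape-based aggregation is just one convenient way to compute the per-window lodged-area ratio.

```python
import numpy as np

def windowed_lodging_degree(lodging_map, window=40):
    """Aggregate a fine drone-scale lodging map (1 = lodged) onto the coarser
    remote sensing grid by computing the lodged-area ratio in each window."""
    h, w = lodging_map.shape
    assert h % window == 0 and w % window == 0, "map must tile exactly"
    blocks = lodging_map.reshape(h // window, window, w // window, window)
    # mean over each window = lodged pixels / window area
    return blocks.mean(axis=(1, 3))

# Example: a 2 m remote sensing pixel covers 40 x 40 drone pixels at 0.05 m.
drone_map = np.random.randint(0, 2, size=(400, 400))
coarse_degree = windowed_lodging_degree(drone_map, window=40)  # shape (10, 10)
```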
It should be noted that, after the crop lodging degree of the sample region is obtained, the crop lodging degree of the sample region may be corresponding to the remote sensing image features of the remote sensing image of the sample region, and an identification label corresponding to the remote sensing image features of the sample region is obtained for training the remote sensing image identification model.
In the embodiment of the invention, the window size is determined from the resolutions of the remote sensing image and the unmanned aerial vehicle image of the sample area, the window is slid across the crop lodging spatial distribution map obtained from the unmanned aerial vehicle image of the sample area with a preset step, the crop lodging degree within the window is calculated after each slide, and these values are mapped onto the remote sensing image of the sample area to obtain the crop lodging degree of the sample area. In this way, the recognition labels corresponding to the remote sensing image features of the sample area can be obtained from the unmanned aerial vehicle image of the sample area, and the remote sensing image recognition model can be trained on those labels and features. The crop lodging degree of a large area to be identified can then be obtained more efficiently with the trained remote sensing image recognition model, and the time cost of identifying the crop lodging degree of a large area is reduced.
Based on the content of each embodiment, acquiring the image feature of the unmanned aerial vehicle in the sample region according to the unmanned aerial vehicle image in the sample region specifically includes: and acquiring the unmanned aerial vehicle image characteristics of the original sample region according to the unmanned aerial vehicle image of the original sample region.
It should be noted that the drone image of the original sample area in the embodiment of the present invention is a multispectral drone image captured by a drone. By image preprocessing of the original multispectral unmanned aerial vehicle image, the unmanned aerial vehicle image of the original sample region can be obtained.
In the embodiment of the invention, the original multispectral unmanned aerial vehicle image is spliced and resampled into the unmanned aerial vehicle image of the original sample area with the resolution of 0.05 m.
It should be noted that the original sample area is a relatively small area. The original sample area may contain crops whose growth state is either non-lodged or lodged, and the crops growing in the original sample area are of the same variety as those growing in the area to be identified.
Based on the unmanned aerial vehicle image features of the original sample regions, the original sample regions are screened with a boxplot, and a plurality of the original sample regions are selected as the sample regions.
Specifically, the distribution of the unmanned aerial vehicle image features of the original sample regions is analyzed with a boxplot to obtain its degree of dispersion, the original sample regions are screened according to that degree of dispersion, and the regions of the original sample area whose unmanned aerial vehicle image features have a high degree of dispersion are removed.
The box plot is a statistical graph that can be used to display the degree of data dispersion.
A boxplot is drawn by finding the upper edge, the lower edge, the median, and the two quartiles of the data, connecting the two quartiles to form the box, and then connecting the upper and lower edges to the box; the median lies inside the box, and data with a higher degree of dispersion lie near its edges. Screening the data with a boxplot can eliminate errors introduced by human factors during data acquisition.
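A minimal sketch of such boxplot-based screening is given below, using the common 1.5 × IQR whisker rule; the whisker factor and the example feature values are illustrative assumptions, since the invention does not fix the exact thresholds here.

```python
import numpy as np

def boxplot_screen(values, whisker=1.5):
    """Return a boolean mask keeping regions whose feature value lies within the
    boxplot whiskers [Q1 - whisker*IQR, Q3 + whisker*IQR]."""
    q1, q3 = np.percentile(values, [25, 75])
    iqr = q3 - q1
    lower, upper = q1 - whisker * iqr, q3 + whisker * iqr
    return (values >= lower) & (values <= upper)

# feature_per_region: one drone image feature value per original sample region.
feature_per_region = np.array([0.31, 0.29, 0.33, 0.95, 0.30, 0.28, 0.02])
kept = boxplot_screen(feature_per_region)   # outlier regions are dropped
sample_regions = np.flatnonzero(kept)
```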
After the raw sample regions are screened by the box plot, the remaining regions in the raw sample regions may be used as sample regions.
And acquiring the unmanned aerial vehicle image characteristics of the sample region according to the unmanned aerial vehicle image characteristics of the original sample region.
Specifically, unmanned aerial vehicle image features corresponding to a plurality of reserved regions in the original sample region are acquired as the unmanned aerial vehicle image features of the sample region.
In the embodiment of the invention, the unmanned aerial vehicle image features of the sample area are obtained after the unmanned aerial vehicle image features of the original sample area are screened with a boxplot, so regions of the original sample area with a high degree of dispersion can be removed. This yields unmanned aerial vehicle image features that describe the crop lodging degree of the sample area more accurately, and therefore a more accurate crop lodging degree of the sample area, on which a remote sensing image recognition model with higher recognition accuracy can be trained. The crop lodging degree of a large area to be identified can then be obtained more efficiently and more accurately with the trained remote sensing image recognition model, and the time cost of identifying the crop lodging degree of a large area is reduced.
Based on the content of above-mentioned each embodiment, unmanned aerial vehicle image characteristic includes: any number of vegetation index features, texture features, and raw band reflectance values.
In particular, the vegetation index is widely used for qualitatively and quantitatively evaluating vegetation coverage and the growth state thereof. In the embodiment of the invention, the lodging condition of the crops in a certain area can be described through the vegetation index of the area.
Specifically, the vegetation index of the drone image of the sample area, the drone image of the modeling area, and the drone image of the original sample area may include: at least one of a normalized vegetation index, a greenness vegetation index, an expanded enhanced vegetation index, a difference vegetation index, and a normalized greenness vegetation index.
The normalized vegetation index is the ratio of the difference between the near-infrared band reflectance and the red band reflectance in the image to their sum, and it can reflect information such as crop growth, ecosystem vitality, and productivity in the image.
The greenness vegetation index is the greenness component obtained by a Kauth-Thomas (K-T, tasseled cap) transformation of the image, and it better reflects differences in crop coverage and growth conditions in the image.
The expanded enhanced vegetation index is an improvement on the normalized vegetation index: through atmospheric correction it alleviates the tendency of the normalized vegetation index to saturate and its lack of a linear relationship with actual vegetation coverage, so it reflects information such as crop growth, ecosystem vitality, and productivity in the image more accurately.
The difference vegetation index is computed from the reflectance values of two bands of the image and can be used to characterize the growth state and coverage of crops in the image.
The normalized greenness vegetation index normalizes the greenness vegetation index, which removes the influence of external conditions on it, so that information such as crop growth, ecosystem vitality, and productivity in the image is reflected better.
And calculating the normalized vegetation index, the greenness vegetation index, the expanded enhanced vegetation index, the difference vegetation index and the normalized greenness vegetation index of the obtained sample area, the modeling area and the original sample area, and respectively forming vegetation index characteristics of the sample area, the vegetation index characteristics of the modeling area and the vegetation index characteristics of the original sample area.
Texture features refer to variations of image gray levels obtained by spatial statistics on the image; they represent global properties of the image and describe the surface properties of ground objects in the image or in a part of it.
The raw band reflectance value refers to the ratio of the reflected flux of a ground object in a certain band of the image to the incident flux of that band; the larger the difference in reflectance between different ground objects in the same spectral band, the easier they are to distinguish.
In the embodiment of the invention, any combination of vegetation index features, texture features, and raw band reflectance values is used as the unmanned aerial vehicle image features, and these features describe the crop lodging in the unmanned aerial vehicle image more accurately. This improves the recognition accuracy of the unmanned aerial vehicle image recognition model trained on the unmanned aerial vehicle image features of the modeling area and yields a more accurate crop lodging degree for the sample area. A remote sensing image recognition model with higher recognition accuracy can then be trained on that more accurate lodging degree, so the crop lodging degree of a large area to be identified can be obtained more efficiently and more accurately, and the time cost of identifying the crop lodging degree of a large area is reduced.
Based on the content of each embodiment, obtaining the texture feature of the original sample region according to the unmanned aerial vehicle image of the original sample region specifically includes: and acquiring the unmanned aerial vehicle principal component image of the original sample region according to the unmanned aerial vehicle image and the transformation matrix of the original sample region.
Wherein the transformation matrix is obtained based on the unmanned aerial vehicle image of the modeling area.
Specifically, by performing Principal Component Analysis (PCA) on the unmanned aerial vehicle image of the original sample region, the unmanned aerial vehicle principal component image of the original sample region can be acquired. The main calculation formula for performing principal component analysis on the unmanned aerial vehicle image of the original sample region is:

V = A·X

where V denotes the unmanned aerial vehicle principal component image of the original sample region, V = {v_i; i = 1, 2, …, m}, i is the band index and m is the number of unmanned aerial vehicle principal component images obtained; A denotes the transformation matrix, A = [a11, a12, …, a1k; a21, a22, …, a2k; …; am1, am2, …, amk]; X denotes the unmanned aerial vehicle image of the original sample region, X = {x_i; i = 1, 2, …, k}, where k denotes the total number of bands in the unmanned aerial vehicle image of the original sample region and m ≤ k; x_i = {x_ij; j = 1, 2, …, n} denotes one band of the unmanned aerial vehicle image of the original sample region, j is the pixel index, and n is the total number of pixels in one band of the unmanned aerial vehicle image of the original sample region.
It should be noted that the transformation matrix A may be obtained by solving the covariance matrix of the unmanned aerial vehicle image of the modeling area.
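The following sketch shows one standard way to obtain the transformation matrix A from the unmanned aerial vehicle image of the modeling area and to apply it to the original sample region, using an eigendecomposition of the band covariance matrix; the number of retained components and the variable names are illustrative assumptions.

```python
import numpy as np

def pca_transform_matrix(modeling_image, m=3):
    """modeling_image: (bands, pixels) array. Returns the (m, bands) matrix A whose
    rows are the leading eigenvectors of the band covariance matrix."""
    cov = np.cov(modeling_image)            # (bands, bands) covariance matrix
    eigvals, eigvecs = np.linalg.eigh(cov)  # eigenvalues in ascending order
    order = np.argsort(eigvals)[::-1][:m]   # keep the m largest components
    return eigvecs[:, order].T

def principal_component_image(sample_image, A):
    """V = A X: project the (bands, pixels) sample image onto the principal components."""
    return A @ sample_image

# Illustrative data: k = 5 bands, n = 10000 pixels.
X_model = np.random.rand(5, 10000)
X_sample = np.random.rand(5, 10000)
A = pca_transform_matrix(X_model, m=3)
V = principal_component_image(X_sample, A)  # shape (3, 10000)
```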
And acquiring a gray level co-occurrence matrix and a pixel shape index according to the unmanned aerial vehicle main component image of the original sample region, and acquiring texture characteristics of the original sample region according to the gray level co-occurrence matrix and the pixel shape index.
Specifically, the Gray-Level Co-occurrence Matrix (GLCM) of the unmanned aerial vehicle principal component image can be obtained by calculation from the unmanned aerial vehicle principal component image of the original sample region. From the gray level co-occurrence matrix of the principal component image, the texture information that can be extracted for the original sample region includes its second moment, contrast, inverse difference moment, autocorrelation, and entropy.
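These statistics could be computed, for example, with scikit-image's grey-level co-occurrence utilities as in the following sketch; graycomatrix/graycoprops are scikit-image functions rather than part of the invention, the number of gray levels is an assumption, and the 'homogeneity' and 'correlation' properties are used here as stand-ins for the inverse difference moment and autocorrelation statistics.

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops  # 'greycomatrix' in older scikit-image

def glcm_texture_features(pc_image, levels=32):
    """pc_image: single principal component image scaled to integers in [0, levels)."""
    img = np.clip(pc_image, 0, levels - 1).astype(np.uint8)
    glcm = graycomatrix(img, distances=[1], angles=[0], levels=levels,
                        symmetric=True, normed=True)
    p = glcm[:, :, 0, 0]
    entropy = -np.sum(p[p > 0] * np.log2(p[p > 0]))
    return {
        "second_moment": float(graycoprops(glcm, "ASM")[0, 0]),
        "contrast": float(graycoprops(glcm, "contrast")[0, 0]),
        "inverse_difference_moment": float(graycoprops(glcm, "homogeneity")[0, 0]),
        "correlation": float(graycoprops(glcm, "correlation")[0, 0]),
        "entropy": float(entropy),
    }
```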
The Pixel Shape Index (PSI) of the unmanned aerial vehicle principal component image can also be obtained by calculation from the unmanned aerial vehicle principal component image of the original sample region. The main expression of the pixel shape index of the unmanned aerial vehicle principal component image is:

PSI(r, s) = (1/D) · Σ_{i=1..D} d_i

where PSI(r, s) denotes the PSI value of pixel (r, s) in the unmanned aerial vehicle principal component image, D denotes the total number of direction lines, and d_i denotes the length of the i-th direction line.
Local texture information of the original sample region can be obtained through the pixel shape index of the principal component image of the unmanned aerial vehicle, and the window effect caused by the gray level co-occurrence matrix can be avoided while the shape structure information of the ground objects in the image is accurately detected.
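A simplified, illustrative implementation of the pixel shape index for a single pixel is sketched below, assuming D equally spaced direction lines that are extended while the gray-level difference from the central pixel stays within a spectral threshold and the line stays shorter than a length threshold; both thresholds and the number of direction lines are assumptions.

```python
import numpy as np

def pixel_shape_index(pc_image, r, s, n_dirs=20, t_spectral=10.0, t_length=50):
    """PSI(r, s) = (1/D) * sum of direction-line lengths around pixel (r, s)."""
    h, w = pc_image.shape
    centre = float(pc_image[r, s])
    total_length = 0
    for d in range(n_dirs):
        angle = 2.0 * np.pi * d / n_dirs
        dr, dc = np.sin(angle), np.cos(angle)
        length = 0
        while length < t_length:
            rr = int(round(r + dr * (length + 1)))
            cc = int(round(s + dc * (length + 1)))
            if not (0 <= rr < h and 0 <= cc < w):
                break
            if abs(float(pc_image[rr, cc]) - centre) > t_spectral:
                break
            length += 1
        total_length += length
    return total_length / n_dirs
```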
Texture information of the principal component image of the unmanned aerial vehicle, which is obtained according to the gray level co-occurrence matrix and the pixel shape index, can form texture characteristics of the sample region.
It should be noted that, once the corresponding transformation matrix A has been determined, the method for obtaining the texture features of the sample area from the unmanned aerial vehicle image of the sample area, the method for obtaining the texture features of the sample area from the remote sensing image of the sample area, and the method for obtaining the texture features of the area to be identified from the remote sensing image of the area to be identified may be the same as the method for obtaining the texture features of the original sample area from the unmanned aerial vehicle image of the original sample area.
In the embodiment of the invention, principal component analysis of the unmanned aerial vehicle image of the original sample area yields its principal component image, from which the gray level co-occurrence matrix and the pixel shape index are computed and the texture features of the original sample area are obtained. Principal component analysis reduces the multi-band unmanned aerial vehicle image to fewer dimensions, which lowers the amount of data in subsequent calculations and improves their efficiency. The texture features obtained from the gray level co-occurrence matrix and the pixel shape index describe the crop lodging in the unmanned aerial vehicle image more accurately, so an unmanned aerial vehicle image recognition model with higher recognition accuracy can be trained and a more accurate crop lodging degree of the sample area can be obtained. A remote sensing image recognition model with higher recognition accuracy can then be trained on that lodging degree, the crop lodging degree of a large area to be identified can be obtained more efficiently and more accurately, and the time cost of identifying the crop lodging degree of a large area is reduced.
Based on the content of each embodiment, before inputting the image features of the unmanned aerial vehicle in the sample area into the unmanned aerial vehicle image recognition model and acquiring the crop lodging spatial distribution map of the sample area, the method further includes: and acquiring the unmanned aerial vehicle image characteristics of the original modeling area according to the unmanned aerial vehicle image of the original modeling area.
It should be noted that the unmanned aerial vehicle image in the original modeling area in the embodiment of the present invention is a multispectral unmanned aerial vehicle image captured by an unmanned aerial vehicle. By image preprocessing of the original multispectral unmanned aerial vehicle image, the unmanned aerial vehicle image of the original modeling area can be obtained.
In the embodiment of the invention, the original multispectral unmanned aerial vehicle image is spliced and resampled into the unmanned aerial vehicle image of the original modeling area with the resolution of 0.05 m.
It should be noted that the original modeling region is a region having a relatively small area. The raw modeling region may be the same as the raw sample region. Crops with growth states of no lodging or lodging can be included in the original modeling area, and the varieties of the crops growing in the original modeling area, the original sample area and the area to be identified are the same.
In the embodiment of the present invention, according to the unmanned aerial vehicle image of the original modeling area, the type of the acquired unmanned aerial vehicle image feature of the original modeling area may be the same as the type of the unmanned aerial vehicle image feature of the original sample area. The method for obtaining the image features of the unmanned aerial vehicle in the original modeling region may also be the same as the method for obtaining the image features of the unmanned aerial vehicle in the original sample region, and details are not repeated in the embodiment of the present invention.
Based on the unmanned aerial vehicle image features of the original modeling area, the original modeling area is screened through a box plot, and a plurality of areas in the original modeling area are screened out to serve as the modeling area.
Specifically, the distribution of the unmanned aerial vehicle image features in the original modeling area is analyzed through the box plot to obtain the dispersion degree of the feature distribution; the original modeling area is then screened according to this dispersion degree, and the areas in the original modeling area whose unmanned aerial vehicle image features have a higher dispersion degree are removed.
After the original modeling area is screened through the box plot, a plurality of areas reserved in the original modeling area can be used as modeling areas.
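A minimal sketch of this box-plot style screening is given below, assuming each candidate area is summarized as one row of features in a pandas DataFrame and using the conventional 1.5 × IQR whisker rule; the exact rule used in the patent is not specified, so this threshold is an assumption.

```python
import pandas as pd

def screen_regions(features: pd.DataFrame, k: float = 1.5) -> pd.DataFrame:
    """Keep only regions whose features fall inside the box-plot whiskers.

    `features` has one row per candidate region and one column per image feature.
    A region is discarded if any of its features lies outside
    [Q1 - k*IQR, Q3 + k*IQR] for that feature (the usual box-plot outlier rule).
    """
    q1 = features.quantile(0.25)
    q3 = features.quantile(0.75)
    iqr = q3 - q1
    lower, upper = q1 - k * iqr, q3 + k * iqr
    inside = features.ge(lower, axis=1) & features.le(upper, axis=1)
    return features[inside.all(axis=1)]
```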
The unmanned aerial vehicle image features of the modeling area are acquired according to the unmanned aerial vehicle image features of the original modeling area.
Specifically, the unmanned aerial vehicle image features corresponding to the plurality of areas retained in the original modeling area are taken as the unmanned aerial vehicle image features of the modeling area.
Based on the Xgboost classification method, training is carried out according to the unmanned aerial vehicle image features of the modeling area and the corresponding unmanned aerial vehicle image recognition labels, and an unmanned aerial vehicle image recognition model is obtained.
Specifically, training may be performed according to the unmanned aerial vehicle image features of the modeling area and the corresponding unmanned aerial vehicle image recognition tags based on an Xgboost classification method.
The generalized objective function can be expressed as:
Obj(Θ) = L(Θ) + Ω(Θ)
In the formula, the first term on the right side of the equation is the loss term, which measures how well the model fits the unmanned aerial vehicle image features of the modeling area and here adopts the logistic loss function; the second term is the regularization term, here L2 regularization, which describes the complexity of the model.
More specifically, the objective function at the t-th iteration can be expressed as:
Obj^(t) = Σ_{j=1..n} l(y_j, ŷ_j^(t-1) + f_t(c_j)) + Ω(f_t)
wherein Y represents the category label set, Y = {y_j; j = 1, 2, …, n}; f represents a regression tree, and t represents the t-th addition of a new function f; DS represents the unmanned aerial vehicle image features of the modeling area, DS = {c_j; j = 1, 2, …, n}, c_j = {c_ji; i = 1, 2, …, k}; and ŷ_j^(t-1) is the prediction of the first t−1 trees for sample j. The prediction of the full model is
ŷ_j = Σ_{b=1..B} f_b(c_j)
where B represents the total number of regression trees.
Applying a Taylor expansion of the loss, keeping its first three terms and deleting the higher-order infinitesimal terms, the objective function can be expressed as:
Obj^(t) ≈ Σ_{j=1..n} [ l(y_j, ŷ_j^(t-1)) + g_j·f_t(c_j) + ½·h_j·f_t(c_j)² ] + Ω(f_t)
In the formula, the first-order term g_j can be expressed as
g_j = ∂l(y_j, ŷ_j^(t-1)) / ∂ŷ_j^(t-1)
the second-order term h_j can be expressed as
h_j = ∂²l(y_j, ŷ_j^(t-1)) / ∂(ŷ_j^(t-1))²
and the third term, the regularization Ω(f_t), can be expressed as
Ω(f_t) = γ·T + ½·λ·Σ_{e=1..T} w_e²
wherein e is the regression tree leaf node index, T is the total number of leaf nodes in the regression tree, and w_e is the weight of the e-th leaf node. The instance set of the e-th leaf node can be defined as I_e = {j | q(c_j) = e}, where q maps a sample to its leaf. Grouping the objective function by leaf node and deleting the constant terms gives:
Obj^(t) = Σ_{e=1..T} [ G_e·w_e + ½·(H_e + λ)·w_e² ] + γ·T
in the formula,
G_e = Σ_{j∈I_e} g_j,  H_e = Σ_{j∈I_e} h_j
Assuming the structure of the regression tree q(c_j) is fixed, setting the derivative of the objective function with respect to w_e to zero,
∂Obj^(t)/∂w_e = G_e + (H_e + λ)·w_e = 0
the optimal weight of each leaf node and the corresponding objective function value can be obtained:
w_e* = −G_e / (H_e + λ)
Obj* = −½·Σ_{e=1..T} G_e² / (H_e + λ) + γ·T
In the formula, Obj* represents the structure score (loss) of the regression tree and w_e* is the optimal leaf weight. The smaller the Obj* score, the better the structure of the regression tree.
It should be noted that the training method for obtaining the remote sensing image recognition model by training according to the remote sensing image characteristics of the sample area and the crop lodging degree of the sample area may be the same as the training method for obtaining the unmanned aerial vehicle image recognition model, and is not described in detail in the embodiment of the present invention.
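As a hedged illustration of this training step, the sketch below trains a binary lodged / not-lodged classifier with the open-source xgboost package. The synthetic data, hyperparameters and train/test split are placeholders rather than the patent's settings; the same routine could in principle be reused for the remote sensing image recognition model as noted above.

```python
import numpy as np
import xgboost as xgb
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Placeholder feature matrix (n_samples, k features) and lodged / not-lodged labels;
# in practice these would come from the modeling-area drone image features and labels.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 12))
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

model = xgb.XGBClassifier(
    objective="binary:logistic",  # logistic loss, matching the loss term above
    reg_lambda=1.0,               # L2 regularization on the leaf weights
    n_estimators=200,             # number of regression trees B (illustrative)
    max_depth=6,
    learning_rate=0.1,
)
model.fit(X_train, y_train)
print("accuracy:", accuracy_score(y_test, model.predict(X_test)))
```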
In the embodiment of the invention, based on the acquired unmanned aerial vehicle image features of the original modeling area, the original modeling area is screened through a box plot to acquire the unmanned aerial vehicle image features of the modeling area, and the unmanned aerial vehicle image recognition model is trained from the unmanned aerial vehicle image features of the modeling area and the corresponding unmanned aerial vehicle image recognition labels based on the Xgboost classification method. Areas with a larger dispersion degree in the original modeling area can thus be eliminated according to their unmanned aerial vehicle image features, so the unmanned aerial vehicle image features of the modeling area describe the crop lodging degree of the modeling area more accurately and an unmanned aerial vehicle image recognition model with higher recognition precision can be trained. Based on this model, a more accurate crop lodging degree of the sample area can be obtained, a remote sensing image recognition model with higher recognition precision can be trained, the crop lodging degree of a large area to be identified can be obtained more efficiently and more accurately with the trained remote sensing image recognition model, and the time cost of identifying the crop lodging degree of a large area can be reduced.
Fig. 2 is a schematic structural view of a crop lodging degree recognition device provided by the invention. The crop lodging degree identification device provided by the invention is described below with reference to fig. 2, and the crop lodging degree identification device described below and the crop lodging degree identification method described above can be referred to correspondingly. As shown in fig. 2, the apparatus includes: an image acquisition module 201 and an image recognition module 202, wherein:
the image obtaining module 201 is configured to obtain a remote sensing image feature of the to-be-identified region according to the remote sensing image of the to-be-identified region.
And the image identification module 202 is used for inputting the remote sensing image characteristics of the area to be identified into the remote sensing image identification model and acquiring the lodging degree of crops in the area to be identified.
The remote sensing image recognition model is obtained by training based on the remote sensing image characteristics of the sample area and the lodging degree of crops in the sample area; the crop lodging degree of the sample area is obtained in advance according to the unmanned aerial vehicle image of the sample area.
Specifically, the image acquisition module 201 and the image recognition module 202 are electrically connected.
The image obtaining module 201 obtains the multispectral high-resolution remote sensing image of the area to be identified through the GaoFen-1B satellite (GF1B), and then performs image preprocessing on the multispectral remote sensing image of the area to be identified to obtain the remote sensing image of the area to be identified.
It should be noted that the area to be identified is an area with a relatively large extent, and it may contain crops whose growth state is either lodged or not lodged.
It should also be noted that the crop in the embodiment of the present invention may be corn, or may be rice, wheat or another crop.
Different parameters can be selected as the remote sensing image characteristics of the area to be identified according to actual conditions. The specific type of the remote sensing image feature of the region to be identified is not particularly limited in the embodiment of the present invention.
The method for obtaining the remote sensing image features of the area to be identified according to the remote sensing image of the area to be identified can be determined according to the specific type of those features, and is not particularly limited in the embodiment of the present invention. For example, if the remote sensing image features of the area to be identified include vegetation indices, the normalized vegetation index, the green-band vegetation index, the enhanced vegetation index, the difference vegetation index and the normalized greenness vegetation index of the area to be identified can be calculated from the remote sensing image of the area to be identified, and the calculation results can be used as the remote sensing image features of the area to be identified.
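As an illustration, the sketch below computes such vegetation indices from per-band reflectance arrays using their commonly published formulas (e.g. NDVI = (NIR − Red)/(NIR + Red)); the band names, the EVI coefficients and the choice of the normalized green–red difference index for the "normalized greenness vegetation index" are assumptions and may differ from the patent's exact definitions.

```python
import numpy as np

def vegetation_indices(blue, green, red, nir, eps=1e-6):
    """Common vegetation indices from per-band reflectance arrays of equal shape."""
    ndvi = (nir - red) / (nir + red + eps)        # normalized difference vegetation index
    gndvi = (nir - green) / (nir + green + eps)   # green-band vegetation index (green NDVI)
    evi = 2.5 * (nir - red) / (nir + 6.0 * red - 7.5 * blue + 1.0 + eps)  # enhanced vegetation index
    dvi = nir - red                               # difference vegetation index
    ngrdi = (green - red) / (green + red + eps)   # normalized green-red difference (greenness) index
    return {"NDVI": ndvi, "GNDVI": gndvi, "EVI": evi, "DVI": dvi, "NGRDI": ngrdi}
```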
The image recognition module 202 acquires the crop lodging degree of the area to be identified by inputting the remote sensing image features of the area to be identified into the trained remote sensing image recognition model.
When the area to be identified is large, its remote sensing image can be acquired quickly and conveniently through remote sensing technology. If the image of such a large area were instead acquired by unmanned aerial vehicle photography, a large number of unmanned aerial vehicle images would have to be captured, which requires more time and equipment cost than acquiring the remote sensing image of the area to be identified through remote sensing technology.
On the other hand, compared with remote sensing images, unmanned aerial vehicle images have the advantages of high definition, high spatial and temporal resolution, large scale, small coverage area, high acquisition frequency and strong operability during shooting. For an area with a smaller extent, the crop lodging degree of that area can therefore be obtained more accurately from the unmanned aerial vehicle image.
According to the embodiment of the invention, after the unmanned aerial vehicle captures the unmanned aerial vehicle image of the relatively small sample area, the crop lodging degree of the sample area is obtained more accurately from that image, and the remote sensing image recognition model is trained based on the crop lodging degree of the sample area and the remote sensing image features of the sample area. Based on the trained remote sensing image recognition model, a large number of remote sensing images acquired through remote sensing technology can be recognized, so that the crop lodging degree of a large area to be identified can be acquired more efficiently.
It should be noted that, in the embodiment of the present invention, the drone image of the sample area is a multispectral drone image. By image preprocessing of the multispectral unmanned aerial vehicle image of the sample region, the unmanned aerial vehicle image of the sample region can be obtained.
It should be noted that the sample area is a relatively small area, and the sample area may include a crop whose growth state is not lodging or lodging, and the crop growing in the sample area is of the same variety as the crop growing in the area to be identified.
The crop lodging degree of the sample area can be acquired in advance from the unmanned aerial vehicle image of the sample area based on a deep learning algorithm, or it can be acquired in advance by visual interpretation of the unmanned aerial vehicle image of the sample area.
It should be noted that the degree of crop lodging in the sample region can be described by the ratio of the area of crop lodging in the sample region to the area of the sample region.
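As a hedged illustration of this ratio, the sketch below computes a lodging degree inside sliding windows of a binary lodging map; the window and step sizes are illustrative and would in practice be derived from the remote sensing and unmanned aerial vehicle image resolutions, as described for the sample area above.

```python
import numpy as np

def lodging_degree(lodging_mask: np.ndarray, window: int, step: int) -> np.ndarray:
    """Ratio of lodged pixels inside each sliding window of a binary lodging map.

    lodging_mask: 2-D array, 1 = lodged, 0 = not lodged (e.g. from the drone
    crop-lodging spatial distribution map); window and step are in pixels.
    """
    rows, cols = lodging_mask.shape
    degrees = []
    for r in range(0, rows - window + 1, step):
        for c in range(0, cols - window + 1, step):
            patch = lodging_mask[r:r + window, c:c + window]
            degrees.append(patch.mean())  # lodged area / window area
    return np.array(degrees)

# Example (illustrative): an 8 m remote sensing pixel over a 0.05 m drone map
# would correspond to a 160-pixel window, e.g.
# degrees = lodging_degree(mask, window=160, step=160)
```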
Based on the correspondence between the remote sensing image and the unmanned aerial vehicle image of the same area, after the crop lodging degree of the sample area is obtained, it can be associated with the remote sensing image features of the sample area to obtain the recognition labels corresponding to those features.
Based on a deep learning algorithm, the remote sensing image recognition model can then be obtained by training according to the remote sensing image features of the sample area and the recognition labels corresponding to the remote sensing image features of the sample area.
It should be noted that the remote sensing image recognition model can be obtained by training according to the remote sensing image characteristics of the sample region and the recognition labels corresponding to the remote sensing image characteristics of the sample region based on the Xgboost classification method. The remote sensing image identification model can be obtained by training according to the remote sensing image characteristics of the sample area and the identification labels corresponding to the remote sensing image characteristics of the sample area based on other deep learning algorithms.
The remote sensing image recognition model describes the relation between the remote sensing image characteristics and the crop lodging degree, the remote sensing image characteristics of the area to be recognized are input into the trained remote sensing image recognition model, and the crop lodging degree of the area to be recognized can be obtained.
According to the embodiment of the invention, the unmanned aerial vehicle image of the sample area is obtained by the unmanned aerial vehicle, the crop lodging degree of the sample area is obtained from that image, and the remote sensing image recognition model is trained based on the crop lodging degree of the sample area and the remote sensing image features of the sample area. The crop lodging degree of the area to be identified is then obtained by inputting the remote sensing image features of the area to be identified into the trained remote sensing image recognition model. Remote sensing images of a large area can be acquired quickly and conveniently through remote sensing technology and recognized with the trained remote sensing image recognition model, so the crop lodging degree of a large area can be identified more efficiently and the time cost of identifying it can be reduced.
Fig. 3 illustrates a physical structure diagram of an electronic device, which may include, as shown in fig. 3: a processor (processor)310, a communication Interface (communication Interface)320, a memory (memory)330 and a communication bus 340, wherein the processor 310, the communication Interface 320 and the memory 330 communicate with each other via the communication bus 340. The processor 310 may invoke logic instructions in the memory 330 to perform a crop lodging level identification method, the method comprising: acquiring the remote sensing image characteristics of the area to be identified according to the remote sensing image of the area to be identified; inputting the remote sensing image characteristics of the area to be identified into a remote sensing image identification model, and acquiring the lodging degree of crops in the area to be identified; the remote sensing image recognition model is obtained by training based on the remote sensing image characteristics of the sample area and the lodging degree of crops in the sample area; the crop lodging degree of the sample area is obtained in advance according to the unmanned aerial vehicle image of the sample area.
In addition, the logic instructions in the memory 330 may be implemented in the form of software functional units and stored in a computer readable storage medium when the software functional units are sold or used as independent products. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
In another aspect, the present invention also provides a computer program product comprising a computer program stored on a non-transitory computer readable storage medium, the computer program comprising program instructions, which when executed by a computer, enable the computer to perform the method for identifying a degree of lodging of a crop provided by the above methods, the method comprising: acquiring the remote sensing image characteristics of the area to be identified according to the remote sensing image of the area to be identified; inputting the remote sensing image characteristics of the area to be identified into a remote sensing image identification model, and acquiring the lodging degree of crops in the area to be identified; the remote sensing image recognition model is obtained by training based on the remote sensing image characteristics of the sample area and the lodging degree of crops in the sample area; the crop lodging degree of the sample area is obtained in advance according to the unmanned aerial vehicle image of the sample area.
In yet another aspect, the present invention also provides a non-transitory computer readable storage medium having stored thereon a computer program, which when executed by a processor, is implemented to perform the above-mentioned crop lodging degree identification method, the method comprising: acquiring the remote sensing image characteristics of the area to be identified according to the remote sensing image of the area to be identified; inputting the remote sensing image characteristics of the area to be identified into a remote sensing image identification model, and acquiring the lodging degree of crops in the area to be identified; the remote sensing image recognition model is obtained by training based on the remote sensing image characteristics of the sample area and the lodging degree of crops in the sample area; the crop lodging degree of the sample area is obtained in advance according to the unmanned aerial vehicle image of the sample area.
The above-described embodiments of the apparatus are merely illustrative, and the units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the present embodiment. One of ordinary skill in the art can understand and implement it without inventive effort.
Through the above description of the embodiments, those skilled in the art will clearly understand that each embodiment can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware. With this understanding in mind, the above-described technical solutions may be embodied in the form of a software product, which can be stored in a computer-readable storage medium such as ROM/RAM, magnetic disk, optical disk, etc., and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to execute the methods described in the embodiments or some parts of the embodiments.
Finally, it should be noted that: the above examples are only intended to illustrate the technical solution of the present invention, but not to limit it; although the present invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; and such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions of the embodiments of the present invention.

Claims (10)

1. A crop lodging degree identification method is characterized by comprising the following steps:
acquiring the remote sensing image characteristics of the area to be identified according to the remote sensing image of the area to be identified;
inputting the remote sensing image characteristics of the area to be identified into a remote sensing image identification model to obtain the lodging degree of crops in the area to be identified;
the remote sensing image recognition model is obtained by training based on the remote sensing image characteristics of the sample area and the crop lodging degree of the sample area; the crop lodging degree of the sample area is obtained in advance according to the unmanned aerial vehicle image of the sample area.
2. The method for identifying the lodging degree of crops according to claim 1, wherein before inputting the remote-sensing image characteristics of the area to be identified into the remote-sensing image identification model and obtaining the lodging degree of crops in the area to be identified, the method further comprises:
acquiring unmanned aerial vehicle image characteristics of the sample region according to the unmanned aerial vehicle image of the sample region;
inputting the unmanned aerial vehicle image characteristics of the sample area into an unmanned aerial vehicle image identification model, and acquiring a crop lodging spatial distribution map of the sample area;
acquiring the lodging degree of the crops in the sample area according to the lodging space distribution map of the crops;
the unmanned aerial vehicle image recognition model is obtained by training unmanned aerial vehicle image features and corresponding unmanned aerial vehicle image recognition labels in a modeling area; the unmanned aerial vehicle image identification tag is predetermined according to the unmanned aerial vehicle image in the modeling area and corresponds to the unmanned aerial vehicle image in the modeling area one to one.
3. The method for identifying the lodging degree of crops as claimed in claim 2, wherein the step of obtaining the lodging degree of crops in the sample area according to the spatial distribution map of lodging of crops comprises:
acquiring the size of a window according to the remote sensing image of the sample region and the resolution of the unmanned aerial vehicle image of the sample region;
and sliding the window with the size in the crop lodging space distribution map according to a preset step length, and acquiring the crop lodging degree of the sample area according to the lodging degree of crops in the window after each sliding.
4. The method for identifying the lodging degree of crops as claimed in claim 2, wherein the step of obtaining the image characteristics of the drone in the sample area according to the drone image in the sample area specifically comprises:
acquiring unmanned aerial vehicle image characteristics of the original sample region according to the unmanned aerial vehicle image of the original sample region;
screening the original sample regions through a box plot based on unmanned aerial vehicle image characteristics of the original sample regions, and screening out a plurality of regions in the original sample regions as the sample regions;
and acquiring the unmanned aerial vehicle image characteristics of the sample region according to the unmanned aerial vehicle image characteristics of the original sample region.
5. The method for identifying the lodging degree of crops as claimed in claim 4, wherein the unmanned aerial vehicle image features comprise:
any number of vegetation index features, texture features, and raw band reflectance values.
6. The method for identifying the lodging degree of a crop as claimed in claim 5, wherein the step of obtaining the textural features of the original sample region from the drone image of the original sample region comprises:
acquiring an unmanned aerial vehicle principal component image of the original sample region according to the unmanned aerial vehicle image and the transformation matrix of the original sample region;
acquiring a gray level co-occurrence matrix and a pixel shape index according to the unmanned aerial vehicle main component image of the original sample region, and acquiring texture characteristics of the original sample region according to the gray level co-occurrence matrix and the pixel shape index;
wherein the transformation matrix is obtained based on drone imagery of the modeled region.
7. The method for identifying the lodging degree of crops as claimed in claim 2, wherein before inputting the image characteristics of the drone in the sample area into the drone image identification model and obtaining the spatial distribution map of lodging of crops in the sample area, the method further comprises:
acquiring unmanned aerial vehicle image characteristics of an original modeling area according to the unmanned aerial vehicle image of the original modeling area;
screening the original modeling area through a box plot based on the unmanned aerial vehicle image characteristics of the original modeling area, and screening out a plurality of areas in the original modeling area as the modeling area;
acquiring the unmanned aerial vehicle image characteristics of the modeling area according to the unmanned aerial vehicle image characteristics of the original modeling area;
training according to the unmanned aerial vehicle image features of the modeling area and the corresponding unmanned aerial vehicle image recognition labels based on an Xgboost classification method to obtain the unmanned aerial vehicle image recognition model.
8. A crop lodging degree recognition device, comprising:
the image acquisition module is used for acquiring the remote sensing image characteristics of the area to be identified according to the remote sensing image of the area to be identified;
the image identification module is used for inputting the remote sensing image characteristics of the area to be identified into a remote sensing image identification model and acquiring the lodging degree of crops in the area to be identified;
the remote sensing image recognition model is obtained by training based on the remote sensing image characteristics of the sample area and the crop lodging degree of the sample area; the crop lodging degree of the sample area is obtained in advance according to the unmanned aerial vehicle image of the sample area.
9. An electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the processor when executing the program implements the steps of the method for identifying a degree of lodging in a crop as claimed in any one of claims 1 to 7.
10. A non-transitory computer-readable storage medium, on which a computer program is stored, wherein the computer program, when executed by a processor, implements the steps of the method for identifying the degree of lodging of a crop as claimed in any one of claims 1 to 7.
CN202011484271.0A 2020-12-15 2020-12-15 Crop lodging degree identification method and device Active CN112597855B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011484271.0A CN112597855B (en) 2020-12-15 2020-12-15 Crop lodging degree identification method and device

Publications (2)

Publication Number Publication Date
CN112597855A true CN112597855A (en) 2021-04-02
CN112597855B CN112597855B (en) 2024-04-16

Family

ID=75196369

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011484271.0A Active CN112597855B (en) 2020-12-15 2020-12-15 Crop lodging degree identification method and device

Country Status (1)

Country Link
CN (1) CN112597855B (en)


Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108764263A (en) * 2018-02-12 2018-11-06 北京佳格天地科技有限公司 The atural object annotation equipment and method of remote sensing image
CN109813286A (en) * 2019-01-28 2019-05-28 中科光启空间信息技术有限公司 A kind of lodging disaster remote sensing damage identification method based on unmanned plane
CN111091052A (en) * 2019-11-07 2020-05-01 中国农业大学 Corn lodging area extraction system and method based on maximum likelihood method
CN110991714A (en) * 2019-11-21 2020-04-10 北京农业信息技术研究中心 Crop lodging disaster monitoring method and system
CN110889394A (en) * 2019-12-11 2020-03-17 安徽大学 Rice lodging recognition method based on deep learning UNet network
CN111242224A (en) * 2020-01-16 2020-06-05 贵州省草业研究所 Multi-source remote sensing data classification method based on unmanned aerial vehicle extraction classification sample points
CN111461052A (en) * 2020-04-13 2020-07-28 安徽大学 Migration learning-based method for identifying lodging regions of wheat in multiple growth periods
CN111860150A (en) * 2020-06-11 2020-10-30 中科禾信遥感科技(苏州)有限公司 Lodging rice identification method and device based on remote sensing image

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
ZHANG Fei et al., "Estimation of the planting area of Chinese medicinal material resources in Luoning County based on UAV low-altitude remote sensing and satellite remote sensing", China Journal of Chinese Materia Medica, pages 4 *
FAN Dongdong, "Research on remote sensing detection methods for winter wheat diseases, pests and lodging based on cost-sensitive learning", China Master's Theses Full-text Database, Agricultural Science and Technology, pages 2 *

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113516176A (en) * 2021-06-21 2021-10-19 中国农业大学 Wheat lodging region identification method based on spectral texture characteristics and K nearest neighbor method
CN113516177A (en) * 2021-06-21 2021-10-19 中国农业大学 Wheat lodging region identification method based on spectral texture features and support vector machine
CN117789067A (en) * 2024-02-27 2024-03-29 山东字节信息科技有限公司 Unmanned aerial vehicle crop monitoring method and system based on machine learning
CN117789067B (en) * 2024-02-27 2024-05-10 山东字节信息科技有限公司 Unmanned aerial vehicle crop monitoring method and system based on machine learning

Also Published As

Publication number Publication date
CN112597855B (en) 2024-04-16

Similar Documents

Publication Publication Date Title
Osco et al. A convolutional neural network approach for counting and geolocating citrus-trees in UAV multispectral imagery
CN112597855B (en) Crop lodging degree identification method and device
Wendel et al. Maturity estimation of mangoes using hyperspectral imaging from a ground based mobile platform
Pang et al. Improved crop row detection with deep neural network for early-season maize stand count in UAV imagery
Aich et al. Deepwheat: Estimating phenotypic traits from crop images with deep learning
Andujar et al. Using depth cameras to extract structural parameters to assess the growth state and yield of cauliflower crops
CN113392775B (en) Sugarcane seedling automatic identification and counting method based on deep neural network
Csillik et al. Cropland mapping from Sentinel-2 time series data using object-based image analysis
CN107832797B (en) Multispectral image classification method based on depth fusion residual error network
CN111161362A (en) Tea tree growth state spectral image identification method
CN114581768A (en) Crop lodging unmanned aerial vehicle monitoring method and device
Olenskyj et al. End-to-end deep learning for directly estimating grape yield from ground-based imagery
Soares et al. Plantation Rows Identification by Means of Image Tiling and Hough Transform.
CN113223040A (en) Remote sensing-based banana yield estimation method and device, electronic equipment and storage medium
Jónsson RGB and Multispectral UAV image classification of agricultural fields using a machine learning algorithm
CN112464762A (en) Agricultural product screening system and method based on image processing
Witharana et al. Benchmarking of data fusion algorithms in support of earth observation based Antarctic wildlife monitoring
CN114419367A (en) High-precision crop drawing method and system
AHM et al. A deep convolutional neural network based image processing framework for monitoring the growth of soybean crops
Jiang et al. MIoP-NMS: Perfecting crops target detection and counting in dense occlusion from high-resolution UAV imagery
Zhao et al. Evaluation of spatial resolution on crop disease detection based on multiscale images and category variance ratio
CN113887619A (en) Knowledge-guided remote sensing image fusion method
Christovam et al. Evaluation of sar to optical image translation using conditional generative adversarial network for cloud removal in a crop dataset
CN114937038B (en) Usability-oriented remote sensing image quality evaluation method
Muhsin et al. Detecting and monitoring the vegetal cover of Karbala Province (Iraq) using change detection methods

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant