CN113221790A - Method and device for generating field crop rotation mode based on radar data - Google Patents

Method and device for generating field crop rotation mode based on radar data

Info

Publication number
CN113221790A
Authority
CN
China
Prior art keywords
crop
field
image
rasterized
radar data
Prior art date
Legal status
Pending
Application number
CN202110558351.4A
Other languages
Chinese (zh)
Inventor
姜浩
郑琼
李丹
王力
Current Assignee
Guangzhou Institute of Geography of GDAS
Southern Marine Science and Engineering Guangdong Laboratory Guangzhou
Original Assignee
Guangzhou Institute of Geography of GDAS
Southern Marine Science and Engineering Guangdong Laboratory Guangzhou
Priority date
Filing date
Publication date
Application filed by Guangzhou Institute of Geography of GDAS and Southern Marine Science and Engineering Guangdong Laboratory Guangzhou
Priority to CN202110558351.4A
Publication of CN113221790A
Legal status: Pending

Classifications

    • G06V 20/188: Image or video recognition or understanding; Scenes; Terrestrial scenes; Vegetation
    • G06F 18/214: Pattern recognition; Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • G06F 18/22: Pattern recognition; Matching criteria, e.g. proximity measures
    • G06F 18/24: Pattern recognition; Classification techniques
    • G06T 5/30: Image enhancement or restoration by the use of local operators; Erosion or dilatation, e.g. thinning
    • G06T 2207/10032: Image acquisition modality; Satellite or aerial image; Remote sensing
    • G06T 2207/20036: Special algorithmic details; Morphological image processing
    • G06T 2207/20081: Special algorithmic details; Training; Learning
    • G06T 2207/30188: Subject of image; Earth observation; Vegetation; Agriculture

Abstract

The invention relates to a method and device for generating field crop rotation patterns based on radar data. Radar data from a radar remote sensing satellite are acquired and preprocessed; an optical remote sensing satellite image is acquired, field images are identified from it, and the identified field images are rasterized. Each rasterized field image is matched with the preprocessed radar data, and the average of the radar data within the field is calculated to obtain an average crop curve for that field. The crop type and crop period are identified from the average crop curve, a crop rotation sequence is determined from the crop type and crop period, and the rotation sequences are merged to generate the field crop rotation pattern. The method overcomes the limitations of optical observation in cloudy weather, improves the accuracy of radar-based observation, and, by analysing at the level of whole fields, improves robustness.

Description

Method and device for generating field crop rotation mode based on radar data
Technical Field
The invention relates to the technical field of plant information identification, in particular to a field crop rotation mode generation method and device based on radar data.
Background
The "crop rotation mode" refers to a planting mode of alternately planting different crops in the same field sequentially in seasons or years. Has important effects on maintaining soil fertility, improving yield and reducing plant diseases and insect pests. Optical data are generally adopted for the crop rotation mode monitoring, but in regions with cloudy weather, optical observation cannot be obtained, and only radar data can be used.
However, radar data is noisy, and the topography, humidity, field facilities, etc. in the field can cause changes in the backscatter coefficients, making the observation highly uncertain. The commonly used noise removing technology, such as a filter like Lee, uses a fixed window to observe, which may generate a mixed pixel effect, so that the observation accuracy is not high.
Disclosure of Invention
Accordingly, an object of the present invention is to provide a method and an apparatus for generating a field crop rotation pattern based on radar data, which have the advantages of high observation accuracy and good robustness.
In order to achieve the above object, a first aspect of the present invention provides a method for generating a field rotation pattern based on radar data, including:
acquiring radar data of a radar remote sensing satellite, and preprocessing the radar data;
acquiring an optical remote sensing satellite image, identifying and acquiring a field image from the optical remote sensing satellite image, and rasterizing the identified field image;
matching the rasterized field image with the preprocessed radar data, and calculating an average value of the radar data corresponding to the rasterized field image to obtain an average crop curve corresponding to the rasterized field image;
identifying a crop type and a crop period according to the average crop curve;
and determining a crop rotation sequence according to the crop type and the crop period, merging the crop rotation sequence, and generating a field rotation mode.
By acquiring radar data of a radar remote sensing satellite and preprocessing the radar data, acquiring an optical remote sensing satellite image, identifying field images from the optical remote sensing satellite image and rasterizing them, matching each rasterized field image with the preprocessed radar data, and calculating the average of the radar data corresponding to the rasterized field image, an average crop curve is obtained for each field. The crop type and crop period are identified from the average crop curve, a crop rotation sequence is determined from the crop type and crop period, and the rotation sequences are merged to generate the field rotation pattern.
Further, the radar data of the radar remote sensing satellite are acquired by the Sentinel-1A radar remote sensing satellite. Acquiring and preprocessing the radar data comprises: acquiring the radar data obtained by the Sentinel-1A radar remote sensing satellite, and performing radiometric calibration, terrain correction and projection conversion on the radar data to obtain preprocessed radar data. The preprocessing converts the radar data acquired by the Sentinel-1A satellite into backscattering coefficients.
Further, acquiring an optical remote sensing satellite image and identifying a field image from it comprises: acquiring an optical remote sensing satellite image, and inputting it into a trained field recognition model to obtain field images. The field recognition model is obtained by delineating field boundaries on optical remote sensing satellite sample images to obtain field sample data, and inputting the field sample data into an FCIS deep learning network for training and learning. Using the FCIS deep learning network to recognize fields automatically improves recognition efficiency.
Further, the step of matching the rasterized field image with the preprocessed radar data and calculating the average of the corresponding radar data to obtain the average crop curve comprises: matching the rasterized field image with the preprocessed radar data by longitude and latitude to obtain the preprocessed radar data corresponding to the rasterized field image; if the number of pixels of the rasterized field image is larger than a preset threshold, performing morphological erosion on the rasterized field image by one pixel unit and averaging the preprocessed radar data over all remaining pixels, the average serving as the backscattering coefficient of the field; if the number of pixels is smaller than the preset threshold, averaging the preprocessed radar data over all pixels of the field without erosion, the average serving as the backscattering coefficient of the field; and plotting the backscattering coefficients as a time series to obtain the average crop curve of the rasterized field image. Drawing the average crop curve with the field as the unit improves robustness.
Further, the step of identifying a crop type and a crop period from the average crop curve comprises: inputting the average crop curve into a trained crop type recognition model to recognize the crop type; and identifying the crop period according to the time interval over which the backscattering coefficient in the average crop curve goes from a minimum to a maximum and back to a minimum. The crop type recognition model is obtained as follows: the crop type is interpreted from the average crop curves of the field sample images to obtain crop type sample training data, the interpretation being based on the fluctuation amplitude, width and phenological time points of each curve; the crop type sample training data are then input into an xgboost classifier for training and learning to obtain the crop type recognition model. Using the xgboost classifier to identify the crop type of each field automatically, in combination with the period identification, improves recognition efficiency.
Further, the step of determining a crop rotation sequence according to the crop type and the crop period, merging the crop rotation sequences, and generating a field rotation pattern includes: ordering the crop types and crop periods of the field in each year by time, and determining the crop rotation sequence of the field; and removing unreasonable crop rotation sequences, and merging crop rotation sequences that contain the same crop types in the same adjacency order but start with different crops, to obtain the crop rotation patterns.
A second aspect of the present invention provides a field rotation pattern generation apparatus based on radar data, including:
the preprocessing unit is used for acquiring radar data of a radar remote sensing satellite and preprocessing the radar data;
the first identification unit is used for acquiring an optical remote sensing satellite image, identifying a field image from the optical remote sensing satellite image, and rasterizing the identified field image;
the calculation unit is used for matching the rasterized field image with the preprocessed radar data, calculating an average value of the radar data corresponding to the rasterized field image, and obtaining an average crop curve corresponding to the rasterized field image;
the second identification unit is used for identifying the crop type and the crop period according to the average crop curve;
and the generating unit is used for determining a crop rotation sequence according to the crop type and the crop period, merging the crop rotation sequence and generating a field rotation mode.
Compared with the prior art, the invention has the following beneficial effects:
the invention provides a field crop rotation mode generation method and device based on radar data, which are characterized in that radar data of a radar remote sensing satellite are obtained, an optical remote sensing satellite image is obtained through preprocessing the radar data, a field image is obtained through identification from the optical remote sensing satellite image, the identified field image is rasterized, the rasterized field image is matched with the preprocessed radar data, the radar data average value corresponding to the rasterized field image is calculated, the average crop curve corresponding to the rasterized field image is obtained, the crop type and the crop period are identified according to the average crop curve, a rotation sequence is determined according to the crop type and the crop period, the rotation sequence is merged, the field crop rotation mode is generated, inconvenience of optical observation under cloudy weather is overcome, the accuracy of radar data observation is improved, the field is used as a unit for analysis, and the robustness is improved.
Drawings
FIG. 1 is a schematic flow chart of a method for generating a field crop rotation pattern based on radar data according to the present invention;
FIG. 2 is a schematic flow chart of S21 in the method for generating a field crop rotation pattern based on radar data according to the present invention;
FIG. 3 is a schematic flow chart of S30 in the method for generating a field crop rotation pattern based on radar data according to the present invention;
FIG. 4 is a schematic flow chart of S40 in the method for generating a field crop rotation pattern based on radar data according to the present invention;
FIG. 5 is a schematic flow chart of S41 in the method for generating a field crop rotation pattern based on radar data according to the present invention;
FIG. 6 is a schematic flow chart of S50 in the method for generating a field crop rotation pattern based on radar data according to the present invention;
FIG. 7 is a block diagram of a field rotation pattern generating apparatus according to the present invention;
FIG. 8 is a block diagram of a field image unit 621 of a field rotation pattern generating apparatus based on radar data according to the present invention;
fig. 9 is a block diagram showing the structure of the calculation unit 63 of the field rotation pattern generation device based on radar data according to the present invention;
fig. 10 is a block diagram of the second recognition unit 64 of the field rotation pattern generation apparatus based on radar data according to the present invention;
fig. 11 is a block diagram of a crop type unit 641 for generating a field rotation pattern based on radar data according to the present invention.
Detailed Description
For a better understanding and practice, the invention is described in detail below with reference to the accompanying drawings.
Referring to fig. 1, an embodiment of the present invention provides a method for generating a field rotation pattern based on radar data, including the following steps:
s10, radar data of the radar remote sensing satellite are obtained, and preprocessing is carried out on the radar data.
The radar remote sensing satellite is an earth observation satellite carrying a synthetic aperture radar (SAR), which can observe ground objects day and night and in all weather conditions. In the embodiment of the application, radar data of the radar remote sensing satellite are acquired and preprocessed.
In an alternative embodiment, the step S10 includes step S11, which is as follows:
S11, acquiring the radar data obtained by the Sentinel-1A radar remote sensing satellite, and carrying out radiometric calibration, terrain correction and projection conversion on the radar data to obtain preprocessed radar data.
The Sentinel-1 radar remote sensing mission is an earth observation mission of the European Space Agency's Copernicus programme (formerly GMES) and consists of the A and B satellites. Sentinel-1A carries a C-band SAR sensor and provides medium- and high-resolution imaging under all weather conditions. In the embodiment of the application, radar data acquired by the Sentinel-1A radar remote sensing satellite, i.e. radiation intensity information, are input into the Sentinel Application Platform (SNAP), where radiometric calibration, terrain correction and projection conversion are performed to obtain the preprocessed radar data, i.e. the backscattering coefficients.
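A minimal sketch of this preprocessing chain, assuming ESA SNAP with its Python bindings (snappy) is installed; the file name is a placeholder, and the operator and parameter names ('Calibration', 'Terrain-Correction', 'outputSigmaBand', 'demName', 'mapProjection') follow common SNAP graph-processing usage and should be checked against the installed SNAP version:

```python
# Sketch: Sentinel-1A preprocessing with SNAP's Python bindings (snappy).
# Scene name and parameter values are placeholders, not part of the embodiment.
import snappy
from snappy import ProductIO, GPF

HashMap = snappy.jpy.get_type('java.util.HashMap')

product = ProductIO.readProduct('S1A_IW_GRDH_example.zip')   # hypothetical scene

# 1. Radiometric calibration to sigma0 (backscattering coefficient).
cal_params = HashMap()
cal_params.put('outputSigmaBand', True)
calibrated = GPF.createProduct('Calibration', cal_params, product)

# 2. Range-Doppler terrain correction, which also reprojects to a map grid.
tc_params = HashMap()
tc_params.put('demName', 'SRTM 3Sec')
tc_params.put('mapProjection', 'WGS84(DD)')
corrected = GPF.createProduct('Terrain-Correction', tc_params, calibrated)

# 3. Write the preprocessed backscatter product to GeoTIFF.
ProductIO.writeProduct(corrected, 'sigma0_preprocessed', 'GeoTIFF')
```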
S20, obtaining an optical remote sensing satellite image, identifying and obtaining a field image from the optical remote sensing satellite image, and rasterizing the identified field image.
Optical remote sensing collects the radiation reflected and emitted by ground objects with an optical system, converts it into electrical signals with optical detectors, and then stores and analyses the data to obtain spatial, temporal and spectral information about the ground objects. In the embodiment of the application, an optical remote sensing satellite image from Google Earth is acquired, field images are identified from it, and the identified field images are rasterized using the Geospatial Data Abstraction Library (GDAL). GDAL is an open-source raster spatial data translation library released under an X/MIT licence; it represents the supported file formats with an abstract data model and provides a series of command-line tools for data conversion and processing.
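As an illustration, burning the recognized field polygons onto the radar grid with GDAL could look like the following sketch; the file names and the FIELD_ID attribute are assumptions, not part of the embodiment:

```python
# Sketch: rasterize recognized field polygons onto the radar raster grid with GDAL.
from osgeo import gdal, ogr

fields = ogr.Open('recognized_fields.shp')        # hypothetical vector of field boundaries
layer = fields.GetLayer()
template = gdal.Open('sigma0_preprocessed.tif')   # preprocessed radar raster as grid template

driver = gdal.GetDriverByName('GTiff')
out = driver.Create('field_ids.tif',
                    template.RasterXSize, template.RasterYSize, 1, gdal.GDT_UInt32)
out.SetGeoTransform(template.GetGeoTransform())   # align pixel grid with the radar data
out.SetProjection(template.GetProjection())

# Each pixel inside a field polygon receives that polygon's (hypothetical) FIELD_ID value.
gdal.RasterizeLayer(out, [1], layer, options=['ATTRIBUTE=FIELD_ID'])
out.FlushCache()
```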
In an alternative embodiment, the step S20 includes step S21, which is as follows:
and S21, acquiring an optical remote sensing satellite image, and inputting the optical remote sensing satellite image into a trained field block recognition model to obtain a field block image.
In the embodiment of the application, the optical remote sensing satellite image acquired from Google Earth is input into a trained field recognition model, and all fields in the optical remote sensing satellite image are recognized, so that the field image is obtained.
In an alternative embodiment, referring to fig. 2, the step S21 includes steps S22-S23, which are as follows:
and S22, sketching a field block boundary of the optical remote sensing satellite sample image to obtain field block sample data.
And S23, inputting the field sample data into an FCIS deep learning network for training and learning to obtain a field recognition model.
In the embodiment of the application, a large number of optical remote sensing satellite sample images are collected in advance, the field boundaries in each sample image are manually delineated one by one to obtain field sample data, and the field sample data are input into a Fully Convolutional Instance-aware Semantic Segmentation (FCIS) deep learning network for training and learning, yielding the field recognition model.
And S30, matching the rasterized field image with the preprocessed radar data, and calculating an average value of the radar data corresponding to the rasterized field image to obtain an average crop curve corresponding to the rasterized field image.
In the embodiment of the application, the rasterized field image is matched with the preprocessed radar data, so that a backscattering coefficient corresponding to each field grid or pixel in the field image is obtained, and an average backscattering coefficient corresponding to the rasterized field image is calculated to obtain an average crop curve corresponding to the rasterized field image.
In an alternative embodiment, referring to fig. 3, the step S30 includes steps S31-S34, which are as follows:
S31, matching the rasterized field image with the preprocessed radar data according to longitude and latitude to obtain the preprocessed radar data corresponding to the rasterized field image;
S32, if the number of pixels of the rasterized field image is larger than a preset threshold value, performing morphological erosion on the rasterized field image by one pixel unit, averaging the preprocessed radar data corresponding to all pixels remaining in the eroded field image, and taking the average as the backscattering coefficient of the rasterized field image;
S33, if the number of pixels of the rasterized field image is smaller than the preset threshold value, averaging the preprocessed radar data corresponding to all pixels of the rasterized field image, the average being used as the backscattering coefficient of the rasterized field image;
S34, drawing a curve of the backscattering coefficients according to the time sequence to obtain an average crop curve of the rasterized field image.
In the embodiment of the application, the rasterized field image and the preprocessed radar data are matched by longitude and latitude, so that every grid cell or pixel of the field image obtains a corresponding backscattering coefficient. If the number of pixels in the rasterized field image is larger than 5, the image is marked as a field, the field is morphologically eroded by one pixel unit, and the backscattering coefficients of all remaining pixels in the eroded field are averaged to obtain the backscattering coefficient of the whole field. If the number of pixels is less than 5, the image is marked as a small field, no morphological erosion is applied, and the backscattering coefficients of all its pixels are averaged to obtain the backscattering coefficient of the whole small field. Morphological erosion is a basic operation in digital image processing; eroding the field removes the influence of roads, irrigation facilities and other structures around the field on the observation. The backscattering coefficients of every field are then plotted as a time series to obtain the average crop curve of the rasterized field image. The revisit cycle of the Sentinel-1A radar remote sensing satellite is 12 days, so about 30 observations of the same field can be made each year, and the observations accumulate year by year into a time series.
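A minimal NumPy/SciPy sketch of this per-field averaging rule; the 5-pixel threshold and one-pixel erosion follow the embodiment above, while the array layout and names are assumptions:

```python
import numpy as np
from scipy.ndimage import binary_erosion

def field_mean_backscatter(sigma0_stack, field_mask, min_pixels=5):
    """Average backscatter per acquisition date for one field.

    sigma0_stack: (T, H, W) array of preprocessed backscatter images (dB).
    field_mask:   (H, W) boolean mask of the rasterized field.
    min_pixels:   threshold below which no erosion is applied (5 in the embodiment).
    """
    mask = field_mask
    if mask.sum() > min_pixels:
        # Erode by one pixel to drop boundary pixels mixed with roads/irrigation facilities.
        mask = binary_erosion(field_mask, iterations=1)
        if mask.sum() == 0:          # guard: keep the original mask if erosion empties it
            mask = field_mask
    # Mean backscatter of the remaining pixels for every acquisition date
    # gives the field's average crop curve as a time series.
    return sigma0_stack[:, mask].mean(axis=1)
```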
And S40, identifying the crop type and the crop period according to the average crop curve.
In an embodiment of the application, the type of crop and the crop period in the field are identified based on the average crop curve.
In an alternative embodiment, referring to fig. 4, the step S40 includes steps S41-S42, which are as follows:
and S41, inputting the average crop curve into a trained crop type identification model to identify the crop type.
And S42, identifying a crop period according to the time interval of the backscattering coefficient in the average crop curve from a minimum value to a maximum value to a minimum value.
In the embodiment of the application, the average crop curves corresponding to different crop types are different, and the average crop curves are input into a trained crop type identification model to identify the crop types.
In an embodiment of the application, the crop period is identified from the time interval over which the backscattering coefficient in the average crop curve goes from a minimum to a maximum and back to a minimum. Specifically, identifying the crop period comprises:
1. Determine the local maxima and local minima of the backscattering coefficient:
local maximum: v(n-1) < v(n) > v(n+1)
local minimum: v(n-1) > v(n) < v(n+1)
where v(n-1), v(n) and v(n+1) denote the backscattering coefficients at three consecutive points of the time series.
2. Merge, in time order, adjacent sub-intervals in which the difference between the local maximum and the local minimum of the backscattering coefficient is less than 5 dB.
3. Counting one complete cycle as the run from a local minimum to a local maximum and back to a local minimum, identify the total number of cycles between 2016 and 2019. The crop cycle can accordingly be, for example, one crop every two years, two crops per year, or three crops per year.
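The extremum detection, 5 dB merging and cycle counting described above can be sketched as follows; this is a simplified illustration, and the function and variable names are assumptions:

```python
import numpy as np

def count_crop_cycles(curve, merge_db=5.0):
    """Simplified sketch: count min -> max -> min cycles in an average crop curve.

    curve: 1-D array of backscattering coefficients ordered by acquisition date.
    Adjacent extrema whose difference is below merge_db (5 dB in the text) are
    treated as noise and merged.
    """
    curve = np.asarray(curve, dtype=float)

    # Local extrema: v(n-1) < v(n) > v(n+1) (maximum) or v(n-1) > v(n) < v(n+1) (minimum).
    extrema = [n for n in range(1, len(curve) - 1)
               if (curve[n - 1] < curve[n] > curve[n + 1])
               or (curve[n - 1] > curve[n] < curve[n + 1])]

    # Merge neighbouring extrema that differ by less than merge_db.
    kept = []
    for n in extrema:
        if kept and abs(curve[n] - curve[kept[-1]]) < merge_db:
            continue
        kept.append(n)

    # One complete cycle: a local maximum flanked by two lower (minimum) values.
    v = curve[kept]
    return sum(1 for i in range(len(v) - 2) if v[i] < v[i + 1] > v[i + 2])
```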
In an alternative embodiment, referring to fig. 5, the step S41 includes steps S43-S44, which are as follows:
s43, interpreting the crop type of the average crop curve of the field sample image to obtain crop type sample training data; wherein the crop type is interpreted according to the fluctuation amplitude, the width and the phenological time point of the average crop curve of the field sample image;
and S44, inputting the crop type sample training data into an xgboost classifier for training and learning to obtain a crop type identification model.
In the embodiment of the application, the average crop curves of a large number of field sample images are collected, and crop type sample training data are obtained by interpreting the crop type from the fluctuation amplitude, width and phenological time points of each curve. The crop type sample training data are input into an xgboost classifier for training and learning, with 60% of the samples used for training the classifier, 20% for validating the classification results and 20% for testing, yielding the crop type recognition model. XGBoost is a machine learning library implementing the gradient boosting algorithm.
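A sketch of this training setup with the xgboost and scikit-learn Python packages; the input file names, label encoding and hyperparameters are assumptions, and only the 60/20/20 split follows the embodiment:

```python
import numpy as np
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from xgboost import XGBClassifier

# Hypothetical inputs: one row per field sample holding its average crop curve
# resampled to a fixed length, plus the interpreted crop-type label as an integer.
X = np.load('crop_curve_samples.npy')
y = np.load('crop_type_labels.npy')

# 60% training, 20% validation, 20% test, as described in the embodiment.
X_train, X_rest, y_train, y_rest = train_test_split(
    X, y, test_size=0.4, random_state=0, stratify=y)
X_val, X_test, y_val, y_test = train_test_split(
    X_rest, y_rest, test_size=0.5, random_state=0, stratify=y_rest)

model = XGBClassifier(n_estimators=200, max_depth=4, learning_rate=0.1)
model.fit(X_train, y_train)

print('validation accuracy:', accuracy_score(y_val, model.predict(X_val)))
print('test accuracy:', accuracy_score(y_test, model.predict(X_test)))
```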
S50, determining a crop rotation sequence according to the crop type and the crop period, merging the crop rotation sequence, and generating a field rotation mode.
In the embodiment of the application, the crop rotation sequence of each field is determined according to its crop types and crop periods, the rotation sequences are merged, and the field rotation patterns are generated.
In an alternative embodiment, referring to fig. 6, the step S50 includes steps S51-S52, which are as follows:
S51, ordering the crop types and crop periods of the field in each year by time, and determining the crop rotation sequence of the field;
S52, removing unreasonable crop rotation sequences, and merging crop rotation sequences that contain the same crop types in the same adjacency order but start with different crops, to obtain the crop rotation patterns.
In the embodiment of the application, the crop types and crop periods of the field in each year are ordered by time to determine the field's rotation sequence. For example, a field planted with pineapple in 2016 and 2017, banana in 2018 and pepper in 2019, with a pineapple cycle of one crop every two years, a banana cycle of one crop per year and a pepper cycle of two crops per year, has the rotation sequence "pineapple-banana-pepper".
The resulting field rotation sequences are then analysed, and unreasonable rotation sequences are removed. Rotation sequences containing the same crop types in the same adjacency order but starting with different crops are merged to obtain the rotation patterns. For example, the rotation sequence "sugarcane-rice-pepper" and the rotation sequence "rice-pepper-sugarcane" are merged into the same rotation pattern.
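One way to implement this merging is to key each field's rotation sequence by a canonical cyclic form, as in the following sketch; the canonicalization by smallest cyclic shift and the example sequences are illustrative assumptions:

```python
from collections import Counter

def canonical_rotation(sequence):
    """Key a rotation sequence by its lexicographically smallest cyclic shift,
    so that e.g. ['sugarcane', 'rice', 'pepper'] and ['rice', 'pepper', 'sugarcane']
    map to the same rotation pattern."""
    shifts = [tuple(sequence[i:] + sequence[:i]) for i in range(len(sequence))]
    return min(shifts)

def merge_rotation_sequences(field_sequences):
    """Count how many fields follow each rotation pattern after merging cyclic shifts."""
    return Counter(canonical_rotation(seq) for seq in field_sequences)

patterns = merge_rotation_sequences([
    ['sugarcane', 'rice', 'pepper'],
    ['rice', 'pepper', 'sugarcane'],     # same pattern, different starting crop
    ['pineapple', 'banana', 'pepper'],
])
print(patterns)
```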
By applying the embodiment of the invention, radar data of a radar remote sensing satellite are acquired and preprocessed; an optical remote sensing satellite image is acquired, field images are identified from it and rasterized; each rasterized field image is matched with the preprocessed radar data and the average of the corresponding radar data is calculated to obtain an average crop curve; the crop type and crop period are identified from the average crop curve; and a rotation sequence is determined from the crop type and crop period, the rotation sequences are merged, and the field rotation pattern is generated. This overcomes the limitation of optical observation in cloudy weather, improves the accuracy of radar-based observation, and, by analysing field by field, improves robustness.
Referring to fig. 7, an embodiment of the present invention provides a field rotation pattern generating device 6 based on radar data, including:
the preprocessing unit 61 is used for acquiring radar data of the radar remote sensing satellite and preprocessing the radar data;
the first identification unit 62 is configured to acquire an optical remote sensing satellite image, identify and acquire a field image from the optical remote sensing satellite image, and rasterize the identified field image;
a calculating unit 63, configured to match the rasterized field image with the preprocessed radar data, and calculate an average value of the radar data corresponding to the rasterized field image to obtain an average crop curve corresponding to the rasterized field image;
a second identifying unit 64 for identifying the crop type and the crop period according to the average crop curve;
and the generating unit 65 is configured to determine a crop rotation sequence according to the crop type and the crop period, merge the crop rotation sequence, and generate a field rotation mode.
Optionally, the first recognition unit 62 includes a field image unit 621, configured to obtain an optical remote sensing satellite image, input the optical remote sensing satellite image into a trained field recognition model, and obtain a field image.
Optionally, referring to fig. 8, the field image unit 621 specifically includes:
the delineating unit 6211 is configured to delineate a field boundary of the optical remote sensing satellite sample image, so as to obtain field sample data;
the first training learning unit 6212 is configured to input the field sample data into an FCIS deep learning network for training learning, so as to obtain a field recognition model.
Optionally, referring to fig. 9, the calculating unit 63 specifically includes:
a matching unit 631, configured to match the rasterized field image with the preprocessed radar data according to longitude and latitude, so as to obtain the preprocessed radar data corresponding to the rasterized field image;
a first averaging unit 632, configured to perform morphological erosion on the rasterized field image in a pixel unit if the number of pixels of the rasterized field image is greater than a preset threshold, average the preprocessed radar data corresponding to all pixels in the rasterized field image after erosion, where the average is used as a backscattering coefficient of the rasterized field image;
a second averaging unit 633, configured to, if the number of pixels of the rasterized field image is smaller than a preset threshold, average the preprocessed radar data corresponding to all pixels in the rasterized field image, where the average is used as a backscattering coefficient of the rasterized field image;
a drawing unit 634, configured to draw a curve according to the backscatter coefficients and a time series to obtain an average crop curve of the rasterized field image.
Optionally, referring to fig. 10, the second identifying unit 64 specifically includes:
a crop type unit 641, configured to input the average crop curve into a trained crop type identification model to identify a crop type;
a crop period unit 642 for identifying a crop period according to a time interval in which a backscattering coefficient in the average crop curve changes from a minimum value to a maximum value to a minimum value;
optionally, referring to fig. 11, the crop type unit 641 specifically includes:
the interpretation unit 6411 is configured to interpret the crop type for the average crop curve of the field sample image to obtain crop type sample training data; wherein the crop type is interpreted according to the fluctuation amplitude, the width and the phenological time point of the average crop curve of the field sample image;
and the second training learning unit 6412 is configured to input the crop type sample training data into an xgboost classifier for training and learning, so as to obtain a crop type identification model.
By applying the embodiment of the invention, the device acquires and preprocesses radar data of a radar remote sensing satellite; acquires an optical remote sensing satellite image, identifies field images from it and rasterizes them; matches each rasterized field image with the preprocessed radar data and calculates the average of the corresponding radar data to obtain an average crop curve; identifies the crop type and crop period from the average crop curve; and determines a rotation sequence from the crop type and crop period, merges the rotation sequences, and generates the field rotation pattern. This overcomes the limitation of optical observation in cloudy weather, improves the accuracy of radar-based observation, and, by analysing field by field, improves robustness.
The above-mentioned embodiments only express several embodiments of the present invention, and the description thereof is more specific and detailed, but not construed as limiting the scope of the invention. It should be noted that, to those skilled in the art, changes and modifications may be made without departing from the spirit of the present invention, and it is intended that the present invention encompass such changes and modifications.

Claims (10)

1. A field crop rotation mode generation method based on radar data is characterized by comprising the following steps:
acquiring radar data of a radar remote sensing satellite, and preprocessing the radar data;
acquiring an optical remote sensing satellite image, identifying and acquiring a field image from the optical remote sensing satellite image, and rasterizing the identified field image;
matching the rasterized field image with the preprocessed radar data, and calculating an average value of the radar data corresponding to the rasterized field image to obtain an average crop curve corresponding to the rasterized field image;
identifying a crop type and a crop period according to the average crop curve;
and determining a crop rotation sequence according to the crop type and the crop period, merging the crop rotation sequence, and generating a field rotation mode.
2. The method for generating a field rotation pattern based on radar data according to claim 1, wherein the radar data of the radar remote sensing satellite are obtained through a Sentinel-1A radar remote sensing satellite, and the step of acquiring radar data of a radar remote sensing satellite and preprocessing the radar data comprises:
the method comprises the steps of obtaining radar data obtained through a Sentinel-1A radar remote sensing satellite, and carrying out radiometric calibration, terrain correction and projection conversion on the radar data to obtain preprocessed radar data.
3. The method according to claim 1, wherein the step of obtaining an optical remote sensing satellite image and identifying a field image from the optical remote sensing satellite image comprises:
acquiring an optical remote sensing satellite image, and inputting the optical remote sensing satellite image into a trained field recognition model to obtain a field image;
wherein the field recognition model is obtained by:
delineating field boundaries on optical remote sensing satellite sample images to obtain field sample data;
and inputting the field sample data into an FCIS deep learning network for training and learning to obtain the field recognition model.
4. The method according to claim 1, wherein the step of matching the rasterized field image with the preprocessed radar data, calculating an average value of the radar data corresponding to the rasterized field image, and obtaining an average crop curve corresponding to the rasterized field image comprises:
matching the rasterized field image with the preprocessed radar data according to longitude and latitude to obtain the preprocessed radar data corresponding to the rasterized field image;
if the number of pixels of the rasterized field image is larger than a preset threshold value, performing morphological erosion on the rasterized field image by one pixel unit, averaging the preprocessed radar data corresponding to all pixels remaining in the eroded field image, and taking the average as the backscattering coefficient of the rasterized field image;
if the number of pixels of the rasterized field image is smaller than the preset threshold value, averaging the preprocessed radar data corresponding to all pixels of the rasterized field image, the average being used as the backscattering coefficient of the rasterized field image;
and drawing a curve according to the backscattering coefficient according to a time sequence to obtain an average crop curve of the rasterized field image.
5. The method of claim 1, wherein the step of identifying crop types and crop periods from the average crop curve comprises:
inputting the average crop curve into a trained crop type recognition model to recognize the crop type;
identifying a crop period according to a time interval of the backscattering coefficient in the average crop curve from a minimum value to a maximum value to a minimum value;
the crop type identification model specifically comprises the following steps:
interpreting the crop type of the average crop curve of the field sample image to obtain crop type sample training data; wherein the crop type is interpreted according to the fluctuation amplitude, the width and the phenological time point of the average crop curve of the field sample image;
and inputting the crop type sample training data into an xgboost classifier for training and learning to obtain a crop type identification model.
6. The method as claimed in claim 1, wherein the step of determining a crop rotation sequence according to the crop type and the crop period, merging the crop rotation sequence to generate the field rotation pattern comprises:
ordering the crop types and crop periods of the field in each year by time, and determining the crop rotation sequence of the field;
and removing unreasonable crop rotation sequences, and merging crop rotation sequences that contain the same crop types in the same adjacency order but start with different crops, to obtain the crop rotation patterns.
7. A field crop rotation pattern generation device based on radar data, comprising:
the preprocessing unit is used for acquiring radar data of a radar remote sensing satellite and preprocessing the radar data;
the first identification unit is used for acquiring an optical remote sensing satellite image, identifying a field image from the optical remote sensing satellite image, and rasterizing the identified field image;
the calculation unit is used for matching the rasterized field image with the preprocessed radar data, calculating an average value of the radar data corresponding to the rasterized field image, and obtaining an average crop curve corresponding to the rasterized field image;
the second identification unit is used for identifying the crop type and the crop period according to the average crop curve;
and the generating unit is used for determining a crop rotation sequence according to the crop type and the crop period, merging the crop rotation sequence and generating a field rotation mode.
8. The device according to claim 7, wherein the first identification unit comprises:
the field image unit is used for acquiring an optical remote sensing satellite image, inputting the optical remote sensing satellite image into a trained field recognition model and acquiring a field image;
wherein, the field identification model specifically comprises:
the delineating unit is used for delineating a field block boundary of the optical remote sensing satellite sample image to obtain field block sample data;
and the first training learning unit is used for inputting the field sample data into an FCIS deep learning network for training learning to obtain a field recognition model.
9. The device of claim 7, wherein the computing unit comprises:
the matching unit is used for matching the rasterized field image with the preprocessed radar data according to longitude and latitude to obtain the preprocessed radar data corresponding to the rasterized field image;
a first averaging unit, configured to, if the number of pixels of the rasterized field image is greater than a preset threshold, perform morphological erosion on the rasterized field image in a pixel unit, average the preprocessed radar data corresponding to all pixels in the rasterized field image after erosion, where the average is used as a backscattering coefficient of the rasterized field image;
a second averaging unit, configured to, if the number of pixels of the rasterized field image is smaller than a preset threshold, average the preprocessed radar data corresponding to all pixels in the rasterized field image, where the average is used as a backscattering coefficient of the rasterized field image;
and the drawing unit is used for drawing a curve of the backscattering coefficient according to a time sequence to obtain an average crop curve of the rasterized field image.
10. The device according to claim 7, wherein the second recognition unit includes:
the crop type unit is used for inputting the average crop curve into a trained crop type recognition model to recognize the crop type;
the crop period unit is used for identifying a crop period according to a time interval of the backscattering coefficient in the average crop curve from a minimum value to a maximum value to a minimum value;
the crop type identification model specifically comprises the following steps:
the interpretation unit is used for interpreting the crop type of the average crop curve of the field sample image to obtain crop type sample training data; wherein the crop type is interpreted according to the fluctuation amplitude, the width and the phenological time point of the average crop curve of the field sample image;
and the second training learning unit is used for inputting the training data of the crop type samples into the xgboost classifier for training and learning to obtain a crop type identification model.
Application CN202110558351.4A, priority date 2021-05-21, filing date 2021-05-21: Method and device for generating field crop rotation mode based on radar data (status: Pending; publication: CN113221790A)

Priority Applications (1)

Application number: CN202110558351.4A · Priority date: 2021-05-21 · Filing date: 2021-05-21 · Title: Method and device for generating field crop rotation mode based on radar data

Applications Claiming Priority (1)

Application number: CN202110558351.4A · Priority date: 2021-05-21 · Filing date: 2021-05-21 · Title: Method and device for generating field crop rotation mode based on radar data

Publications (1)

Publication number: CN113221790A · Publication date: 2021-08-06

Family

ID=77093824

Family Applications (1)

Application number: CN202110558351.4A · Title: Method and device for generating field crop rotation mode based on radar data · Priority date: 2021-05-21 · Filing date: 2021-05-21

Country Status (1)

Country: CN · Publication: CN113221790A (en)


Patent Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1794279A (en) * 2005-12-29 2006-06-28 江苏省农业科学院 Remote sensing estimation method for pest pesticide of crop rotation
KR100788233B1 (en) * 2007-06-19 2007-12-26 김경탁 Accumulative cultivation method using crops rotation system
CN103914678A (en) * 2013-01-05 2014-07-09 中国科学院遥感与数字地球研究所 Abandoned land remote sensing recognition method based on texture and vegetation indexes
CN104751477A (en) * 2015-04-17 2015-07-01 薛笑荣 Space domain and frequency domain characteristic based parallel SAR (synthetic aperture radar) image classification method
CN109360117A (en) * 2018-10-08 2019-02-19 西充恒河农牧业开发有限公司 A kind of crop growing mode recognition methods
CN109635731A (en) * 2018-12-12 2019-04-16 中国科学院深圳先进技术研究院 It is a kind of to identify method and device, storage medium and the processor effectively ploughed
CN110532967A (en) * 2019-09-02 2019-12-03 中国科学院遥感与数字地球研究所 A kind of Crop classification method based on No. 1 RVI time series of sentry
CN111007013A (en) * 2019-11-01 2020-04-14 中科禾信遥感科技(苏州)有限公司 Crop rotation fallow remote sensing monitoring method and device for northeast cold region
CN110793921A (en) * 2019-11-14 2020-02-14 山东省农业可持续发展研究所 Remote sensing monitoring and evaluation method and system for flood disasters of corns in emasculation and pollination period
CN110909679A (en) * 2019-11-22 2020-03-24 中国气象科学研究院 Remote sensing identification method and system for fallow crop rotation information of winter wheat historical planting area
CN111199185A (en) * 2019-11-26 2020-05-26 广州地理研究所 Ground surface temperature downscaling method, system and equipment based on XGboost learning algorithm
CN111666914A (en) * 2020-06-15 2020-09-15 中国科学院地理科学与资源研究所 Cultivated land identification method, system, equipment and storage medium based on distance between curves
CN111860325A (en) * 2020-07-20 2020-10-30 河南大学 Soil moisture inversion method, device, computer readable medium and electronic equipment
CN112819846A (en) * 2021-01-27 2021-05-18 成都四象纵横遥感科技有限公司 Multi-load remote sensing image-based rice yield estimation method for cloudy and rainy areas

Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
E. Esakki Vigneswaran et al.: "Decision Support System for Crop Rotation Using Machine Learning", 2020 Fourth International Conference on Inventive Systems and Control (ICISC) *
Randy L. Anderson et al.: "A Multi-Tactic Approach to Manage Weed Population Dynamics in Crop Rotations", Agronomy Journal *
Feng Shixin et al.: "Study on the rotation pattern of Artemisia annua" (in Chinese), China Journal of Chinese Materia Medica *
Hua Guoqiang: "Maize growth monitoring and mapping based on fully polarimetric SAR data" (in Chinese), China Master's Theses Full-text Database (Agricultural Science and Technology) *
Zhong Lishan et al.: "Cultivated land extraction using SAR image time series" (in Chinese), Progress in Geography *


Legal Events

PB01: Publication
SE01: Entry into force of request for substantive examination
RJ01: Rejection of invention patent application after publication (application publication date: 2021-08-06)