CN115420688A - Agricultural disaster information remote sensing extraction loss evaluation method based on Internet of things - Google Patents


Info

Publication number
CN115420688A
CN115420688A
Authority
CN
China
Prior art keywords
disaster
remote sensing
area
sampling
crop
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211111068.8A
Other languages
Chinese (zh)
Inventor
胡成
孙勇
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Anhui Uicsoft Software Co ltd
Original Assignee
Anhui Uicsoft Software Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Anhui Uicsoft Software Co ltd filed Critical Anhui Uicsoft Software Co ltd
Priority to CN202211111068.8A
Publication of CN115420688A
Legal status: Pending

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01NINVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/17Systems in which incident light is modified in accordance with the properties of the material investigated
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q50/02Agriculture; Fishing; Forestry; Mining
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/20Image preprocessing
    • G06V10/34Smoothing or thinning of the pattern; Morphological operations; Skeletonisation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/764Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/77Processing image or video features in feature spaces; using data integration or data reduction, e.g. principal component analysis [PCA] or independent component analysis [ICA] or self-organising maps [SOM]; Blind source separation
    • G06V10/774Generating sets of training patterns; Bootstrap methods, e.g. bagging or boosting
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/10Terrestrial scenes
    • G06V20/13Satellite images
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/10Terrestrial scenes
    • G06V20/188Vegetation
    • GPHYSICS
    • G16INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16YINFORMATION AND COMMUNICATION TECHNOLOGY SPECIALLY ADAPTED FOR THE INTERNET OF THINGS [IoT]
    • G16Y10/00Economic sectors
    • G16Y10/05Agriculture
    • GPHYSICS
    • G16INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16YINFORMATION AND COMMUNICATION TECHNOLOGY SPECIALLY ADAPTED FOR THE INTERNET OF THINGS [IoT]
    • G16Y20/00Information sensed or collected by the things
    • G16Y20/20Information sensed or collected by the things relating to the thing itself
    • GPHYSICS
    • G16INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16YINFORMATION AND COMMUNICATION TECHNOLOGY SPECIALLY ADAPTED FOR THE INTERNET OF THINGS [IoT]
    • G16Y40/00IoT characterised by the purpose of the information processing
    • G16Y40/10Detection; Monitoring
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01NINVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/17Systems in which incident light is modified in accordance with the properties of the material investigated
    • G01N2021/1793Remote sensing


Abstract

The invention discloses an agricultural disaster information remote sensing extraction loss assessment method based on the Internet of things, and relates to the technical field of crop disaster assessment. The method comprises the following steps: shooting remote sensing images of crops in a plurality of areas before and after a disaster; acquiring pre- and post-disaster NDVI difference maps, extracting disaster features from the difference maps, and training a disaster identification module; acquiring remote sensing images of crops and determining the crop area to be investigated; and inputting the crop area to be investigated into the disaster identification module, which outputs a predicted disaster trend and evaluates the disaster grade on the remote sensing image data.

Description

Agricultural disaster information remote sensing extraction loss evaluation method based on Internet of things
Technical Field
The invention belongs to the technical field of crop disaster situation assessment, and particularly relates to an agricultural disaster information remote sensing extraction loss assessment method based on the Internet of things.
Background
Agricultural disasters include natural disasters, disasters caused by diseases, pests, birds and animals, and disasters caused by human factors. Natural disasters cover drought, waterlogging, soil erosion, excessive rain, insufficient sunlight, typhoons and the like brought by adverse weather. Disease, pest, bird and animal damage covers crop diseases, insect pests, bird damage, rodent damage, wild boar damage and the like. Human factors include damage to soil structure from improper cultivation, heavy use of chemical pesticides, improper application of agrochemicals, trampling and the like. Agricultural meteorological disasters are the disasters that adverse meteorological conditions cause to agriculture: heat damage, freeze damage, frost damage, cold damage to tropical crops and low-temperature chilling injury caused by temperature factors; drought, flood, snow and hail damage caused by moisture factors; wind damage caused by wind; and dry-hot wind, cold rain, and combined freezing and waterlogging caused by the joint action of several meteorological factors. Unlike the purely meteorological concepts, agricultural meteorological disasters are defined in combination with the damage suffered by agricultural production. Cold waves and late-spring cold spells, for example, are weather phenomena or processes that do not always cause disasters; but when they endanger crops such as wheat and rice, they become agricultural meteorological disasters such as freeze injury, frost damage and spring low-temperature chilling injury. However, existing agricultural disaster monitoring cannot accurately delimit the disaster-affected area, which hinders reasonable disaster prevention by users; nor can it predict disasters in time, so agriculture suffers serious disaster losses.
In summary, the problems of the prior art are as follows:
(1) Existing agricultural disaster monitoring cannot accurately determine the disaster-affected range, which is not conducive to reasonable disaster prevention by users; meanwhile, disasters cannot be predicted in time, so agricultural losses are severe.
(2) Existing remote sensing data storage models mainly target remote sensing image product data and cannot achieve good load balancing in a distributed environment.
Disclosure of Invention
The invention aims to provide an agricultural disaster information remote sensing extraction loss assessment method based on the Internet of things.
In order to solve the technical problems, the invention is realized by the following technical scheme:
the invention relates to an agricultural disaster information remote sensing extraction loss assessment method based on the Internet of things, which comprises the following steps:
step S1: shooting remote sensing images of crops in a plurality of areas before and after a disaster;
step S2: preprocessing the acquired remote sensing image;
step S3: acquiring pre- and post-disaster NDVI difference maps, extracting disaster features from the difference maps and training a disaster identification module;
step S4: acquiring remote sensing images of crops, screening them by a sampling method, and determining the crop area to be investigated;
step S5: inputting a crop area to be investigated into a disaster identification module, and estimating and comparing the area affected by the disaster;
step S6: the disaster identification module outputs and predicts a disaster trend and evaluates the grade of the disaster on the remote sensing image data;
step S7: alarming at corresponding levels according to different evaluated disaster degrees;
step S8: and pushing the acquired remote sensing image and the prediction and evaluation data information to the manager.
As a preferred technical solution, in the step S2, the specific steps of preprocessing the remote sensing image are as follows:
step S21: carrying out image morphological reconstruction enhancement processing on the acquired remote sensing image;
step S22: enhancing the image by using an exponential enhancement function for the crop edge contour information of the reconstructed image;
step S23: after the image is enhanced, separating a crop area from a background area;
step S24: carrying out correction and registration processing on the separated crop area;
step S25: carrying out maximum likelihood supervised classification on the processed crop region images;
step S26: and combining and binarizing the classification results to obtain a classification map of the crops.
As a preferred technical solution, in step S3, the NDVI of the post-disaster image is subtracted from that of the pre-disaster image to obtain an NDVI difference map, which is combined with the crop and non-crop binary classification maps to obtain the NDVI difference map of the crops.
As a preferred technical solution, in the step S3, the work flow of the disaster identifying module is as follows:
step S31: acquiring a suitable pre-disaster and post-disaster remote sensing image;
step S32: pre-classifying the remote sensing images and computing the pre- and post-disaster NDVI difference map;
step S33: designing the stratified systematic sampling parameters and drawing sample numbers;
step S34: investigating the area and yield reduction rate of the disaster-suffered crops according to the sample labels;
step S35: back-estimating the total disaster-affected crop area;
step S36: making a disaster-affected grade distribution map.
As a preferred technical solution, in step S33, the specific workflow of stratified systematic sampling is as follows:
step S331: the sampling unit;
step S332: the stratified sampling identifier and the number of layers;
step S333: the total sample size and the per-layer sample size;
step S334: the systematic sampling identifier;
step S335: the sampling interval.
As a preferred technical solution, in step S35, two estimators, namely separate regression estimation and separate ratio estimation, are adopted, and the classified area is combined with the true sample area to back-estimate the actual total disaster-affected area.
As a preferred technical solution, during stratified systematic sampling, the whole crop planting area is gridded according to the crop planting range map, each grid is taken as a sampling unit and all grids as the sampling population, stratified sampling is performed with the classified crop area within each grid as the stratification identifier, and the sample size is calculated; the units within each layer are then sorted by the mean NDVI difference of the crop pixels in the grid, systematic sampling is performed within each layer according to the calculated per-layer sample size, and the final sample is selected.
As a preferred technical solution, the formula for calculating the sample size (published as an image; reconstructed here in the standard stratified-sampling form consistent with the listed symbols) is:

n = \frac{\sum_{h=1}^{L} W_h S_h^2}{d^2 / z_{\alpha/2}^2 + \frac{1}{N}\sum_{h=1}^{L} W_h S_h^2}

where N is the total number of sampling units, L the number of layers, W_h the weight of layer h, S_h^2 the variance of layer h, d the absolute allowable error, and z_{\alpha/2} the right \alpha/2 quantile of the standard normal distribution.
As a preferred technical solution, in step S5, the formula for estimating the disaster-affected area (published as an image; reconstructed here as the standard separate regression estimator consistent with the listed symbols) is:

\hat{Y} = N \sum_{h=1}^{L} W_h \bar{y}_{\mathrm{reg},h}, \qquad \bar{y}_{\mathrm{reg},h} = \bar{y}_h + \beta_h \left( \bar{X}_h - \bar{x}_h \right)

where \hat{Y} is the back-estimated total area of the regression estimation, \bar{y}_{\mathrm{reg},h} the regression-estimated mean disaster-affected area of layer h, \bar{y}_h the mean crop area of the field samples in layer h, \bar{X}_h the mean total classified crop area in layer h, \bar{x}_h the mean classified area of the crop samples in layer h, \beta_h the regression coefficient, W_h the weight of layer h, L the number of layers, and N the total number of samples.
The invention has the following beneficial effects:
according to the method, the disaster recognition module is trained by collecting the remote sensing images of the crops before and after disaster, the crop area to be investigated is input to the disaster recognition module, the area affected by the disaster is estimated and compared, the disaster grade is evaluated, the sampling speed and the prediction accuracy are improved, and the damage of the disaster to the agriculture is effectively reduced.
Of course, it is not necessary for any product to practice the invention to achieve all of the above-described advantages at the same time.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings used in the description of the embodiments are briefly introduced below, and it is obvious that the drawings in the description below are only some embodiments of the present invention, and it is obvious for those skilled in the art that other drawings can be obtained according to the drawings without creative efforts.
Fig. 1 is a flow chart of an agricultural disaster information remote sensing extraction loss evaluation method based on the internet of things.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Referring to fig. 1, the invention relates to an agricultural disaster information remote sensing extraction loss evaluation method based on the internet of things, which comprises the following steps:
step S1: shooting remote sensing images of crops in a plurality of areas before and after disaster;
step S2: preprocessing the acquired remote sensing image;
the specific steps for preprocessing the remote sensing image are as follows:
step S21: carrying out image morphological reconstruction enhancement processing on the acquired remote sensing image;
step S22: enhancing the crop edge contour information of the reconstructed image by using an exponential enhancement function;
step S23: after the image is enhanced, separating a crop area from a background area;
step S24: carrying out correction and registration processing on the separated crop area;
step S25: carrying out maximum likelihood supervised classification on the processed crop area images;
step S26: and combining and binarizing the classification results to obtain a classification map of the crops.
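The enhancement and separation steps above (S22 and S23) can be sketched as follows. This is a minimal illustration in Python with NumPy, not the patented implementation: the morphological reconstruction (S21), correction/registration (S24) and maximum likelihood classification (S25) stages are omitted, and the function names, the gamma value and the fixed threshold are illustrative assumptions.

```python
import numpy as np

def exponential_enhance(img, gamma=0.5):
    # Power-law (exponential-family) enhancement of a normalized grayscale
    # image; gamma < 1 brightens mid-tones and stretches edge contrast.
    img = np.clip(img.astype(float), 0.0, 1.0)
    return img ** gamma

def separate_crop(enhanced, threshold=0.6):
    # Crude crop/background separation of the enhanced image: pixels at or
    # above the threshold are labeled crop (1), the rest background (0).
    return (enhanced >= threshold).astype(np.uint8)
```

For example, `exponential_enhance(np.array([[0.25, 0.81]]))` yields approximately `[[0.5, 0.9]]`, and thresholding at 0.6 then labels only the brighter pixel as crop.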
step S3: acquiring pre- and post-disaster NDVI difference maps, extracting disaster features from the difference maps and training a disaster identification module;
in step S3, the NDVI difference map of the image before the disaster is subtracted from the NVDI of the image after the disaster, and the NDVI difference map of the crops is obtained by combining the crop and non-crop binary classification maps.
In step S3, the work flow of the disaster identification module is as follows:
step S31: acquiring a suitable pre-disaster and post-disaster remote sensing image;
step S32: pre-classifying the remote sensing images and computing the pre- and post-disaster NDVI difference map;
step S33: designing the stratified systematic sampling parameters and drawing sample numbers;
step S34: investigating the area and yield reduction rate of the disaster-suffered crops according to the sample labels;
step S35: back-estimating the total disaster-affected crop area;
step S36: making a disaster-affected grade distribution map.
In step S33, the specific workflow of stratified systematic sampling is as follows:
Step S331: the sampling unit. The sampling unit is chosen as a square grid. According to sampling statistical theory, stratified sampling is effective when the stratification identifier correlates well with the population target, and because of the spatial scale effect, the size of the spatial sampling unit affects this correlation. If the sampling unit is set to 6 x 6 TM pixels, each grid contains 36 pixels.
Step S332: the stratified sampling identifier and the number of layers. The more layers are used, the higher the sampling accuracy, but the survey cost also rises as the number of layers increases.
Step S333: sampling total amount and each layer sample amount;
step S334: a system sampling identifier;
step S335: a sampling interval.
The main steps of the sampling method are as follows:
(1) calculate the sampling interval k = N / n (the published formula appears only as an image; this is the standard systematic-sampling interval, with N the number of units in the layer and n the allocated sample size);
(2) arrange and number all sampling units in a fixed order;
(3) randomly draw a number r from 1 to k;
(4) extract n samples at equal intervals r, r + k, r + 2k, ....
Here, the NDVI differences of the TM image difference map are sorted, the sampling interval is calculated from the total number of sampling units in each layer and the allocated sample size, and the sample numbers are obtained one by one according to the above steps.
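The numbered steps above are ordinary equal-interval (systematic) sampling. A minimal sketch, with an illustrative function name and 0-based indices rather than the 1-to-k numbering of the text:

```python
import random

def systematic_sample(n_units, n_samples, seed=None):
    # (1) sampling interval k = N / n (integer division)
    k = n_units // n_samples
    # (3) random start r drawn from the first interval
    r = random.Random(seed).randrange(k)
    # (4) take r, r + k, r + 2k, ... for n samples
    return [r + i * k for i in range(n_samples)]
```

With 100 ordered units and 10 samples, the interval is 10 and every drawn index is exactly 10 positions after the previous one.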
In step S35, two estimators, namely separate regression estimation and separate ratio estimation, are adopted, and the classified area is combined with the true sample area to back-estimate the actual total disaster-affected area.
step S4: acquiring remote sensing images of crops, screening them by a sampling method, and determining the crop area to be investigated;
step S5: inputting a crop area to be investigated into a disaster identification module, and estimating and comparing the area affected by the disaster;
step S6: the disaster identification module outputs and predicts a disaster trend and evaluates the grade of the disaster on the remote sensing image data;
and after an accurate disaster area is divided based on total amount control, removing samples in the non-disaster area, and performing disaster grade division on the disaster area by using the samples in the disaster area. Classifying the samples of the disaster-affected area according to the production reduction grade of actual investigation, sorting the samples in each grade according to the NDVI value, taking the maximum value and the minimum value of each grade, and solving two grade thresholds for distinguishing three disaster-affected grades. The specific way to obtain the threshold is to use the average value of the minimum value of the disaster-affected level 1 and the maximum value of the disaster-affected level 2 as the level threshold between the levels 1 and 2, and use the average value of the minimum value of the disaster-affected level 2 and the maximum value of the disaster-affected level 3 as the level threshold between the levels 2 and 3.
Step S7: alarming at the corresponding level for each evaluated disaster degree. The alarm can be issued by drawing a disaster-grade distribution map, produced as follows:
after the NDVI difference thresholds of the disaster grades are determined, taking the range-corrected disaster-affected corn area as the extent, all disaster-affected pixels of the TM image are graded by the NDVI grade thresholds, and finally the disaster-grade distribution map is produced.
Step S8: and pushing the acquired remote sensing image and the prediction and evaluation data information to a manager.
During stratified systematic sampling, the whole crop planting area is gridded according to the crop planting range map, each grid is taken as a sampling unit and all grids as the sampling population, stratified sampling is performed with the classified crop area within each grid as the stratification identifier, and the sample size is calculated; the units within each layer are then sorted by the mean NDVI difference of the crop pixels in the grid, systematic sampling is performed within each layer according to the calculated per-layer sample size, and the final sample is selected.
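The gridding, stratification, in-layer NDVI ordering and systematic draw described above can be sketched as one function. This is an assumed reading of the workflow, not the patented code; `layer_edges`, the per-layer sample counts and all names are illustrative:

```python
import numpy as np

def stratified_systematic_sample(grid_area, grid_ndvi_diff, layer_edges,
                                 n_per_layer, seed=0):
    # Assign each grid cell to a layer by its classified crop area.
    layer_of = np.digitize(grid_area, layer_edges)
    rng = np.random.default_rng(seed)
    chosen = []
    for h, n_h in enumerate(n_per_layer):
        cells = np.flatnonzero(layer_of == h)
        if cells.size == 0 or n_h == 0:
            continue
        # Systematic ordering within the layer: sort by mean NDVI difference.
        cells = cells[np.argsort(grid_ndvi_diff[cells])]
        k = max(cells.size // n_h, 1)      # per-layer sampling interval
        r = int(rng.integers(k))           # random start in [0, k)
        chosen.extend(cells[r::k][:n_h].tolist())
    return chosen
```

With three layers of three cells each and one sample per layer, the function returns three distinct cell indices, one drawn systematically from each layer.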
The formula for calculating the sample size (published as an image; reconstructed here in the standard stratified-sampling form consistent with the listed symbols) is:

n = \frac{\sum_{h=1}^{L} W_h S_h^2}{d^2 / z_{\alpha/2}^2 + \frac{1}{N}\sum_{h=1}^{L} W_h S_h^2}

where N is the total number of sampling units, L the number of layers, W_h the weight of layer h, S_h^2 the variance of layer h, d the absolute allowable error, and z_{\alpha/2} the right \alpha/2 quantile of the standard normal distribution.
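Under the standard stratified-sampling reading of that formula (an assumption, since the published equation is only an image), the sample size can be computed as:

```python
import math

def stratified_sample_size(N, weights, variances, d, z=1.96):
    # n = sum(W_h * S_h^2) / (d^2 / z^2 + sum(W_h * S_h^2) / N),
    # rounded up to the next whole sampling unit.
    s = sum(w * v for w, v in zip(weights, variances))
    return math.ceil(s / (d ** 2 / z ** 2 + s / N))
```

For instance, with N = 10000 grids, three layers of weights 0.5/0.3/0.2, layer variances 4.0/2.5/1.0, absolute error d = 0.1 and z = 1.96 (95% confidence), this gives n = 1018.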
In step S5, the formula for estimating the disaster-affected area (published as an image; reconstructed here as the standard separate regression estimator consistent with the listed symbols) is:

\hat{Y} = N \sum_{h=1}^{L} W_h \bar{y}_{\mathrm{reg},h}, \qquad \bar{y}_{\mathrm{reg},h} = \bar{y}_h + \beta_h \left( \bar{X}_h - \bar{x}_h \right)

where \hat{Y} is the back-estimated total area of the regression estimation, \bar{y}_{\mathrm{reg},h} the regression-estimated mean disaster-affected area of layer h, \bar{y}_h the mean crop area of the field samples in layer h, \bar{X}_h the mean total classified crop area in layer h, \bar{x}_h the mean classified area of the crop samples in layer h, \beta_h the regression coefficient, W_h the weight of layer h, L the number of layers, and N the total number of samples.
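Reading the glossary above as the standard separate regression estimator (an assumption, since the published equation is only an image), the back-estimation can be sketched as:

```python
def separate_regression_total(N, layers):
    # Y_hat = N * sum_h W_h * (ybar_h + beta_h * (Xbar_h - xbar_h))
    # layers: per-layer dicts with W (layer weight), ybar (mean sample
    # disaster area), beta (regression coefficient), Xbar (layer mean
    # classified area), xbar (sample mean classified area).
    # All key names are illustrative.
    return N * sum(
        L["W"] * (L["ybar"] + L["beta"] * (L["Xbar"] - L["xbar"]))
        for L in layers
    )
```

With two layers, e.g. (W=0.6, ybar=2.0, beta=0.8, Xbar=5.0, xbar=4.5) and (W=0.4, ybar=1.0, beta=0.5, Xbar=3.0, xbar=3.2), and N = 1000 units, the estimated total is 1800.0.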
It should be noted that, in the above system embodiment, each included unit is only divided according to functional logic, but is not limited to the above division as long as the corresponding function can be implemented; in addition, specific names of the functional units are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present invention.
In addition, it is understood by those skilled in the art that all or part of the steps in the method for implementing the embodiments described above may be implemented by a program instructing associated hardware, and the corresponding program may be stored in a computer-readable storage medium.
The preferred embodiments of the invention disclosed above are intended to be illustrative only. The preferred embodiments are not intended to be exhaustive or to limit the invention to the precise embodiments disclosed. Obviously, many modifications and variations are possible in light of the above teaching. The embodiments were chosen and described in order to best explain the principles of the invention and the practical application, to thereby enable others skilled in the art to best utilize the invention. The invention is limited only by the claims and their full scope and equivalents.

Claims (9)

1. An agricultural disaster information remote sensing extraction loss assessment method based on the Internet of things is characterized by comprising the following steps:
step S1: shooting remote sensing images of crops in a plurality of areas before and after a disaster;
step S2: preprocessing the acquired remote sensing image;
step S3: acquiring pre- and post-disaster NDVI difference maps, extracting disaster features from the difference maps and training a disaster recognition module;
and step S4: carrying out remote sensing image acquisition on crops, screening remote sensing images by using a sample method, and determining a crop area to be investigated;
step S5: inputting a crop area to be investigated into a disaster identification module, and estimating and comparing the area affected by the disaster;
step S6: the disaster identification module outputs and predicts a disaster trend and evaluates the grade of the disaster on the remote sensing image data;
step S7: alarming at corresponding levels according to different evaluated disaster degrees;
step S8: and pushing the acquired remote sensing image and the prediction and evaluation data information to the manager.
2. The internet-of-things-based agricultural disaster information remote sensing extraction loss evaluation method according to claim 1, wherein in the step S2, the remote sensing image is preprocessed in the following specific steps:
step S21: carrying out image morphological reconstruction enhancement processing on the acquired remote sensing image;
step S22: enhancing the crop edge contour information of the reconstructed image by using an exponential enhancement function;
step S23: after the image is enhanced, separating a crop area from a background area;
step S24: carrying out correction and registration processing on the separated crop area;
step S25: carrying out maximum likelihood supervised classification on the processed crop area images;
step S26: and combining and binarizing the classification results to obtain a classification map of the crops.
3. The Internet of things-based agricultural disaster information remote sensing extraction loss evaluation method is characterized in that, in the step S3, the NDVI of the post-disaster image is subtracted from that of the pre-disaster image to obtain an NDVI difference map, which is combined with the crop and non-crop binary classification maps to obtain the NDVI difference map of the crops.
4. The internet-of-things-based agricultural disaster information remote sensing extraction loss evaluation method as claimed in claim 1, wherein in the step S3, the working process of the disaster identification module is as follows:
step S31: acquiring a suitable pre-disaster and post-disaster remote sensing image;
step S32: pre-classifying the remote sensing images and computing the pre- and post-disaster NDVI difference map;
step S33: designing the stratified systematic sampling parameters and drawing sample numbers;
step S34: investigating the area and yield reduction rate of the disaster-suffered crops according to the sample labels;
step S35: carrying out reverse extrapolation on the total area of the disaster-stricken agricultural products;
step S36: and (5) making a disaster-affected grade distribution diagram.
5. The Internet of things-based agricultural disaster information remote sensing extraction loss evaluation method according to claim 4, wherein in the step S33, the specific workflow of stratified systematic sampling is as follows:
step S331: the sampling unit;
step S332: the stratified sampling identifier and the number of layers;
step S333: the total sample size and the per-layer sample size;
step S334: the systematic sampling identifier;
step S335: the sampling interval.
6. The Internet of things-based agricultural disaster information remote sensing extraction loss evaluation method as claimed in claim 4, wherein in the step S35, two estimators, namely separate regression estimation and separate ratio estimation, are adopted, and the classified area is combined with the true sample area to back-estimate the actual total disaster-affected area.
7. The Internet of things-based agricultural disaster information remote sensing extraction loss evaluation method is characterized in that, during stratified systematic sampling, the whole crop planting area is gridded according to the crop planting range map, each grid is taken as a sampling unit and all grids as the sampling population, stratified sampling is performed with the classified crop area within each grid as the stratification identifier, and the sample size is calculated; the units within each layer are then sorted by the mean NDVI difference of the crop pixels in the grid, systematic sampling is performed within each layer according to the calculated per-layer sample size, and the final sample is selected.
8. The Internet of things-based agricultural disaster information remote sensing extraction loss evaluation method as claimed in claim 7, wherein the formula for calculating the sample size is as follows:

$$n = \frac{z_{\alpha/2}^{2}\sum_{h=1}^{L}W_{h}S_{h}^{2}}{d^{2} + \dfrac{z_{\alpha/2}^{2}}{N}\sum_{h=1}^{L}W_{h}S_{h}^{2}}$$

where $N$ is the total number of sampling units, $L$ is the number of strata, $W_h$ is the weight of each stratum, $d$ is the absolute error, $z_{\alpha/2}$ is the right $\alpha/2$ quantile of the standard normal distribution, and $S_h^2$ is the variance of each stratum.
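The symbols defined in claim 8 ($N$, $L$, $W_h$, $d$, $z_{\alpha/2}$, $S_h^2$) match the standard stratified-sampling sample-size formula $n = z_{\alpha/2}^2\sum_h W_h S_h^2 \,/\, (d^2 + z_{\alpha/2}^2\sum_h W_h S_h^2 / N)$; assuming that is the formula behind the (unrecoverable) figure, a small helper might look like this. The example inputs are hypothetical.

```python
from math import ceil

def stratified_sample_size(N, W, S2, d, z=1.96):
    """Sample size n for stratified sampling at absolute error d, where z is
    the right alpha/2 quantile of the standard normal (1.96 for 95%
    confidence) and W, S2 are the per-stratum weights and variances."""
    s = sum(w * v for w, v in zip(W, S2))   # sum_h W_h * S_h^2
    return ceil(z**2 * s / (d**2 + z**2 * s / N))

# Hypothetical example: 1000 sampling units, two strata, 95% confidence
n = stratified_sample_size(N=1000, W=[0.6, 0.4], S2=[0.8, 1.2], d=0.1)
print(n)  # 270
```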
9. The Internet of things-based agricultural disaster information remote sensing extraction loss evaluation method as claimed in claim 1, wherein in the step S5, the formula for estimating the disaster-affected area is as follows:
$$\hat{Y}_{lr} = N\,\bar{y}_{lr} = N\sum_{h=1}^{L} W_h\left[\bar{y}_h + \beta_h\left(\bar{X}_h - \bar{x}_h\right)\right]$$

where $\hat{Y}_{lr}$ is the back-extrapolated total of the regression-estimated area, $\bar{y}_{lr}$ is the mean disaster-affected area of the regression estimator, $\bar{y}_h$ is the mean crop area of the field samples in each stratum, $\bar{X}_h$ is the mean total classified crop area in each stratum, $\bar{x}_h$ is the mean classified area of the crop samples in each stratum, $\beta_h$ is the regression coefficient, $W_h$ is the weight of each stratum, $L$ is the number of strata, and $N$ is the total number of samples.
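The separate regression estimator defined in claim 9 can be sketched numerically as follows: each stratum's sample mean is adjusted by the regression coefficient times the gap between population and sample classified-area means, then weighted and scaled to a total. The two-stratum numbers are hypothetical.

```python
def separate_regression_estimate(N, W, ybar, beta, Xbar, xbar):
    """Separate regression estimator of the total disaster-affected area:
    ybar_h is adjusted by beta_h * (Xbar_h - xbar_h) in each stratum,
    weighted by W_h, and the mean is scaled by N to back-extrapolate."""
    mean = sum(w * (y + b * (X - x))
               for w, y, b, X, x in zip(W, ybar, beta, Xbar, xbar))
    return N * mean

# Hypothetical two-stratum example
total = separate_regression_estimate(
    N=1000, W=[0.6, 0.4], ybar=[2.0, 5.0],
    beta=[0.5, 0.8], Xbar=[4.2, 7.5], xbar=[4.0, 8.0])
print(round(total, 6))  # 3100.0
```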
CN202211111068.8A 2022-09-13 2022-09-13 Agricultural disaster information remote sensing extraction loss evaluation method based on Internet of things Pending CN115420688A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211111068.8A CN115420688A (en) 2022-09-13 2022-09-13 Agricultural disaster information remote sensing extraction loss evaluation method based on Internet of things

Publications (1)

Publication Number Publication Date
CN115420688A true CN115420688A (en) 2022-12-02

Family

ID=84202657

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211111068.8A Pending CN115420688A (en) 2022-09-13 2022-09-13 Agricultural disaster information remote sensing extraction loss evaluation method based on Internet of things

Country Status (1)

Country Link
CN (1) CN115420688A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117876362A (en) * 2024-03-11 2024-04-12 国任财产保险股份有限公司 Deep learning-based natural disaster damage assessment method and device
CN117876362B (en) * 2024-03-11 2024-05-28 国任财产保险股份有限公司 Deep learning-based natural disaster damage assessment method and device

Similar Documents

Publication Publication Date Title
Xie et al. Using Landsat observations (1988–2017) and Google Earth Engine to detect vegetation cover changes in rangelands-A first step towards identifying degraded lands for conservation
He et al. Integrating multi-sensor remote sensing and species distribution modeling to map the spread of emerging forest disease and tree mortality
Franklin et al. Discrimination of conifer height, age and crown closure classes using Landsat-5 TM imagery in the Canadian Northwest Territories
Fraga et al. Examining the relationship between the Enhanced Vegetation Index and grapevine phenology
Luciani et al. Agricultural monitoring, an automatic procedure for crop mapping and yield estimation: the great rift valley of Kenya case
CN109918449B (en) Internet of things-based agricultural disaster information remote sensing extraction method and system
Zhang et al. Prototype for monitoring and forecasting fall foliage coloration in real time from satellite data
Gavahi et al. How does precipitation data influence the land surface data assimilation for drought monitoring?
Gumma et al. Indo-Ganges river basin land use/land cover (LULC) and irrigated area mapping
CN115420688A (en) Agricultural disaster information remote sensing extraction loss evaluation method based on Internet of things
Sabzchi-Dehkharghani et al. Recognition of different yield potentials among rain-fed wheat fields before harvest using remote sensing
Texeira et al. An exploration of direct and indirect drivers of herbivore reproductive performance in arid and semi arid rangelands by means of structural equation models
CN110765901A (en) Agricultural disaster information remote sensing extraction system and method based on Internet of things
Palmer et al. Modelling annual evapotranspiration in a semi-arid, African savanna: functional convergence theory, MODIS LAI and the Penman–Monteith equation
Dempewolf et al. Wheat production forecasting for Pakistan from satellite data
Adams et al. Phenotypic trait extraction of soybean plants using deep convolutional neural networks with transfer learning.
Sitayeb et al. Landscape change in the steppe of Algeria south-west using remote sensing
Dadi Assessing the transferability of random forest and time-weighted dynamic time warping for agriculture mapping
Üstündağ An adaptive Mealy machine model for monitoring crop status
Gowda et al. A decade of remote sensing and evapotranspiration research at usda-ars conservation and production research laboratory
Laneve et al. Sino-Eu Earth Observation Data to Support the Monitoring and Management of Agricultural Resources
Mohite et al. Citrus Gummosis disease severity classification using participatory sensing, remote sensing and weather data
Pan et al. Remote sensing of agricultural disasters monitoring: recent advances
Doan Application of remote sensing and GIS in modelling bison carrying capacity in mixed-grass prairie
Clarke Quantifying grizzly bear (Ursus arctos) habitat selection for a seasonal resource, the Canadian buffaloberry (Sheperdia canadensis) in southern British Columbia

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination