CN111047566B - Method for carrying out aquatic vegetation annual change statistics by unmanned aerial vehicle and multispectral satellite image - Google Patents

Method for carrying out aquatic vegetation annual change statistics by unmanned aerial vehicle and multispectral satellite image

Info

Publication number
CN111047566B
CN111047566B (application CN201911229553.3A)
Authority
CN
China
Prior art keywords
aquatic vegetation
year
image
point
satellite image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201911229553.3A
Other languages
Chinese (zh)
Other versions
CN111047566A (en)
Inventor
潘珉
王飞
李杨
李滨
孔祥丰
黄育红
宋任彬
施苏毅
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Yunnan Tianmu Space Information Technology Co ltd
Kunming Dianchi Plateau Lake Research Institute
Original Assignee
Yunnan Tianmu Space Information Technology Co ltd
Kunming Dianchi Plateau Lake Research Institute
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Yunnan Tianmu Space Information Technology Co ltd, Kunming Dianchi Plateau Lake Research Institute filed Critical Yunnan Tianmu Space Information Technology Co ltd
Priority to CN201911229553.3A priority Critical patent/CN111047566B/en
Publication of CN111047566A publication Critical patent/CN111047566A/en
Application granted granted Critical
Publication of CN111047566B publication Critical patent/CN111047566B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/0002Inspection of images, e.g. flaw detection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/136Segmentation; Edge detection involving thresholding
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/30Determination of transform parameters for the alignment of images, i.e. image registration
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/40Extraction of image or video features
    • G06V10/44Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10032Satellite or aerial image; Remote sensing
    • G06T2207/10036Multispectral image; Hyperspectral image
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30181Earth observation
    • G06T2207/30188Vegetation; Agriculture
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30232Surveillance

Abstract

The invention discloses a method for carrying out annual change statistics on aquatic vegetation using an unmanned aerial vehicle and multispectral satellite images. The method comprises: collecting a multispectral satellite image of the survey area and preprocessing it; processing the photos collected by the unmanned aerial vehicle to obtain an orthographic image and registering the orthographic image to the preprocessed multispectral satellite image; classifying the aquatic vegetation using the multispectral image and the registered orthographic image; marking the aquatic vegetation types of the survey area on the multispectral satellite image; acquiring a satellite image of the same technical specification in the same period of the second year and preprocessing it; registering the second-year multispectral satellite image to the first-year image; comparing, for each aquatic vegetation marking point, the difference between the samples obtained in the two years and updating the marking-point sample library according to that difference; and repeating the collection every year thereafter to obtain the statistical result of the annual change of the aquatic vegetation.

Description

Method for carrying out aquatic vegetation annual change statistics by unmanned aerial vehicle and multispectral satellite image
Technical Field
The invention relates to the technical field of unmanned aerial vehicle and multispectral satellite image statistics, in particular to a method for carrying out aquatic vegetation annual change statistics by utilizing unmanned aerial vehicle and multispectral satellite images.
Background
Aquatic vegetation is the vegetation type formed by aquatic plants growing in a body of water. Because ecological conditions within a water body are relatively uniform and water is fluid, aquatic plants can migrate and spread widely and are distributed almost worldwide, and the species composition of aquatic plants is roughly similar within each climatic zone. Higher aquatic plants are relatively few in type, while lower aquatic plants are numerous. Aquatic vegetation plays a key role in maintaining a healthy lake ecosystem and is important for maintaining ecosystem structure and function and for improving the water environment.
Monitoring aquatic vegetation is one of the effective means of preventing illegal discharge of sewage and wastewater and of treating water pollution and damage to the water environment in a timely manner. At present, large-area aquatic vegetation is monitored and inspected mainly by remote sensing monitoring and unmanned aerial vehicle monitoring. The application with patent application number 201811312185.7 requires ground-measured biomass data and performs correlation analysis in combination with unmanned aerial vehicle multispectral image data; this approach is time-consuming, costly and not conducive to popularization.
Disclosure of Invention
The invention aims to provide a method for carrying out annual change statistics on aquatic vegetation using an unmanned aerial vehicle and multispectral satellite images, so as to solve problems such as the high cost and long time consumption of the existing aquatic vegetation monitoring approaches described in the background.
In order to achieve the above purpose, the present invention provides the following technical solutions: the method for carrying out aquatic vegetation annual change statistics by the unmanned aerial vehicle and the multispectral satellite image comprises the following steps:
step one, collecting a multispectral satellite image of the survey area, the resolution being better than 0.5 m and the spectrum comprising at least three visible-light bands and one near-infrared band;
step two, preprocessing the multispectral satellite image so that it has correct position information and spectral information;
step three, within one week before or after the acquisition time of the multispectral satellite image, flying an unmanned aerial vehicle carrying a gimbal and camera over the lakeshore and nearshore lake areas along a set route, and taking photos with the lens pointing vertically downward at a set time interval or distance, the photo resolution being better than 10 cm and the overlap between adjacent photos being greater than 60%;
step four, processing the collected photos to obtain an orthographic image of the lakeshore and nearshore lake areas;
step five, registering the orthographic image to the multispectral satellite image preprocessed in step two, so that their common coverage areas essentially coincide;
step six, classifying the aquatic vegetation according to the multispectral image preprocessed in step two and the orthographic image registered in step five, and determining the aquatic vegetation types;
step seven, for aquatic vegetation types that cannot be determined, performing manual field checks and collecting field photos;
step eight, marking all aquatic vegetation types of the survey area on the multispectral satellite image; taking each marking point as a center, segmenting the satellite image according to spectral differences and extracting boundaries to obtain the growth range of the aquatic vegetation marked by that point in the current year, and at the same time collecting a satellite image sample for each aquatic vegetation marking point to obtain a first-year sample library corresponding to each marking point;
step nine, acquiring multispectral satellite images of the same technical specification in the same period of the second year, and preprocessing them;
step ten, registering the multispectral satellite image obtained in step nine to the first-year multispectral satellite image preprocessed in step two, so that their common coverage areas essentially coincide;
step eleven, superimposing the aquatic vegetation positions and types marked in step eight onto the second-year multispectral satellite image registered in step ten;
step twelve, on the second-year multispectral satellite image registered in step ten, taking each marking point as a center, performing segmentation and boundary extraction according to the spectral differences of the current-year satellite image to obtain the growth range of the aquatic vegetation marked by that point in the current year, and at the same time collecting a current-year satellite image sample for each aquatic vegetation marking point to obtain a second-year sample library corresponding to each marking point;
step thirteen, comparing, for each aquatic vegetation marking point, the difference between the sample obtained in step twelve and the sample obtained in step eight; if the difference is within a set threshold, adding the sample collected at the point in the second year to the sample library of the point; if the difference exceeds the set threshold, performing a field check, collecting an orthographic image with a resolution better than 10 cm at the point with the unmanned aerial vehicle, and analyzing whether the species of aquatic vegetation at the point has changed; if the species has changed, adding a new aquatic vegetation marking point and labelling the corresponding newly added aquatic vegetation species; if no change has occurred, adding the sample collected at the point in the second year to the sample library of the point;
and step fourteen, repeating steps nine to thirteen every year to obtain the statistical result of the annual change of the aquatic vegetation in the survey area.
Compared with the prior art, the invention has the following beneficial effects: based on unmanned aerial vehicle images with a resolution better than 10 cm, the aquatic vegetation is first roughly extracted and then accurately confirmed by manual field checks, the growth center point of each aquatic vegetation patch is obtained, and the position and type of the aquatic vegetation are marked. Based on multispectral satellite images with a resolution better than 0.5 m, the growth center points are superimposed on the image, and segmentation and boundary extraction are performed around each center point according to the spectral differences of the satellite image to obtain the growth range of the aquatic vegetation and a satellite image sample at each point. Newly added aquatic vegetation is handled by comparison with the second-year satellite image, and segmentation and boundary extraction centered on the growth points are performed on the current-year image to obtain the second-year growth range. Statistics are then carried out in the same way every year. Because the growth center points and types are determined in the first year from unmanned aerial vehicle images with a resolution better than 10 cm, the precision and efficiency are high; the main work in each subsequent year is the analysis of multispectral satellite images with a resolution better than 0.5 m, which is timely, efficient and low in labour cost. With this method the annual change of aquatic vegetation is counted accurately, the change of the water environment of the target area can be fed back effectively and intuitively at the macroscopic level, and the areas where the aquatic vegetation type has changed can be given focused attention, which is of important guiding significance for water ecological management. In addition, as historical data accumulate year by year, each aquatic vegetation sample library is gradually enriched, and once the sample size reaches a certain scale, automatic and intelligent aquatic vegetation monitoring can be realized by machine learning.
Drawings
FIG. 1 is a schematic diagram showing the structure of steps in an embodiment of the present invention;
FIG. 2 is a schematic diagram of an unmanned aerial vehicle image obtained in Step4 according to an embodiment of the present invention;
FIG. 3 is a schematic diagram of stitching unmanned aerial vehicle images in Step5 according to an embodiment of the present invention;
FIG. 4 is a flow chart of Step7, determining the aquatic vegetation types and their growth center points, according to an embodiment of the present invention;
FIG. 5 is a photograph taken by the field check in Step7.2 in accordance with an embodiment of the present invention;
FIG. 6 is a schematic diagram of the cross method calculation in Step7.5 according to the embodiment of the present invention;
FIG. 7 is a diagram showing the sub-steps of Step11 according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the accompanying drawings. It is apparent that the described embodiments are only some, not all, of the embodiments of the present invention; all other embodiments obtained by those skilled in the art without creative effort on the basis of these embodiments fall within the protection scope of the present invention.
Examples: a method for carrying out annual change statistics of the aquatic vegetation of the lakeshore and nearshore lake areas using an unmanned aerial vehicle and multispectral satellite images comprises the following steps; the detailed flow is shown in FIG. 1.
Step1: determine the survey region. The survey region is generally band-shaped, the main areas being the lakeshore and the nearshore lake area.
Step2: acquire the multispectral satellite image. The resolution is required to be better than 0.5 m, and the image must contain at least four bands: red, green, blue and near infrared.
Step3: geometric correction and radiometric correction. The acquired multispectral satellite image is preprocessed so that it carries accurate geographic coordinates and spectral information.
Step4: acquire the unmanned aerial vehicle images. In a time period adjacent to the satellite image acquisition time, an unmanned aerial vehicle carrying a gimbal and camera flies over the lakeshore and nearshore lake areas along a set route while taking photos vertically downward at a set time interval (or distance). The photo resolution must be better than 10 cm and the overlap between adjacent photos must be greater than 60%, as shown in FIG. 2; the ratio of the length A of the overlapping region of adjacent photos to the overall along-track length B of the photo frame, A/B, should be greater than 60%.
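In planning the shooting interval, the forward overlap can be related to the camera geometry: A/B = 1 - d/L, where d is the ground distance between consecutive exposures and L is the along-track ground footprint of a single photo. The following sketch is purely illustrative (a nadir-pointing camera over a flat water surface is assumed, and the numeric values are examples, not taken from the patent):

```python
def ground_footprint(flight_height_m, sensor_size_mm, focal_length_mm):
    """Along-track ground footprint length L of one nadir photo (simple pinhole model)."""
    return flight_height_m * sensor_size_mm / focal_length_mm


def max_shot_spacing(flight_height_m, sensor_size_mm, focal_length_mm, min_overlap=0.6):
    """Largest spacing d between exposures that keeps forward overlap A/B = 1 - d/L >= min_overlap."""
    footprint = ground_footprint(flight_height_m, sensor_size_mm, focal_length_mm)
    return (1.0 - min_overlap) * footprint


# Illustrative values only: 100 m flight height, 8.8 mm sensor height, 8.8 mm focal length.
d = max_shot_spacing(100.0, 8.8, 8.8)
print(f"Trigger a photo at most every {d:.1f} m to keep more than 60% forward overlap")
```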
Step5: stitch the unmanned aerial vehicle images to obtain an orthographic image of the survey region with a resolution better than 10 cm; an example of the processed result is shown in FIG. 3.
Step6: register the unmanned aerial vehicle image to the multispectral satellite image. Obvious common feature points are selected on the unmanned aerial vehicle image and the satellite image, and geometric correction is performed with a quadratic polynomial so that the unmanned aerial vehicle image and the multispectral satellite image essentially coincide.
Step7: determine the aquatic vegetation species and their growth center points. This comprises the following sub-steps; the flow is shown in FIG. 4.
Step7.1: judge the aquatic vegetation types from the spectral appearance of the unmanned aerial vehicle image. Because the resolution of the unmanned aerial vehicle image is better than 10 cm, the aquatic vegetation types can be preliminarily identified by manual interpretation.
Step7.2: field check. The preliminary interpretation results need to be further checked in the field, especially for targets whose in-office interpretation is less certain. Photographs taken during the field check are shown in FIG. 5.
Step7.3: compile the field check results in the office, and mark the accurate aquatic vegetation types and their approximate growth positions.
Step7.4: and performing segmentation and boundary extraction based on the spectral information of the unmanned aerial vehicle image.
Step7.5: based on the boundary information, a new aquatic vegetation growth center is calculated. In order to ensure that the center point is positioned in the growth area, the coordinates of the center point are calculated by a cross method, and for a certain growth point, the method specifically comprises the following steps:
(1) Traverse the boundary coordinates to obtain the maximum value N_max and the minimum value N_min of the north coordinate of the region.
(2) Obtain the coordinates of the intersection points of the region boundary with the horizontal mid-line N = N_min + (N_max - N_min)/2, denoted P1(N, E1), P2(N, E2), P3(N, E3), P4(N, E4), and so on; the intersection points occur in pairs, there is at least one pair, and there may be several pairs.
(3) Calculate d1 = E2 - E1, d2 = E4 - E3, d3 = E6 - E5, ..., compare d1, d2, d3, ..., and take the pair of coordinates corresponding to the maximum value (denoted d_max); if there is only one pair, it is selected directly without comparison. The selected pair of coordinates is denoted P_s1(N, E_s1), P_s2(N, E_s2).
(4) The coordinates of the new aquatic vegetation growth center point are then P_new(N, E_s1 + d_max/2); this cross-method calculation is illustrated in FIG. 6.
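The cross method of Step7.5 can be sketched as follows, assuming the boundary extracted in Step7.4 is available as a list of (E, N) vertices; the helper names are illustrative:

```python
import numpy as np

def cross_method_center(boundary_en):
    """Interior 'growth center' of a vegetation patch by the cross method.

    boundary_en: (k, 2) array of boundary vertices as (E, N) = (easting, northing),
    treated as a closed ring. Returns (E_center, N_center).
    """
    pts = np.asarray(boundary_en, dtype=float)
    n = pts[:, 1]
    n_mid = n.min() + (n.max() - n.min()) / 2.0          # mid-height line N = const

    # Intersect every boundary segment with the horizontal line N = n_mid.
    crossings = []
    for (e1, n1), (e2, n2) in zip(pts, np.roll(pts, -1, axis=0)):
        if (n1 - n_mid) * (n2 - n_mid) < 0:               # segment crosses the line
            t = (n_mid - n1) / (n2 - n1)
            crossings.append(e1 + t * (e2 - e1))
    crossings.sort()
    if len(crossings) < 2:
        raise ValueError("boundary does not cross its own mid-line twice")

    # Pair consecutive crossings (P1,P2), (P3,P4), ... and keep the widest pair.
    widths = [(crossings[i + 1] - crossings[i], crossings[i])
              for i in range(0, len(crossings) - 1, 2)]
    d_max, e_start = max(widths)
    return e_start + d_max / 2.0, n_mid                   # P_new(N, E_s1 + d_max/2)
```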
step8: and acquiring a first year satellite image sample and a growth range corresponding to each growth center point. And superposing the growth center point and the range thereof obtained by Step7 on the satellite image, and collecting satellite image samples of each center point, wherein the range of each growth center point obtained by Step7 is taken as the standard of the sample range, and the content of the samples is data of four wave bands in the range.
Step9: acquire multispectral satellite images of the same specification in the same period of the second year for the survey region, and preprocess them according to the Step3 method. After preprocessing, register the second-year satellite images to the first-year satellite images with the Step6 method so that the two sets of images essentially coincide.
Step10: acquire the second-year satellite image sample and growth range corresponding to each growth center point.
The method comprises the following sub-steps:
Step10.1: superimpose the first-year growth center points and their ranges.
Step10.2: for each growth point, collect a sample within the first-year growth range.
Step10.3: obtain the second-year satellite image sample corresponding to each growth center point.
Step11: compare whether the difference between the satellite image samples of the same growth point in the two periods of images exceeds the set threshold. This step comprises the following sub-steps; the flow is shown in FIG. 7.
Step11.1: match the first-year and second-year satellite image samples one to one.
Step11.2: segment each second-year sample according to its spectral and texture features, obtaining a number of small blocks.
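The patent does not name a specific algorithm for this segmentation; one common stand-in is superpixel segmentation of the clipped sample, for example with scikit-image (an assumption for illustration, not the patent's prescribed method):

```python
import numpy as np
from skimage.segmentation import slic

def segment_sample(sample, n_segments=50):
    """Split a clipped 4-band sample of shape (bands, rows, cols) into spectrally
    homogeneous 'small blocks'. Returns a (rows, cols) label image; each label
    is one block. SLIC superpixels stand in for the unspecified segmentation."""
    img = np.moveaxis(sample.astype(float), 0, -1)   # -> (rows, cols, bands)
    img = img / max(img.max(), 1.0)                  # scale to [0, 1] for SLIC
    return slic(img, n_segments=n_segments, compactness=0.1, start_label=1)
```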
Step11.3: taking the first-year satellite image sample and its adjacent aquatic vegetation as reference, perform similarity detection on each small block of the segmented satellite image within the first-year sample range; the main parameters considered are the red, green, blue and near-infrared spectral statistics and the NDVI. At the same time, the similarity factor X of each small block is obtained (X ranges from 0 to 1, where 0 denotes complete dissimilarity and 1 complete coincidence).
The calculation of X is given in the original patent as formula images (Figure GDA0002357596250000071 to GDA0002357596250000076), which are not reproduced here; X is computed from the red, green, blue and near-infrared statistics and the NDVI of the block and its reference.
step11.4: classifying the small blocks according to the calculated similarity factors of the small blocks and the surrounding adjacent aquatic vegetation (classifying the small blocks and the aquatic vegetation with the largest similarity factor), and calculating the difference value D=1-X.
Step11.5: calculate the difference threshold and compare the difference value with it. The difference threshold is calculated statistically, in the following steps:
(1) Classify all aquatic vegetation samples by type, grouping samples of the same type together, and calculate the spectral characteristic value of each aquatic vegetation type; the formulas used are given in the original patent as formula images (Figure GDA0002357596250000077 to GDA0002357596250000082) and are not reproduced here.
(2) Calculate, between different aquatic vegetation types, the differences of X_r, X_g, X_b, X_nir and X_NDVI, denoted ΔX_r, ΔX_g, ΔX_b, ΔX_nir and ΔX_NDVI respectively.
(3) The difference threshold T is computed from these differences; the formula is given in the original patent as a formula image (Figure GDA0002357596250000083) and is not reproduced here. The threshold T is then compared with the difference value D.
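The class spectral characteristic values and the threshold T can likewise only be sketched under stated assumptions: here each type's signature is taken as the mean red, green, blue, near-infrared and NDVI values over its samples, and T is assumed to be half of the smallest gap between any two type signatures; the patent's exact formula, given only as an image, may differ:

```python
import numpy as np
from itertools import combinations

def class_signature(samples):
    """Spectral characteristic value of one vegetation type: mean R, G, B, NIR and NDVI
    over all of its samples (each sample is a dict of pixel arrays {'r','g','b','nir'})."""
    feats = []
    for s in samples:
        r, nir = s['r'].astype(float), s['nir'].astype(float)
        nd = (nir - r) / np.maximum(nir + r, 1e-6)
        feats.append([s['r'].mean(), s['g'].mean(), s['b'].mean(),
                      s['nir'].mean(), nd.mean()])
    return np.mean(feats, axis=0)

def difference_threshold(samples_by_type):
    """Assumed form of T: half of the smallest mean absolute gap between the
    signatures of any two vegetation types."""
    sigs = {name: class_signature(s) for name, s in samples_by_type.items()}
    gaps = [np.abs(sigs[a] - sigs[b]).mean() for a, b in combinations(sigs, 2)]
    return 0.5 * min(gaps)
```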
Step12: if the difference value D exceeds the threshold T, acquire unmanned aerial vehicle images and on-site photos of the point and its surroundings.
Step13: according to the collected photos, check whether the aquatic vegetation at the point has changed over the past year.
Step14: if a change has occurred (including a new appearance or a disappearance), add or delete the corresponding vegetation information among the aquatic vegetation center points for the current year.
Step15: if the difference D between the two periods of image samples does not exceed the threshold T, or D exceeds T but inspection of the unmanned aerial vehicle image and field photos shows that the vegetation type at the point has not changed, add the second-year image sample to the sample library of the vegetation at that point.
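Taken together, Step12 to Step15 form a per-point update rule, which can be written out as plain control flow; in the sketch below all object names and the field_check helper are placeholders, not taken from the patent:

```python
def update_point(point_id, species, sample_year2, D, T, sample_library, new_points, field_check):
    """Annual update rule for one marked aquatic-vegetation point (Step12-Step15).

    sample_library: {point_id: [samples...]};  new_points: list collecting newly added points;
    field_check(point_id): stands in for the UAV / field inspection of Step12-Step13 and
    returns the species currently observed at the point.
    """
    if D <= T:                                        # Step15: no significant spectral change
        sample_library[point_id].append(sample_year2)
        return
    species_now = field_check(point_id)               # Step12-Step13: verify on site
    if species_now != species:                        # Step14: species changed, record a new point
        new_points.append((point_id, species_now))
    else:                                             # Step15: spectra drifted, species unchanged
        sample_library[point_id].append(sample_year2)
```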
Step16: following Step11 to Step15, obtain the preliminary second-year satellite image sample and growth range of each growth center point.
Step17-Step20: check the survey region for newly appeared aquatic vegetation that has not yet been annotated, in areas not included in the previous year's aquatic vegetation coverage. If any is found, verify it with the unmanned aerial vehicle and field photos, and add the vegetation information of that point.
Step21: obtain the complete set of second-year aquatic vegetation growth center points and growth ranges. These results are analyzed and serve as the baseline for the following year.
Step22: repeat the work of the first two years every year to obtain the annual change statistics of the aquatic vegetation growth points and their growth ranges in the survey region.
Although the present invention has been described with reference to the foregoing embodiments, it will be apparent to those skilled in the art that the described embodiments may be modified or elements thereof replaced by equivalents; any modification, equivalent substitution or improvement made without departing from the spirit and principles of the present invention shall be included in the protection scope of the present invention.

Claims (1)

1. A method for carrying out annual change statistics on aquatic vegetation using an unmanned aerial vehicle and multispectral satellite images, characterized in that the method comprises the following steps:
step one, collecting a multispectral satellite image of the survey area, the resolution being better than 0.5 m and the spectrum comprising at least three visible-light bands and one near-infrared band;
step two, preprocessing the multispectral satellite image so that it has correct position information and spectral information;
step three, within one week before or after the acquisition time of the multispectral satellite image, flying an unmanned aerial vehicle carrying a gimbal and camera over the lakeshore and nearshore lake areas along a set route, and taking photos with the lens pointing vertically downward at a set time interval or distance, the photo resolution being better than 10 cm and the overlap between adjacent photos being greater than 60%;
step four, processing the collected photos to obtain an orthographic image of the lakeshore and nearshore lake areas;
step five, registering the orthographic image to the multispectral satellite image preprocessed in step two, so that their common coverage areas essentially coincide;
step six, classifying the aquatic vegetation according to the multispectral image preprocessed in step two and the orthographic image registered in step five, and determining the aquatic vegetation types;
step seven, for aquatic vegetation types that cannot be determined, performing manual field checks and collecting field photos;
step eight, marking all aquatic vegetation types of the survey area on the multispectral satellite image; taking each marking point as a center, segmenting the satellite image according to spectral differences and extracting boundaries to obtain the growth range of the aquatic vegetation marked by that point in the current year, and at the same time collecting a satellite image sample for each aquatic vegetation marking point to obtain a first-year sample library corresponding to each marking point;
step nine, acquiring multispectral satellite images of the same technical specification in the same period of the second year, and preprocessing them;
step ten, registering the multispectral satellite image obtained in step nine to the first-year multispectral satellite image preprocessed in step two, so that their common coverage areas essentially coincide;
step eleven, superimposing the aquatic vegetation positions and types marked in step eight onto the second-year multispectral satellite image registered in step ten;
step twelve, on the second-year multispectral satellite image registered in step ten, taking each marking point as a center, performing segmentation and boundary extraction according to the spectral differences of the current-year satellite image to obtain the growth range of the aquatic vegetation marked by that point in the current year, and at the same time collecting a current-year satellite image sample for each aquatic vegetation marking point to obtain a second-year sample library corresponding to each marking point;
step thirteen, comparing, for each aquatic vegetation marking point, the difference between the sample obtained in step twelve and the sample obtained in step eight; if the difference is within a set threshold, adding the sample collected at the point in the second year to the sample library of the point; if the difference exceeds the set threshold, performing a field check, collecting an orthographic image with a resolution better than 10 cm at the point with the unmanned aerial vehicle, and analyzing whether the species of aquatic vegetation at the point has changed; if the species has changed, adding a new aquatic vegetation marking point and labelling the corresponding newly added aquatic vegetation species; if no change has occurred, adding the sample collected at the point in the second year to the sample library of the point; the step of determining the threshold comprises the following sub-steps:
S101, matching the first-year and second-year satellite image samples one to one;
S102, segmenting each second-year sample according to its spectral and texture features, obtaining a number of small blocks;
S103, taking the first-year satellite image sample and its adjacent aquatic vegetation as reference, performing similarity detection on each small block of the segmented satellite image within the first-year sample range, the main parameters considered being the red, green, blue and near-infrared spectral statistics and the NDVI, and at the same time obtaining the similarity factor X of each small block; the formulas for X are given in the original claims as formula images (Figure FDA0004178166410000021 to FDA0004178166410000033) and are not reproduced here;
S104, classifying each small block according to the similarity factors calculated against the surrounding adjacent aquatic vegetation, assigning each block to the aquatic vegetation with which its similarity factor is largest, and calculating the difference value D = 1 - X;
S105, calculating the difference threshold and judging the magnitude relation between the difference value and the threshold, the difference threshold being calculated statistically in the following steps:
S10501, classifying all aquatic vegetation samples by type, grouping samples of the same type together, and calculating the spectral characteristic value of each aquatic vegetation type; the formulas used are given in the original claims as formula images (Figure FDA0004178166410000034 to FDA0004178166410000038) and are not reproduced here;
S10502, calculating, between different aquatic vegetation types, the differences of X_r, X_g, X_b, X_nir and X_NDVI, denoted ΔX_r, ΔX_g, ΔX_b, ΔX_nir and ΔX_NDVI respectively;
S10503, computing the difference threshold T from these differences; the formula is given in the original claims as a formula image (Figure FDA0004178166410000039) and is not reproduced here;
comparing the magnitude relation between the threshold T and the difference value D;
S106, if the difference value D exceeds the threshold T, acquiring unmanned aerial vehicle images and on-site photos of the point and its surroundings;
S107, checking, according to the acquired photos, whether the aquatic vegetation at the point has changed over the past year;
S108, if vegetation has been newly added or has disappeared, adding or deleting the corresponding vegetation information among the aquatic vegetation center points for the current year;
S109, if the difference D between the two periods of image samples does not exceed the threshold T, or D exceeds T but inspection of the unmanned aerial vehicle image and field photos shows that the vegetation type at the point has not changed, adding the second-year image to the sample library of the vegetation at that point;
and step fourteen, repeating steps nine to thirteen every year to obtain the statistical result of the annual change of the aquatic vegetation in the survey area.
CN201911229553.3A 2019-12-04 2019-12-04 Method for carrying out aquatic vegetation annual change statistics by unmanned aerial vehicle and multispectral satellite image Active CN111047566B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911229553.3A CN111047566B (en) 2019-12-04 2019-12-04 Method for carrying out aquatic vegetation annual change statistics by unmanned aerial vehicle and multispectral satellite image

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911229553.3A CN111047566B (en) 2019-12-04 2019-12-04 Method for carrying out aquatic vegetation annual change statistics by unmanned aerial vehicle and multispectral satellite image

Publications (2)

Publication Number Publication Date
CN111047566A CN111047566A (en) 2020-04-21
CN111047566B true CN111047566B (en) 2023-07-14

Family

ID=70234642

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911229553.3A Active CN111047566B (en) 2019-12-04 2019-12-04 Method for carrying out aquatic vegetation annual change statistics by unmanned aerial vehicle and multispectral satellite image

Country Status (1)

Country Link
CN (1) CN111047566B (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11620821B2 (en) 2020-06-25 2023-04-04 GEOSAT Aerospace & Technology Apparatus and method for image-guided agriculture
US11580730B2 (en) * 2020-06-25 2023-02-14 GEOSAT Aerospace & Technology Apparatus and method for image-guided agriculture
CN112884672B (en) * 2021-03-04 2021-11-23 南京农业大学 Multi-frame unmanned aerial vehicle image relative radiation correction method based on contemporaneous satellite images
CN114460099A (en) * 2022-02-11 2022-05-10 软通智慧信息技术有限公司 Unmanned aerial vehicle-based water hyacinth monitoring method and device, unmanned aerial vehicle and medium
CN115993336B (en) * 2023-03-23 2023-06-16 山东省水利科学研究院 Method for monitoring vegetation damage on two sides of water delivery channel and early warning method
CN116484319B (en) * 2023-06-21 2023-09-01 交通运输部水运科学研究所 Ship lock upstream and downstream reservoir area multi-source data fusion method and system based on machine learning

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH11203443A (en) * 1998-01-19 1999-07-30 Hitachi Ltd Method and system for processing multispectral satellite image and hydrosphere evaluating method
CA3004388A1 (en) * 2015-11-08 2017-05-11 Agrowing Ltd A method for aerial imagery acquisition and analysis
JP2018097506A (en) * 2016-12-12 2018-06-21 株式会社日立製作所 Satellite image processing system and method
CN108445489A (en) * 2018-02-07 2018-08-24 海南云保遥感科技有限公司 The method for determining tropical agriculture loss based on satellite remote sensing and unmanned aerial vehicle remote sensing
US10127451B1 (en) * 2017-04-24 2018-11-13 Peter Cecil Vanderbilt Sinnott Method of detecting and quantifying sun-drying crops using satellite derived spectral signals
CN108846335A (en) * 2018-05-31 2018-11-20 武汉市蓝领英才科技有限公司 Wisdom building site district management and intrusion detection method, system based on video image
CN109977801A (en) * 2019-03-08 2019-07-05 中国水利水电科学研究院 A kind of quick Dynamic Extraction method and system of region water body of optical joint and radar
CN110020635A (en) * 2019-04-15 2019-07-16 中国农业科学院农业资源与农业区划研究所 Growing area crops sophisticated category method and system based on unmanned plane image and satellite image
JP2019144607A (en) * 2018-02-15 2019-08-29 西日本高速道路株式会社 Tree species estimation method using satellite image and tree species soundness determination method for tree species estimated
CN110310246A (en) * 2019-07-05 2019-10-08 广西壮族自治区基础地理信息中心 A kind of cane -growing region remote sensing information extracting method based on three-line imagery
JP2019185281A (en) * 2018-04-06 2019-10-24 株式会社日立製作所 Satellite image change extraction system, satellite image change extraction method, and front end unit in satellite image change extraction system

Family Cites Families (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9930316B2 (en) * 2013-08-16 2018-03-27 University Of New Brunswick Camera imaging systems and methods
US8693771B2 (en) * 2011-11-18 2014-04-08 Mitsubishi Electric Research Laboratories, Inc. Method for pan-sharpening panchromatic and multispectral images using dictionaries
US9390331B2 (en) * 2014-04-15 2016-07-12 Open Range Consulting System and method for assessing riparian habitats
US20160047101A1 (en) * 2014-08-17 2016-02-18 Chad James Krofta Self-Ballasted Wood Structure for Shoreline Protection and Aquatic Habitat Enhancement and Method of Manufacture
US9129355B1 (en) * 2014-10-09 2015-09-08 State Farm Mutual Automobile Insurance Company Method and system for assessing damage to infrastructure
US10445877B2 (en) * 2016-12-30 2019-10-15 International Business Machines Corporation Method and system for crop recognition and boundary delineation
CN106971146B (en) * 2017-03-03 2018-04-03 环境保护部卫星环境应用中心 Based on three water body exception remote sensing dynamic monitoring and controlling method, the device and system for looking into technology
CN106875636A (en) * 2017-04-05 2017-06-20 南京理工大学 Blue algae monitoring method for early warning and system based on unmanned plane
CN108020211B (en) * 2017-12-01 2020-07-07 云南大学 Method for estimating biomass of invasive plants through aerial photography by unmanned aerial vehicle
CN108195767B (en) * 2017-12-25 2020-07-31 中国水产科学研究院东海水产研究所 Estuary wetland foreign species monitoring method
CN109684929A (en) * 2018-11-23 2019-04-26 中国电建集团成都勘测设计研究院有限公司 Terrestrial plant ECOLOGICAL ENVIRONMENTAL MONITORING method based on multi-sources RS data fusion
CN109816674A (en) * 2018-12-27 2019-05-28 北京航天福道高技术股份有限公司 Registration figure edge extracting method based on Canny operator
CN109697475A (en) * 2019-01-17 2019-04-30 中国地质大学(北京) A kind of muskeg information analysis method, remote sensing monitoring component and monitoring method

Patent Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH11203443A (en) * 1998-01-19 1999-07-30 Hitachi Ltd Method and system for processing multispectral satellite image and hydrosphere evaluating method
CA3004388A1 (en) * 2015-11-08 2017-05-11 Agrowing Ltd A method for aerial imagery acquisition and analysis
JP2018097506A (en) * 2016-12-12 2018-06-21 株式会社日立製作所 Satellite image processing system and method
US10127451B1 (en) * 2017-04-24 2018-11-13 Peter Cecil Vanderbilt Sinnott Method of detecting and quantifying sun-drying crops using satellite derived spectral signals
CN108445489A (en) * 2018-02-07 2018-08-24 海南云保遥感科技有限公司 The method for determining tropical agriculture loss based on satellite remote sensing and unmanned aerial vehicle remote sensing
JP2019144607A (en) * 2018-02-15 2019-08-29 西日本高速道路株式会社 Tree species estimation method using satellite image and tree species soundness determination method for tree species estimated
JP2019185281A (en) * 2018-04-06 2019-10-24 株式会社日立製作所 Satellite image change extraction system, satellite image change extraction method, and front end unit in satellite image change extraction system
CN108846335A (en) * 2018-05-31 2018-11-20 武汉市蓝领英才科技有限公司 Wisdom building site district management and intrusion detection method, system based on video image
CN109977801A (en) * 2019-03-08 2019-07-05 中国水利水电科学研究院 A kind of quick Dynamic Extraction method and system of region water body of optical joint and radar
CN110020635A (en) * 2019-04-15 2019-07-16 中国农业科学院农业资源与农业区划研究所 Growing area crops sophisticated category method and system based on unmanned plane image and satellite image
CN110310246A (en) * 2019-07-05 2019-10-08 广西壮族自治区基础地理信息中心 A kind of cane -growing region remote sensing information extracting method based on three-line imagery

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
Monitoring the Invasion of Spartina alterniflora from 1993 to 2014 with Landsat TM and SPOT 6 Satellite Data in Yueqing Bay, China; Wang Anqi et al.; PLOS ONE; Vol. 10, No. 8; pp. 3-10 *
Sun Yongguang et al.; Evaluation Methods and Applications for the Eco-environmental Effects of Reclamation Development in Typical Estuaries and Bays; Ocean Press; 2014; pp. 95-96. *
Trends in Wetland Landscape Type Change in Poyang Lake and Their Response to Water Level Fluctuation; You Hailin et al.; Chinese Journal of Ecology; Vol. 35, No. 9; pp. 2487-2493 *

Also Published As

Publication number Publication date
CN111047566A (en) 2020-04-21

Similar Documents

Publication Publication Date Title
CN111047566B (en) Method for carrying out aquatic vegetation annual change statistics by unmanned aerial vehicle and multispectral satellite image
Guo et al. Crop 3D—a LiDAR based platform for 3D high-throughput crop phenotyping
CN102147250B (en) Digital line graph mapping method
GB2618896A (en) System and method for crop monitoring
CN109087312B (en) Automatic planning method and system for unmanned aerial vehicle air route
KR102450019B1 (en) Water Quality Monitoring Method and System for Using Unmanned Aerial Vehicle
CN107389036A (en) A kind of large spatial scale vegetation coverage computational methods of combination unmanned plane image
CN104132897B (en) A kind of nitrogenous measuring method of plant leaf blade based on handheld device and device
CN107449400B (en) Measuring system and measuring method for forest aboveground biomass
CN114821362B (en) Multi-source data-based rice planting area extraction method
CN112614147B (en) Crop seedling stage plant density estimation method and system based on RGB image
CN101324423A (en) Device and method for automatically measuring individual plant height
Hou et al. Extraction of remote sensing-based forest management units in tropical forests
CN103630091A (en) Leaf area measurement method based on laser and image processing techniques
CN111898494B (en) Mining disturbed land boundary identification method
CN110765977A (en) Method for extracting wheat lodging information based on multi-temporal remote sensing data of unmanned aerial vehicle
CN109919088B (en) Automatic extraction method for identifying individual plants of pitaya in karst region
Ouyang et al. Assessment of canopy size using UAV-based point cloud analysis to detect the severity and spatial distribution of canopy decline
CN102288776B (en) Corn plant growth rate measuring method
CN115294482B (en) Edible fungus yield estimation method based on unmanned aerial vehicle remote sensing image
CN112634213A (en) System and method for predicting winter wheat canopy leaf area index by unmanned aerial vehicle
CN111275631A (en) Method for eliminating shadow interference during urban water body extraction by remote sensing image
CN114778476A (en) Alfalfa cotton field soil water content monitoring model based on unmanned aerial vehicle remote sensing
CN115358991A (en) Method and system for identifying seedling leaking quantity and position of seedlings
CN116124774A (en) Method for predicting nitrogen content of canopy based on unmanned aerial vehicle spectrum multi-source data

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant