CN117522956A - Method for calculating degradation indication grass seed coverage by using unmanned aerial vehicle image - Google Patents

Method for calculating degradation indication grass seed coverage by using unmanned aerial vehicle image Download PDF

Info

Publication number
CN117522956A
CN117522956A (application CN202311546584.8A)
Authority
CN
China
Prior art keywords
picture
calculating
coverage
degradation
grass seeds
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202311546584.8A
Other languages
Chinese (zh)
Other versions
CN117522956B (en)
Inventor
李春梅
李楷
欧为友
李海洋
金云鹏
杨鹏
孔莎莎
方程梧
黄湘杰
韩耀生
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Qinghai University
Original Assignee
Qinghai University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Qinghai University filed Critical Qinghai University
Priority to CN202311546584.8A priority Critical patent/CN117522956B/en
Priority claimed from CN202311546584.8A external-priority patent/CN117522956B/en
Publication of CN117522956A publication Critical patent/CN117522956A/en
Application granted granted Critical
Publication of CN117522956B publication Critical patent/CN117522956B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G06T 7/62: Analysis of geometric attributes of area, perimeter, diameter or volume
    • G06V 10/26: Segmentation of patterns in the image field; cutting or merging of image elements to establish the pattern region, e.g. clustering-based techniques; detection of occlusion
    • G06V 10/82: Image or video recognition or understanding using pattern recognition or machine learning, using neural networks
    • G06V 20/17: Terrestrial scenes taken from planes or by drones
    • G06V 20/188: Terrestrial scenes; vegetation
    • G06V 20/70: Labelling scene content, e.g. deriving syntactic or semantic representations
    • G06T 2207/10032: Image acquisition modality; satellite or aerial image; remote sensing
    • G06T 2207/30181: Earth observation
    • G06T 2207/30188: Vegetation; agriculture

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Evolutionary Computation (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Geometry (AREA)
  • Computational Linguistics (AREA)
  • Remote Sensing (AREA)
  • Artificial Intelligence (AREA)
  • Computing Systems (AREA)
  • Databases & Information Systems (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • Image Processing (AREA)

Abstract

The invention relates to a method for calculating the coverage of degradation indication grass seeds by using unmanned aerial vehicle images: video processing software performs frame cutting and cropping to obtain picture data, graphic image annotation software marks the degradation indication grass seeds in each picture, and a semantic segmentation network and an instance segmentation network are trained, so that the coverage of the degradation indication grass seeds can be calculated accurately and the grassland degradation grade evaluated. The invention gives a concrete method for calculating the actual ground proportion (AGP) for both the case where the shooting height of the grassland images is known and the case where it is unknown: the actual ground proportion of each image is calculated so that all images are converted to the same height before the coverage is computed, which brings the result close to the actual coverage and makes the evaluation of the grassland degradation grade more scientific.

Description

Method for calculating degradation indication grass seed coverage by using unmanned aerial vehicle image
Technical Field
The invention relates to the technical field of computer image processing, in particular to a method for calculating degradation indication grass seed coverage by using unmanned aerial vehicle images.
Background
The Sanjiang source region is of high ecological and economic value to China. Its ecosystem is fragile and sensitive, and grassland degradation readily occurs under the combined interference of natural and human factors (such as global warming, over-reclamation and overgrazing), so grassland degradation currently needs to be predicted and addressed with timely measures.
Analyzing the average coverage of degradation indication grass seeds from images shot by an unmanned aerial vehicle, using artificial intelligence and computer image processing technology, is an emerging technical means for evaluating grassland degradation, and studying grassland degradation evaluation of the Sanjiang source region with deep learning technology is an important stage in making grassland evaluation intelligent. However, differences in the shooting height of the unmanned aerial vehicle images cause the coverage proportion of the degradation indication grass seeds calculated from semantic segmentation to deviate from the actual situation, which in turn introduces errors into the grassland degradation evaluation conclusion; reducing this error is the difficult problem to be solved by this application. Meanwhile, a method for calculating the degradation indication grass seed coverage from unmanned aerial vehicle images is worth exploring.
Disclosure of Invention
The invention provides a method for calculating degradation indication grass seed coverage by using unmanned aerial vehicle images, aiming at the problems mentioned in the background art, and the adopted technical scheme comprises the following steps:
s1, processing images shot by an unmanned aerial vehicle: shooting a grassland by using an unmanned aerial vehicle, and obtaining an image in a video or picture format;
if the image is in video format, using video processing software to cut frames, crop the resulting images and form a picture sequence so as to obtain picture data;
if the image is in a picture format, forming a picture sequence to obtain picture data;
s2, labeling and training the picture data to form semantic segmentation results: marking the degradation indication grass seeds in each picture of the picture data in the step S1 by using graphic image annotation software, inputting the marked pictures into a semantic segmentation network and training the network to obtain a semantic segmentation result; step S3 is carried out simultaneously;
s3, judging the shooting height of the picture and the size of the picture: judging the shooting height and the picture size of each picture in the picture data obtained in the step S1;
s31, if the sizes of the pictures are consistent and the shooting heights are known, calculating the average height and the image height, and then calculating the actual ground proportion AGP of each picture; executing the step S4;
s32, if the sizes of the pictures are inconsistent and the shooting heights are known, calculating average heights and image heights, and then calculating the actual ground proportion AGP of each picture; executing step S5;
s33, if the sizes of the pictures are inconsistent and the shooting heights are unknown, marking the degradation indication grass seeds in the pictures by using graphic image annotation software, inputting the marked pictures into an instance segmentation network, and training the network to obtain an instance segmentation result; executing step S6;
s4, calculating coverage of the degradation indication grass seeds: calculating coverage of the degradation indication grass seeds by taking a picture as a unit according to the actual ground proportion AGP and a semantic segmentation result;
s5, calculating coverage of the degradation indication grass seeds: calculating coverage of the degradation indication grass seeds by taking pixels as a unit according to the actual ground proportion AGP and the semantic segmentation result;
s6, calculating coverage of the degradation indication grass seeds: calculating coverage of the degradation indication grass seeds by taking pixels as a unit according to the actual ground proportion AGP and the semantic segmentation result;
and S7, judging the degradation level of the grassland according to the coverage.
Further, the calculation method in step S31 is as follows:
proportional conversion is carried out based on the height to obtain the average coverage of the degradation indication grass seeds in the actual ground, and then statistical data in the form of table 1 are obtained:
TABLE 1
In table 1: h is the height of the original pictures, m is the number of pictures at the corresponding height, and p is the average coverage of all the images of the degradation indication grass seeds at the corresponding height;
the calculation process of the average coverage is as follows:
the average height L is calculated using equation 1:
L = (H_1 + H_2 + H_3 + … + H_n) / n   (formula 1)
Calculating the actual ground proportion a_t of each original picture by adopting formula 2:
a_t = (H_t / L)^2   (formula 2)
In formula 2: a_t is the actual ground proportion AGP of an original picture shot at height H_t;
using equation 3, the average coverage ACG of the degradation indicating grass seeds is calculated:
in equation 3: a is the actual ground proportion AGP, m is the number of pictures at the corresponding height, and p is the average coverage of the degradation indication grass seeds at the corresponding height.
Further, the calculation method in step S32 is as follows:
counting the number of pixels of each picture after clipping to obtain statistical data in the form of table 2:
TABLE 2
In table 2: c is the number of pixels of the processed picture;
the calculation process of the average coverage is as follows:
calculating the actual ground proportion a of each original picture by adopting a formula 2 t
Using equation 4, the average coverage ACG of the degradation indicating grass seeds is calculated:
further, the calculation method in step S33 is as follows:
judging the actual ground proportion AGP of each picture according to the ratio of the average pixel numbers pavg of the degradation indication grass seeds of each picture, and calculating the average coverage according to the obtained actual ground proportion AGP to obtain the statistical data in the form of a table 3:
TABLE 3
In table 3: q is the number of splendid achnatherum in the picture, and splendid achnatherum is set as the sample of the degradation indication grass seed;
The number of splendid achnatherum plants and the number of splendid achnatherum pixels in each picture are obtained with the instance segmentation network; the ground height at which a picture was shot is estimated from the average number of pixels per splendid achnatherum in that picture, the actual ground proportion AGP of the picture is judged accordingly, and the average coverage of the degradation indication grass seeds in all original pictures is then calculated. The average number of pixels pavg adopts formula 5:
calculating the reference average pixel count PAVG of all original pictures by adopting the formula 6
Calculating the actual ground proportion AGP value of each picture by adopting the formula 7
The average coverage ACG of the degradation indicating grass seeds was calculated using equation 8:
if there is no splendid achnatherum in a picture, PAVG is used as that picture's average number of pixels per splendid achnatherum, i.e. a = 1, when calculating the final result.
Preferably, in step S1, the video processing software is a video processing package in Python.
Preferably, in steps S2 and S3, the graphic image annotation software is LabelMe; the instance segmentation network is Mask R-CNN, and the semantic segmentation network is Mask2Former.
Preferably, in step S7, the grassland degradation grade is classified as light, moderate and heavy hazard: an average coverage of the degradation indication grass seeds of less than 25% is considered light hazard, between 25% and 50% moderate hazard, and greater than 50% heavy hazard.
The implementation of the invention has the beneficial effects that: the invention provides a method for calculating the coverage of degradation indication grass seeds by using unmanned aerial vehicle images, in which video processing software performs frame cutting and cropping to obtain picture data, graphic image annotation software marks the degradation indication grass seeds in each picture, and a semantic segmentation network and an instance segmentation network are trained, so that the coverage of the degradation indication grass seeds is calculated accurately and the grassland degradation grade evaluated; the invention gives a concrete method for calculating the actual ground proportion AGP for the two situations of known and unknown shooting height of the grassland images, namely calculating the actual ground proportion of each image so that all images are converted to the same height when calculating the coverage, thereby obtaining a coverage of the toxic weeds (degradation indication grass seeds) close to the actual value and evaluating the grassland degradation grade more scientifically.
Drawings
In order to more clearly illustrate the embodiments of the invention or the technical solutions in the prior art, the drawings that are required in the embodiments or the description of the prior art will be briefly described, it being obvious that the drawings in the following description are only some embodiments of the invention, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1: calculating degradation indication grass coverage by using unmanned aerial vehicle images;
fig. 2: splendid achnatherum image shot by the unmanned aerial vehicle at a height of 2 m;
fig. 3: splendid achnatherum image shot by the unmanned aerial vehicle at a height of 4 m;
fig. 4: splendid achnatherum image shot by the unmanned aerial vehicle at a height of 6 m;
fig. 5: Mask R-CNN prediction graph at a height of 2 m;
fig. 6: Mask R-CNN prediction graph at a height of 4 m;
fig. 7: Mask R-CNN prediction graph at a height of 6 m.
Detailed Description
The following description of the embodiments of the present invention will be made clearly and completely with reference to the accompanying drawings, in which it is apparent that the embodiments described are only some embodiments of the present invention, but not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
For a better understanding of the present invention, some of the designations are explained below:
(1) Unmanned aerial vehicle image: refers to the image file stored when an unmanned aerial vehicle photographs objects on the ground; here, the grassland is photographed and the resulting image file is stored.
(2) Ground Sampling Distance (GSD): represents the ground size corresponding to a single pixel in the digital image, i.e. the ground distance between the centers of two consecutive pixels. The calculation expression is as follows:
GSD = s * H / f
wherein: s denotes the camera's pixel size, H denotes the height at the time of shooting, and f denotes the focal length of the lens at the time of shooting. Because the parameters of the camera carried by the unmanned aerial vehicle are fixed, s and f are constants; the ratio of the ground areas covered by photographed images of the same resolution is equal to the square of the ratio of the photographing heights. The pictures used are all obtained by video frame cutting, and each picture is numbered.
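As an illustration of the GSD expression and the area-ratio statement above, a minimal Python sketch follows; the pixel size and focal length used are illustrative values only, not parameters taken from the patent.

```python
# A minimal sketch of the GSD relation (GSD = s*H/f) and the area-ratio rule.
# The pixel size s and focal length f below are illustrative, not from the patent.
def gsd(pixel_size_m: float, height_m: float, focal_length_m: float) -> float:
    """Ground sampling distance: ground size covered by one pixel."""
    return pixel_size_m * height_m / focal_length_m

def area_ratio(height_a_m: float, height_b_m: float) -> float:
    """Ratio of ground areas covered by two same-resolution images,
    equal to the square of the ratio of the shooting heights."""
    return (height_a_m / height_b_m) ** 2

print(round(gsd(2.4e-6, 4.0, 8.8e-3) * 1000, 3))  # GSD at 4 m, in millimetres per pixel
print(area_ratio(6.0, 2.0))                        # a 6 m image covers 9x the ground of a 2 m image
```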
(3) Actual ground proportion (abbreviated AGP): the ratio of the actual ground areas captured at different heights by images of the same resolution.
(4) Actual coverage (abbreviated ACG): the actual coverage of the degradation indication grass seeds in the captured unmanned aerial vehicle images, obtained after height conversion.
In one embodiment of the present invention, please refer to fig. 1; a method for calculating degradation indication grass seed coverage by using unmanned aerial vehicle images adopts the following technical scheme:
s1, processing images shot by an unmanned aerial vehicle: shooting a grassland by using an unmanned aerial vehicle, and obtaining an image in a video or picture format;
if the image is in video format, using video processing software to cut frames, crop the resulting images and form a picture sequence so as to obtain picture data;
if the image is in a picture format, forming a picture sequence to obtain picture data;
s2, labeling and training the picture data to form semantic segmentation results: marking the degradation indication grass seeds in each picture of the picture data in the step S1 by using graphic image annotation software, inputting the marked pictures into a semantic segmentation network and training the network to obtain a semantic segmentation result; step S3 is carried out simultaneously;
s3, judging the shooting height of the picture and the size of the picture: judging the shooting height and the picture size of each picture in the picture data obtained in the step S1;
s31, if the sizes of the pictures are consistent and the shooting heights are known, calculating the average height and the image height, and then calculating the actual ground proportion AGP of each picture; executing the step S4;
s32, if the sizes of the pictures are inconsistent and the shooting heights are known, calculating average heights and image heights, and then calculating the actual ground proportion AGP of each picture; executing step S5;
s33, if the sizes of the pictures are inconsistent and the shooting heights are unknown, marking the degradation indication grass seeds in the pictures by using graphic image annotation software, inputting the marked pictures into an instance segmentation network, and training the network to obtain an instance segmentation result; executing step S6;
s4, calculating coverage of the degradation indication grass seeds: calculating coverage of the degradation indication grass seeds by taking a picture as a unit according to the actual ground proportion AGP and a semantic segmentation result;
s5, calculating coverage of the degradation indication grass seeds: calculating coverage of the degradation indication grass seeds by taking pixels as a unit according to the actual ground proportion AGP and the semantic segmentation result;
s6, calculating coverage of the degradation indication grass seeds: calculating coverage of the degradation indication grass seeds by taking pixels as a unit according to the actual ground proportion AGP and the semantic segmentation result;
and S7, judging the degradation level of the grassland according to the coverage.
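Before the individual calculations are elaborated below, the following minimal sketch shows one way the branching of steps S3 to S6 can be expressed; the three calculation routines are passed in as callables and are hypothetical placeholders for the S31/S32/S33 calculations detailed in the following paragraphs.

```python
# A minimal dispatch sketch for steps S3-S6. The three callables are hypothetical
# placeholders standing for the S31/S32/S33 calculations described below.
from typing import Callable, Sequence

def average_coverage(pictures: Sequence,
                     sizes_consistent: bool,
                     heights_known: bool,
                     by_picture: Callable,      # S31 -> S4: per-picture coverage weighted by AGP
                     by_pixels: Callable,       # S32 -> S5: per-pixel coverage weighted by AGP
                     from_instances: Callable   # S33 -> S6: AGP estimated via instance segmentation
                     ) -> float:
    if heights_known and sizes_consistent:
        return by_picture(pictures)
    if heights_known and not sizes_consistent:
        return by_pixels(pictures)
    return from_instances(pictures)
```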
Further, the calculation method in step S31 is as follows:
proportional conversion is carried out based on the height to obtain the average coverage of the degradation indication grass seeds in the actual ground, and then statistical data in the form of table 1 are obtained:
TABLE 1
In table 1: h is the height of the original pictures, m is the number of pictures at the corresponding height, and p is the average coverage of all the images of the degradation indication grass seeds at the corresponding height;
the calculation process of the average coverage is as follows:
the average height L is calculated using equation 1:
L = (H_1 + H_2 + H_3 + … + H_n) / n   (formula 1)
Calculating the actual ground proportion a_t of each original picture by adopting formula 2:
a_t = (H_t / L)^2   (formula 2)
In formula 2: a_t is the actual ground proportion AGP of an original picture shot at height H_t;
using equation 3, the average coverage ACG of the degradation indicating grass seeds is calculated:
in equation 3: a is the actual ground proportion AGP, m is the number of pictures at the corresponding height, and p is the average coverage of the degradation indication grass seeds at the corresponding height.
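A minimal Python sketch of this S31 calculation follows. Formulas 1 and 2 are implemented as stated; formula 3 itself is not reproduced in this text, so the final weighting uses one plausible area-weighted reading of the stated variables (a, m, p) and should be treated as an assumption rather than the patent's exact expression.

```python
# A sketch of the S31 case: consistent picture sizes, known shooting heights.
# groups: list of (H, m, p) = (height, number of pictures at that height,
# average coverage of the degradation indication grass seeds at that height).
def acg_known_heights(groups):
    n = sum(m for _, m, _ in groups)             # total number of pictures
    L = sum(H * m for H, m, _ in groups) / n     # formula 1: average height over all pictures
    num = den = 0.0
    for H, m, p in groups:
        a = (H / L) ** 2                         # formula 2: AGP of this height group
        num += a * m * p                         # assumed area-weighted numerator
        den += a * m
    return num / den                             # assumed reading of formula 3

# Illustrative numbers only (not the patent's data):
print(acg_known_heights([(2.0, 60, 0.12), (4.0, 42, 0.09), (6.0, 57, 0.08)]))
```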
Further, the calculation method in step S32 is as follows:
counting the number of pixels of each picture after clipping to obtain statistical data in the form of table 2:
TABLE 2
In table 2: c is the number of pixels of the processed picture;
the calculation process of the average coverage is as follows:
calculating the actual ground proportion a of each original picture by adopting a formula 2 t
Using equation 4, the average coverage ACG of the degradation indicating grass seeds is calculated:
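Formula 4 itself is not reproduced in this text; a minimal sketch of the S32 calculation follows, in which the weighting by AGP and per-picture pixel count c is an assumed reading of the stated variables, not the patent's exact formula.

```python
# A sketch of the S32 case: inconsistent picture sizes, known shooting heights.
# pictures: list of (H, c, p) = (shooting height, pixel count of the processed
# picture, coverage of the degradation indication grass seeds in that picture).
def acg_known_heights_mixed_sizes(pictures):
    L = sum(H for H, _, _ in pictures) / len(pictures)  # formula 1: average height
    num = den = 0.0
    for H, c, p in pictures:
        a = (H / L) ** 2                                # formula 2: AGP of this picture
        num += a * c * p                                # assumed pixel-weighted numerator
        den += a * c
    return num / den                                    # assumed reading of formula 4
```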
further, the calculation method in step S33 is as follows:
judging the actual ground proportion AGP of each picture according to the ratio of the average pixel numbers pavg of the degradation indication grass seeds of each picture, and calculating the average coverage according to the obtained actual ground proportion AGP to obtain the statistical data in the form of a table 3:
TABLE 3
In table 3: q is the number of splendid achnatherum in the picture, and splendid achnatherum is set as the sample of the degradation indication grass seed;
The number of splendid achnatherum plants and the number of splendid achnatherum pixels in each picture are obtained with the instance segmentation network; the ground height at which a picture was shot is estimated from the average number of pixels per splendid achnatherum in that picture, the actual ground proportion AGP of the picture is judged accordingly, and the average coverage of the degradation indication grass seeds in all original pictures is then calculated. The average number of pixels pavg adopts formula 5:
calculating the reference average pixel count PAVG of all original pictures by adopting the formula 6
Calculating the actual ground proportion AGP value of each picture by adopting the formula 7
The average coverage ACG of the degradation indicating grass seeds was calculated using equation 8:
if there is no splendid achnatherum in a picture, PAVG is used as that picture's average number of pixels per splendid achnatherum, i.e. a = 1, when calculating the final result.
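A minimal sketch of this S33 case follows. Because formulas 5 to 8 are not reproduced in this text, the relations used below (pavg as instance pixels per plant, AGP as a reference PAVG divided by a picture's pavg, and an AGP-weighted mean) are assumptions that are merely consistent with the variable definitions above and with the worked example reported later (predicted AGP of about 0.36 at 2 m and 2.01 at 6 m against a 4 m reference); they are not the patent's exact formulas.

```python
# A sketch of the S33 case: inconsistent picture sizes, unknown shooting heights.
# pictures: list of (q, instance_pixels, total_pixels, covered_pixels) per picture,
# where q is the number of splendid achnatherum plants found by instance
# segmentation. pavg_ref is the reference average pixel count per plant (PAVG).
def acg_unknown_heights(pictures, pavg_ref):
    num = den = 0.0
    for q, inst_px, total_px, covered_px in pictures:
        pavg = inst_px / q if q > 0 else pavg_ref  # formula 5; no plants -> a = 1
        a = pavg_ref / pavg                         # assumed AGP relation (formula 7)
        p = covered_px / total_px                   # semantic-segmentation coverage of this picture
        num += a * total_px * p                     # assumed AGP- and pixel-weighted numerator
        den += a * total_px
    return num / den                                # assumed reading of formula 8
```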
Preferably, in step S1, the video processing software is a video processing package in Python.
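For the video frame-cutting step, a minimal sketch follows using OpenCV as one possible Python video-processing package (the patent does not name a specific package); sampling every Nth frame is an assumed choice.

```python
import cv2  # OpenCV: one possible Python video-processing package

def extract_frames(video_path: str, out_dir: str, every_n: int = 30) -> int:
    """Cut frames from a UAV video, keeping every `every_n`-th frame as a picture."""
    cap = cv2.VideoCapture(video_path)
    saved = idx = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if idx % every_n == 0:
            cv2.imwrite(f"{out_dir}/frame_{saved:05d}.jpg", frame)
            saved += 1
        idx += 1
    cap.release()
    return saved
```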
Preferably, in steps S2 and S3, the graphic image annotation software is LabelMe; the instance segmentation network is Mask R-CNN, and the semantic segmentation network is Mask2Former.
Preferably, in step S7, the grassland degradation grade is classified as light, moderate and heavy hazard: an average coverage of the degradation indication grass seeds of less than 25% is considered light hazard, between 25% and 50% moderate hazard, and greater than 50% heavy hazard.
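As a small illustration of this step-S7 rule, a sketch follows; how values exactly at the 25% and 50% boundaries are assigned is an assumption, since the text only gives the ranges.

```python
def degradation_grade(average_coverage: float) -> str:
    """Map the average coverage of the degradation indication grass seeds to a grade."""
    if average_coverage < 0.25:
        return "light hazard"
    if average_coverage <= 0.50:
        return "moderate hazard"
    return "heavy hazard"
```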
Related experiments: using the above-described method, the inventors conducted the following work. For convenience of the experiment, splendid achnatherum, which is common on the grassland, was selected as the sample of the degradation indicating grass species, and its coverage was calculated.
The images were photographed on a grassland in a county of the Hainan Tibetan Autonomous Prefecture, at three heights of 2 m, 4 m and 6 m; 269 pictures at the 2 m height, 288 at 4 m and 302 at 6 m were obtained by video frame cutting, 859 pictures in total. 699 were used as the training set and 159 as the verification set, a ratio of roughly 4:1. Figures 2, 3 and 4 are pictures of splendid achnatherum taken by the unmanned aerial vehicle at heights of 2 m, 4 m and 6 m, respectively.
The inventors trained the splendid achnatherum data set with several currently mainstream instance segmentation networks, namely RTMDet, YOLACT, BoxInst and Mask R-CNN, and finally selected Mask R-CNN, which performed best, as the instance segmentation network; its mAP_50 on the splendid achnatherum data set reaches 77.2%, as shown in Table 4.
TABLE 4
Fig. 5, 6 and 7 are the prediction results of the Mask R-CNN instance segmentation network at the three heights; in each figure the left side is the annotated mask image and the right side is the segmentation result predicted by the network.
Calculating coverage: in Table 5, the average number of pixels per splendid achnatherum in the 2 m images is 13974 (221 plants in 60 pictures), 5170 in the 4 m images (409 plants in 42 pictures), and 2564 in the 6 m images (669 plants in 57 pictures). Taking the 4 m height as the reference height gives a predicted AGP value of 0.36 for the 2 m pictures and 2.01 for the 6 m pictures, while the theoretical AGP values for 2 m and 6 m are 0.25 and 2.25 respectively.
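The predicted AGP values above can be checked with a short calculation, assuming the predicted AGP of a height group is the reference group's average pixel count per plant divided by that group's own value; the small differences from the reported 0.36 and 2.01 presumably come from details of the patent's own computation.

```python
pavg = {2: 13974, 4: 5170, 6: 2564}                        # average pixels per plant, from Table 5
agp_predicted = {h: pavg[4] / v for h, v in pavg.items()}  # 4 m taken as the reference height
agp_theoretical = {h: (h / 4) ** 2 for h in pavg}          # square of the height ratio
print(agp_predicted)    # ~{2: 0.37, 4: 1.0, 6: 2.02}
print(agp_theoretical)  # {2: 0.25, 4: 1.0, 6: 2.25}
```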
TABLE 5
As shown in Table 6, with the data in Table 5 the actual coverage obtained by the previous method should be 8.89%; averaging the splendid achnatherum coverage of all pictures without considering the shooting heights gives an error of 1.73%, whereas recalculating the coverage after the heights are distinguished with the instance segmentation network gives 9.14%, reducing the error to 0.25%.
TABLE 6
The foregoing description of the preferred embodiments of the invention is not intended to be limiting, but rather is intended to cover all modifications, equivalents, alternatives, and improvements that fall within the spirit and scope of the invention.

Claims (7)

1. A method for calculating degradation indication grass seed coverage using unmanned aerial vehicle images, comprising the steps of:
s1, processing images shot by an unmanned aerial vehicle: shooting a grassland by using an unmanned aerial vehicle, and obtaining an image in a video or picture format;
if the image is in video format, using video processing software to cut frames, crop the resulting images and form a picture sequence so as to obtain picture data;
if the image is in a picture format, forming a picture sequence to obtain picture data;
s2, labeling and training the picture data to form semantic segmentation results: marking the degradation indication grass seeds in each picture of the picture data in the step S1 by using graphic image annotation software, inputting the marked pictures into a semantic segmentation network and training the network to obtain a semantic segmentation result; step S3 is carried out simultaneously;
s3, judging the shooting height of the picture and the size of the picture: judging the shooting height and the picture size of each picture in the picture data obtained in the step S1;
s31, if the sizes of the pictures are consistent and the shooting heights are known, calculating the average height and the image height, and then calculating the actual ground proportion AGP of each picture; executing the step S4;
s32, if the sizes of the pictures are inconsistent and the shooting heights are known, calculating average heights and image heights, and then calculating the actual ground proportion AGP of each picture; executing step S5;
s33, if the sizes of the pictures are inconsistent and the shooting heights are unknown, marking the degradation indication grass seeds in the pictures by using graphic image annotation software, inputting the marked pictures into an instance segmentation network, and training the network to obtain an instance segmentation result; executing step S6;
s4, calculating coverage of the degradation indication grass seeds: calculating coverage of the degradation indication grass seeds by taking a picture as a unit according to the actual ground proportion AGP and a semantic segmentation result;
s5, calculating coverage of the degradation indication grass seeds: calculating coverage of the degradation indication grass seeds by taking pixels as a unit according to the actual ground proportion AGP and the semantic segmentation result;
s6, calculating coverage of the degradation indication grass seeds: calculating coverage of the degradation indication grass seeds by taking pixels as a unit according to the actual ground proportion AGP and the semantic segmentation result;
and S7, judging the degradation level of the grassland according to the coverage.
2. The method of claim 1, wherein the calculating method in step S31 is as follows:
proportional conversion is carried out based on the height to obtain the average coverage of the degradation indication grass seeds in the actual ground, and then statistical data in the form of table 1 are obtained:
TABLE 1
In table 1: h is the height of the original pictures, m is the number of pictures at the corresponding height, and p is the average coverage of all the images of the degradation indication grass seeds at the corresponding height;
the calculation process of the average coverage is as follows:
the average height L is calculated using equation 1:
L = (H_1 + H_2 + H_3 + … + H_n) / n   (formula 1)
Calculating the actual ground proportion a_t of each original picture by adopting formula 2:
a_t = (H_t / L)^2   (formula 2)
In formula 2: a_t is the actual ground proportion AGP of an original picture shot at height H_t;
using equation 3, the average coverage ACG of the degradation indicating grass seeds is calculated:
in equation 3: a is the actual ground proportion AGP, m is the number of pictures at the corresponding height, and p is the average coverage of the degradation indication grass seeds at the corresponding height.
3. The method of claim 1, wherein the calculating method in step S32 is as follows:
counting the number of pixels of each picture after clipping to obtain statistical data in the form of table 2:
TABLE 2
In table 2: c is the number of pixels of the processed picture;
the calculation process of the average coverage is as follows:
calculating each original sheet by adopting a formula 2Actual ground proportion a of starting picture t
Using equation 4, the average coverage ACG of the degradation indicating grass seeds is calculated:
4. the method of claim 1, wherein the calculating method in step S33 is as follows:
judging the actual ground proportion AGP of each picture according to the ratio of the average pixel numbers pavg of the degradation indication grass seeds of each picture, and calculating the average coverage according to the obtained actual ground proportion AGP to obtain the statistical data in the form of a table 3:
TABLE 3
In table 3: q is the number of splendid achnatherum in the picture, and splendid achnatherum is set as the sample of the degradation indication grass seed;
The number of splendid achnatherum plants and the number of splendid achnatherum pixels in each picture are obtained with the instance segmentation network; the ground height at which a picture was shot is estimated from the average number of pixels per splendid achnatherum in that picture, the actual ground proportion AGP of the picture is judged accordingly, and the average coverage of the degradation indication grass seeds in all original pictures is then calculated. The average number of pixels pavg adopts formula 5:
calculating the reference average pixel count PAVG of all original pictures by adopting the formula 6
Calculating the actual ground proportion AGP value of each picture by adopting the formula 7
The average coverage ACG of the degradation indicating grass seeds was calculated using equation 8:
if there is no splendid achnatherum in a picture, PAVG is used as that picture's average number of pixels per splendid achnatherum, i.e. a = 1, when calculating the final result.
5. A method of calculating degradation indicating grass coverage using unmanned aerial vehicle imagery according to any one of claims 1 to 4, wherein in step S1 the video processing software is the video processing package in python.
6. A method of calculating degradation indicating grass coverage using unmanned aerial vehicle imagery according to any one of claims 1 to 4, wherein in steps S2 and S3 the graphical image annotation software is LabelMe; the instance segmentation network is Mask R-CNN, and the semantic segmentation network is Mask2Former.
7. A method of calculating degradation indicating grass seed coverage using unmanned aerial vehicle imagery according to any one of claims 1 to 4, wherein in step S7 the grassland degradation grade is classified as mild, moderate and severe hazard: an average coverage of the degradation indicating grass seeds of less than 25% is considered mild hazard, between 25% and 50% moderate hazard, and greater than 50% severe hazard.
CN202311546584.8A 2023-11-20 Method for calculating degradation indication grass seed coverage by using unmanned aerial vehicle image Active CN117522956B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311546584.8A CN117522956B (en) 2023-11-20 Method for calculating degradation indication grass seed coverage by using unmanned aerial vehicle image

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202311546584.8A CN117522956B (en) 2023-11-20 Method for calculating degradation indication grass seed coverage by using unmanned aerial vehicle image

Publications (2)

Publication Number Publication Date
CN117522956A true CN117522956A (en) 2024-02-06
CN117522956B CN117522956B (en) 2024-07-02


Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105069398A (en) * 2015-07-10 2015-11-18 南京信息工程大学 Grassland coverage-degree extraction method based on mobile phone camera
CN108170926A (en) * 2017-12-12 2018-06-15 伊犁师范学院 A kind of information data acquisition of river valley grassland degeneration situation and analysis method
CN108647568A (en) * 2018-03-30 2018-10-12 电子科技大学 Grassland degeneration extraction method based on full convolutional neural networks
CN110245867A (en) * 2019-06-18 2019-09-17 青海大学 A kind of grassland degeneration stage division based on bp neural network
CN110348108A (en) * 2019-07-08 2019-10-18 青海大学 A method of evaluation Grassland degradation degree
CN111476197A (en) * 2020-04-24 2020-07-31 中科天盛卫星技术服务有限公司 Oil palm identification and area extraction method and system based on multi-source satellite remote sensing image
AU2020103570A4 (en) * 2020-11-20 2021-02-04 College of Grassland and Environmental Science, Xinjiang Agricultural University Grassland soil degradation evaluation method
US11074447B1 (en) * 2018-07-13 2021-07-27 Hana Resources, Inc. Land analysis system using drone-captured data
CN114627064A (en) * 2022-03-01 2022-06-14 中国电建集团中南勘测设计研究院有限公司 Sparse vegetation regionalization extraction method based on two-dimensional image
CN115082785A (en) * 2022-05-30 2022-09-20 青海大学 Method for distinguishing odontoptosis characteristics of rabbit-shaped degraded meadow of plateau mice
CN115497052A (en) * 2022-11-03 2022-12-20 兰州理工大学 Saline soil identification monitoring method and system and electronic equipment
CN116778325A (en) * 2023-06-19 2023-09-19 湖北工业大学 Sunflower coverage acquisition method, system, device and medium

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
HUANJIONG WANG et al.: "The identification of indicator grass species of grassland degradation based on the field spectral characteristics", 2009 IEEE International Geoscience and Remote Sensing Symposium, 18 February 2010 (2010-02-18) *
Tudeng Dajie; Pubu Danzeng; Renqing Nuobu: "Analysis of grassland growth state based on semantic segmentation", Electronic Technology & Software Engineering, no. 16, 15 August 2020 (2020-08-15) *
Zhang Xipeng: "Research on classification of grass species in desertified grassland based on UAV hyperspectral remote sensing", China Master's Theses Full-text Database (Engineering Science and Technology I), 15 February 2021 (2021-02-15) *


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant