CN112215186A - Marsh wetland vegetation classification method, device, computer equipment and storage medium - Google Patents
- Publication number
- CN112215186A CN112215186A CN202011131909.2A CN202011131909A CN112215186A CN 112215186 A CN112215186 A CN 112215186A CN 202011131909 A CN202011131909 A CN 202011131909A CN 112215186 A CN112215186 A CN 112215186A
- Authority
- CN
- China
- Prior art keywords
- vegetation
- initial image
- classification
- image
- classification result
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/10—Terrestrial scenes
- G06V20/188—Vegetation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/24—Classification techniques
- G06F18/243—Classification techniques relating to the number of classes
- G06F18/24323—Tree-organised classifiers
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/20—Image preprocessing
- G06V10/26—Segmentation of patterns in the image field; Cutting or merging of image elements to establish the pattern region, e.g. clustering-based techniques; Detection of occlusion
- G06V10/267—Segmentation of patterns in the image field; Cutting or merging of image elements to establish the pattern region, e.g. clustering-based techniques; Detection of occlusion by performing operations on regions, e.g. growing, shrinking or watersheds
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02A—TECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
- Y02A90/00—Technologies having an indirect contribution to adaptation to climate change
- Y02A90/10—Information and communication technologies [ICT] supporting adaptation to climate change, e.g. for weather forecasting or climate simulation
Abstract
The invention relates to a marsh wetland vegetation classification method, a device, computer equipment and a storage medium. The method comprises: acquiring a marsh wetland image from an unmanned aerial vehicle to obtain an initial image; determining attributes of the initial image and classifying the vegetation in the initial image according to those attributes to obtain a classification result; and segmenting the initial image according to the classification result to obtain the vegetation type. The method photographs the marsh wetland with an unmanned aerial vehicle, determines the attributes of the captured image using a decision tree and the Gini index to perform a primary classification, applies a maximum flow/minimum cut graph cut to the primarily classified image, and determines the final class with an RF (random forest) classifier. This realizes vegetation classification based on unmanned aerial vehicle images; because such images have high spatial resolution, the accuracy of vegetation classification in special scenes can be improved.
Description
Technical Field
The invention relates to a vegetation classification method, in particular to a marsh wetland vegetation classification method, a marsh wetland vegetation classification device, computer equipment and a storage medium.
Background
All the characteristics of a plant community can be used as the basis for classification, such as appearance and structure, plant species composition, vegetation dynamics, habitat characteristics and the like. Vegetation classification is the basis of scientific research on vegetation. At present, satellite image-based classification methods are commonly used, but because satellites follow fixed orbits, accuracy drops for smaller wetlands. In many cases the wetland area is relatively small, so satellite-based classification is not sensitive enough and may introduce major errors. One significant problem is pixel mixing: when the pixel size is 10 m, every land cover within that footprint contributes to the pixel's total reflectance, so clean boundaries and species-level definition cannot be achieved. Several methods can reduce such errors in satellite images, but most of them require wide hyperspectral bands.
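To illustrate the pixel-mixing problem (an illustration only, not part of the patent), the reflectance a coarse pixel reports is roughly an area-weighted mix of the cover types inside it; the cover fractions and reflectance values below are hypothetical:

```python
def mixed_reflectance(fractions, reflectances):
    """Area-weighted mean reflectance of a mixed pixel (hypothetical model)."""
    assert abs(sum(fractions) - 1.0) < 1e-9
    return sum(f * r for f, r in zip(fractions, reflectances))

# A 10 m pixel that is 60% sedge (reflectance 0.32) and 40% open water (0.05)
# reports a value matching neither pure class:
mixed = mixed_reflectance([0.6, 0.4], [0.32, 0.05])
print(round(mixed, 3))  # 0.212
```

The mixed value 0.212 lies between the two pure-class signatures, which is why coarse pixels straddling a vegetation boundary defeat species-level labelling.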
Therefore, a new method needs to be designed that realizes vegetation classification based on unmanned aerial vehicle images and improves vegetation classification accuracy in special scenes.
Disclosure of Invention
The invention aims to overcome the defects of the prior art and provides a method, a device, computer equipment and a storage medium for classifying vegetation in marsh wetland.
In order to achieve the purpose, the invention adopts the following technical scheme: the marsh wetland vegetation classification method comprises the following steps:
acquiring a marsh wetland image from an unmanned aerial vehicle to obtain an initial image;
determining the attribute of the initial image, and classifying the vegetation in the initial image according to the attribute to obtain a classification result;
and segmenting the initial image according to the classification result to obtain the vegetation type.
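The three claimed steps can be sketched as a pipeline skeleton; the function names and placeholder bodies below are assumptions for illustration, not the patent's API:

```python
# Hedged sketch of the three claimed steps; the bodies are placeholders.

def acquire_initial_image(drone_frames):
    # Step 1: obtain the marsh wetland image from the UAV.
    return drone_frames[0]

def classify_by_attributes(image):
    # Step 2: determine attributes (decision tree + Gini index in the patent)
    # and produce an initial per-pixel classification; a trivial threshold
    # stands in for the real classifier here.
    return [["vegetation" if px > 0 else "water" for px in row] for row in image]

def segment_by_classification(image, labels):
    # Step 3: refine the initial labels with max-flow/min-cut segmentation
    # (passed through unchanged in this placeholder).
    return labels

frames = [[[1, 0], [1, 1]]]
image = acquire_initial_image(frames)
labels = classify_by_attributes(image)
print(segment_by_classification(image, labels))
```

Each placeholder corresponds to one claimed step, so the real decision-tree classifier and graph-cut segmenter could be dropped in without changing the flow.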
The further technical scheme is as follows: the determining the attribute of the initial image and classifying the vegetation in the initial image according to the attribute to obtain a classification result includes:
splitting the initial image into a decision tree;
determining attributes of the decision tree by using the Gini coefficient;
and classifying the vegetation in the initial image according to the attributes to obtain a classification result.
The further technical scheme is as follows: the segmenting the initial image according to the classification result to obtain vegetation categories includes:
and segmenting the initial image by adopting a maximum flow/minimum cut algorithm according to the classification result to obtain the vegetation type.
The further technical scheme is as follows: the segmenting the initial image by adopting a maximum flow/minimum cut algorithm according to the classification result to obtain the vegetation type comprises the following steps:
determining a vegetation region in the initial image according to the classification result;
segmenting the vegetation region by adopting a maximum flow/minimum cut algorithm to obtain images of different vegetation;
and determining the vegetation types according to the images of different vegetation to obtain the vegetation types.
The further technical scheme is as follows: the determining the vegetation type according to the images of different vegetation to obtain the vegetation type comprises the following steps:
the images of different vegetation are subjected to category determination using a pixel-based RF classifier to obtain vegetation categories.
The invention also provides a marsh wetland vegetation classification device, which comprises:
the image acquisition unit is used for acquiring a marsh wetland image from the unmanned aerial vehicle to obtain an initial image;
the attribute determining unit is used for determining the attribute of the initial image and classifying the vegetation in the initial image according to the attribute to obtain a classification result;
and the segmentation unit is used for segmenting the initial image according to the classification result so as to obtain the vegetation type.
The further technical scheme is as follows: the attribute determining unit includes:
a splitting subunit, configured to split the initial image into a decision tree;
a determining subunit, configured to determine an attribute of the decision tree by using the Gini coefficient;
and the classification subunit is used for classifying the vegetation in the initial image according to the attributes to obtain a classification result.
The further technical scheme is as follows: and the segmentation unit is used for segmenting the initial image by adopting a maximum flow/minimum cut algorithm according to the classification result so as to obtain the vegetation type.
The invention also provides computer equipment which comprises a memory and a processor, wherein the memory is stored with a computer program, and the processor realizes the method when executing the computer program.
The invention also provides a storage medium storing a computer program which, when executed by a processor, is operable to carry out the method as described above.
Compared with the prior art, the invention has the following beneficial effects: the marsh wetland is photographed by an unmanned aerial vehicle, the attributes of the captured image are determined using a decision tree and the Gini index to perform a primary classification, a maximum flow/minimum cut graph cut is applied to the primarily classified image, and the final class is determined with an RF classifier. This realizes vegetation classification based on unmanned aerial vehicle images; because such images have high spatial resolution, the accuracy of vegetation classification in special scenes can be improved.
The invention is further described below with reference to the accompanying drawings and specific embodiments.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings needed to be used in the description of the embodiments are briefly introduced below, and it is obvious that the drawings in the following description are some embodiments of the present invention, and it is obvious for those skilled in the art to obtain other drawings based on these drawings without creative efforts.
Fig. 1 is a schematic view of an application scenario of the marsh wetland vegetation classification method provided by the embodiment of the invention;
fig. 2 is a schematic flow chart of a method for classifying vegetation in a marsh wetland according to an embodiment of the present invention;
fig. 3 is a sub-flow diagram of the marsh wetland vegetation classification method provided by the embodiment of the invention;
fig. 4 is a sub-flow diagram of the marsh wetland vegetation classification method provided by the embodiment of the invention;
fig. 5 is a schematic block diagram of a wetland vegetation classification device provided by the embodiment of the invention;
fig. 6 is a schematic block diagram of an attribute determining unit of the wetland vegetation classification device provided by the embodiment of the invention;
fig. 7 is a schematic block diagram of a segmentation unit of the wetland vegetation classification device provided by the embodiment of the invention;
FIG. 8 is a schematic block diagram of a computer device provided by an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It will be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It is also to be understood that the terminology used in the description of the invention herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used in the specification of the present invention and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
It should be further understood that the term "and/or" as used in this specification and the appended claims refers to and includes any and all possible combinations of one or more of the associated listed items.
Referring to fig. 1 and 2, fig. 1 is a schematic view of an application scenario of the method for classifying vegetation in swamp wetland according to the embodiment of the present invention. Fig. 2 is a schematic flow chart of the wetland vegetation classification method provided by the embodiment of the invention. The method for classifying the vegetation in the marsh wetland is applied to a server. The server performs data interaction with the unmanned aerial vehicle, and after the unmanned aerial vehicle acquires the corresponding marsh wetland image, the attribute determination and image segmentation are performed on the image so as to determine the vegetation type in the image.
Fig. 2 is a schematic flow chart of the wetland vegetation classification method provided by the embodiment of the invention. As shown in fig. 2, the method includes the following steps S110 to S130.
S110, acquiring a marsh wetland image from the unmanned aerial vehicle to obtain an initial image.
In this embodiment, the initial image is an image obtained by shooting a swamp wetland with an unmanned aerial vehicle.
Using an unmanned aerial vehicle yields pictures with very high spatial resolution, and the resolution can be adjusted flexibly, which cannot be achieved with satellite images. The application of unmanned aerial vehicles has changed both the spatial resolution of wetland mapping and the flexibility of image capture. For certain applications, such as wetland mapping, a Bayesian pixel-based classifier, used together with a joint segmentation algorithm for the images, is required to provide high-accuracy segmentation for each different site, terrain, season and other atmospheric condition.
And S120, determining the attribute of the initial image, and classifying the vegetation in the initial image according to the attribute to obtain a classification result.
In this embodiment, the classification result refers to an initial classification of vegetation within the initial image.
In an embodiment, referring to fig. 3, the step S120 may include steps S121 to S123.
And S121, splitting the initial image into a decision tree.
In this embodiment, the decision tree refers to the structure formed by splitting samples of the initial image.
The initial image samples are divided into 100 random subsets (with repetition) to form the decision trees; for each tree the attributes are determined using the criterion given in step S122, and the final class of each pixel is selected by majority voting. Each tree can only split samples within its random subset, so each decision tree needs to select attributes.
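A minimal sketch of this subset-and-vote scheme, assuming bootstrap sampling with replacement (the patent does not specify the exact sampling mechanism):

```python
import random
from collections import Counter

# Sketch (assumption, not the patent's code): each of 100 trees is grown on
# a random subset drawn with replacement, and the final class of a pixel is
# the majority vote over the trees' predictions.

def bootstrap_subsets(samples, n_trees=100, seed=0):
    rng = random.Random(seed)
    return [[rng.choice(samples) for _ in samples] for _ in range(n_trees)]

def majority_vote(per_tree_predictions):
    return Counter(per_tree_predictions).most_common(1)[0][0]

subsets = bootstrap_subsets(list(range(10)), n_trees=100)
print(len(subsets))                               # 100 subsets, one per tree
print(majority_vote(["sedge", "moss", "sedge"]))  # sedge
```

Because each tree only ever sees its own bootstrap subset, the trees disagree on hard pixels, and the majority vote smooths out their individual errors.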
And S122, determining the attribute of the decision tree by adopting the Gini coefficient.
Specifically, the attribute is determined using the Gini coefficient criterion, the Gini coefficient being G = 1 - Σ_i p_i^2; based on the value of G, the attribute is determined automatically. Here p_i (i = 1 to N) is the proportion of pixels belonging to a particular class among the N classes, i.e. it is a prior probability. At least 10% of the entire ground truth should be provided for training and can also be used for testing; the samples are divided into 100 random subsets (with repetition), and the attributes are determined with this criterion for each tree.
the final category selection for each pixel uses majority voting. For decision trees, the highest (no segmentation number) is 20; diversity index of split-standard ═ kini; for discriminant analysis, the kernel is quadratic; for naive bayes, the kernel-gaussian support vector machine, the kernel-radial basis function (rbf) -0.25; for K nearest neighbors, the neighbor number is 2; distance is Euclidean; for a random forest, the number of trees (t) is 100(1000 repeat samples); the number of divisions is 5853.
And S123, classifying the vegetation in the initial image according to the attributes to obtain a classification result.
After determining the attributes of the different regions within the initial image, the initial category of vegetation corresponding to each region can be determined.
And S130, segmenting the initial image according to the classification result to obtain the vegetation type.
In the present embodiment, the vegetation type refers to the type of vegetation in the marsh. The vegetation type obtained here is more accurate than the initial classification result.
Specifically, the initial image is segmented by adopting a maximum flow/minimum cut algorithm according to the classification result to obtain the vegetation type.
After the unmanned aerial vehicle image is classified, the image is segmented using the maximum flow/minimum cut algorithm. This technique uses region information to form appropriate segments from the pixels; the segmentation may be considered a post-classification smoothing process based on spatial similarity. Applying the maximum flow/minimum cut algorithm, commonly referred to as graph cut, completes segment formation. The algorithm uses a data cost and a smoothness cost. Graph cut segmentation was performed in MATLAB v.2019b using a MATLAB wrapper MEX-file function, which lets the user call C/C++ files. The segmentation step includes calculating the data cost, the smoothness cost and the energy, using a posterior probability map based on the pixel classification. Based on the maximum probability of a pixel, a segment is formed and the pixel is added to it.
The data cost D_p is based on the individual label of a pixel and its likelihood function. For a given set of features U_N in a vectorized image with N pixels, the data cost D_p measures the cost of assigning the class n to the pixel p. In image processing, D_p may generally be written as D_p = ||U_p(n) - I(p)||^2,
where I(p) is the observed reflectance of pixel p. The smoothness cost V_{p,q}, on the other hand, is used to promote grouping. With n_p and n_q denoting the labels of pixels p and q respectively, the smoothness cost takes the contrast-sensitive form V_{p,q} = c * T * exp(-Δ(p,q)^2 / (2σ^2)),
where Δ(p,q) = I(p) - I(q) denotes how different the reflectance values of p and q are, c > 0 is a smoothness factor, the standard deviation σ > 0 controls the contribution of Δ(p,q) to the penalty, and T = 1 if n_p ≠ n_q and 0 otherwise. Minimizing the energy E can be interpreted directly as posterior maximization. Using the probability function from the previous step, the energy function is E(U_N, n) = Σ_{p ∈ N} D_p + Σ_{p,q ∈ N} V_{p,q}. Thus E(U_N, n), i.e. the energy of the image vector with N pixels over all n classes, is minimized, resulting in smooth segments. The pixels with the smallest E are connected together into segments according to the initial labels obtained from the pixel-based RF (random forest) classification, resulting in an image tile partition.
In an embodiment, referring to fig. 4, the step S130 may include steps S131 to S133.
S131, determining the vegetation area in the initial image according to the classification result.
In this embodiment, the classification result may preliminarily divide the initial image into different regions.
S132, segmenting the vegetation region by adopting a maximum flow/minimum cut algorithm to obtain images of different vegetation.
In this embodiment, the images of different vegetation refer to areas of different vegetation divided from within the initial image.
And S133, determining the vegetation type according to the images of different vegetation to obtain the vegetation type.
Specifically, a pixel-based RF classifier is employed to perform a class determination on images of different vegetation to derive a vegetation class.
The unmanned aerial vehicle image carries intensity and color information. The image is segmented with the maximum flow-based minimum cut technique using the posterior probability associated with each pixel for each category. To calculate the posterior probabilities, the unmanned aerial vehicle image first needs an initial classification; the segmentation is then a smoothing process over the classified regions, giving a smoother result than a fully manually labelled image. A total of 13 bands are used to further classify the unmanned aerial vehicle image according to texture and color intensity, and in this embodiment RF is used as the classifier for reasons of accuracy and time.
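A small sketch of how such per-pixel posteriors could be derived from random-forest votes; this is an assumption about the mechanism, not the patent's code:

```python
from collections import Counter

# Assumed mechanism: the posterior fed to the max-flow/min-cut step is the
# fraction of random-forest trees voting for each class at a given pixel.

def rf_posterior(tree_votes):
    """tree_votes: list of class labels, one per tree, for a single pixel."""
    counts = Counter(tree_votes)
    total = len(tree_votes)
    return {cls: n / total for cls, n in counts.items()}

votes = ["sphagnum"] * 70 + ["sedge"] * 30  # 100 trees, as in the text
post = rf_posterior(votes)
print(post["sphagnum"], post["sedge"])  # 0.7 0.3
```

Per-pixel posteriors of this form are what the graph cut's data cost consumes, so confident pixels anchor their segments while uncertain ones are free to follow their neighbours.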
For example: the study area was a small portion of Clara bog in Ireland. Across these eco-environments the main focus is protecting the active peat formation areas, considered the C, SC and AF eco-environments. There are areas covered by peat moss with hummocks, depressions, lawns and many pools. SM eco-regions appearing at SC eco-region boundaries can appear almost homogeneous, making them difficult to distinguish. To capture high-resolution images, a tethered drone was used. An optical camera view with 100- was used. The lens has distortion-prevention and automatic-focusing functions, equivalent to 20 mm in 35 mm format. When images are captured, the aspect ratio is kept at 4:3. The highest temperature on the day was recorded as 20 °C. The flying height was 100 m and the spatial resolution of the captured images was 1.8 cm. The drone mission was pre-loaded with Google maps in the Pix4DCapture application on an iOS 12 device. Images were captured at 70% front and 80% side overlap at an average speed of 3 m/s.
For geo-registration, the drone images carry a geo-tag, i.e. a location with longitude and latitude. For better localization, the images were superimposed on high-resolution DigitalGlobe world imagery with a spatial resolution of 30 cm, which can be used as a base map in ArcMap v.10.6.1. Using the "geo-registration" toolkit, 3-4 ground control points were determined for each image and the projections were corrected to a geographic coordinate system, World Geodetic System 84. The available images were randomly divided into 70% for training and 30% for testing. To properly assess the mapping accuracy, all images were labelled with four vegetation communities, namely M, SMSC, C and AF.
According to the marsh wetland vegetation classification method, the marsh wetland is photographed by an unmanned aerial vehicle, the attributes of the captured image are determined using a decision tree and the Gini index to perform a primary classification, a maximum flow/minimum cut graph cut is applied to the primarily classified image, and the final class is determined with an RF classifier. This realizes vegetation classification based on unmanned aerial vehicle images; because such images have high spatial resolution, the accuracy of vegetation classification in special scenes can be improved.
Fig. 5 is a schematic block diagram of a wetland vegetation classification device 300 according to an embodiment of the invention. As shown in fig. 5, the present invention also provides a marsh wetland vegetation classification device 300 corresponding to the above marsh wetland vegetation classification method. The marsh wetland vegetation classification device 300 includes means for performing the above-described marsh wetland vegetation classification method, and may be configured in a server. Specifically, referring to fig. 5, the marsh wetland vegetation classification device 300 includes an image acquisition unit 301, an attribute determination unit 302 and a segmentation unit 303.
The image acquisition unit 301 is used for acquiring a swamp wetland image from the unmanned aerial vehicle to obtain an initial image; an attribute determining unit 302, configured to determine an attribute of the initial image, and classify vegetation in the initial image according to the attribute to obtain a classification result; a segmentation unit 303, configured to segment the initial image according to the classification result to obtain a vegetation category.
In one embodiment, as shown in FIG. 6, the attribute determination unit 302 includes a splitting subunit 3021, a determining subunit 3022, and a classifying subunit 3023.
A splitting subunit 3021, configured to split the initial image into a decision tree; a determining subunit 3022, configured to determine an attribute of the decision tree using a kini coefficient; and the classification subunit 3023 is configured to classify the vegetation in the initial image according to the attributes to obtain a classification result.
In an embodiment, the segmentation unit 303 is configured to segment the initial image according to the classification result by using a max flow/min cut algorithm to obtain a vegetation class.
In an embodiment, as shown in fig. 7, the segmentation unit 303 includes a region determination subunit 3031, a region segmentation subunit 3032, and a category determination subunit 3033.
A region determining subunit 3031, configured to determine a region of vegetation in the initial image according to the classification result; a region segmentation subunit 3032, configured to segment a region of vegetation by using a maximum flow/minimum cut algorithm to obtain images of different vegetation; and a category determining subunit 3033, configured to determine categories of vegetation according to the images of different vegetation to obtain vegetation categories.
In an embodiment, the category determination subunit 3033 is configured to perform category determination on images of different vegetation using a pixel-based RF classifier to obtain vegetation categories.
It should be noted that, as can be clearly understood by those skilled in the art, the specific implementation processes of the wetland vegetation classification device 300 and each unit may refer to the corresponding descriptions in the foregoing method embodiments, and for convenience and brevity of description, no further description is provided herein.
The above-described wetland vegetation classification apparatus 300 may be implemented in the form of a computer program that can be run on a computer device as shown in fig. 8.
Referring to fig. 8, fig. 8 is a schematic block diagram of a computer device according to an embodiment of the present application. The computer device 500 may be a server, wherein the server may be an independent server or a server cluster composed of a plurality of servers.
Referring to fig. 8, the computer device 500 includes a processor 502, memory, and a network interface 505 connected by a system bus 501, where the memory may include a non-volatile storage medium 503 and an internal memory 504.
The non-volatile storage medium 503 may store an operating system 5031 and a computer program 5032. The computer programs 5032 comprise program instructions that, when executed, cause the processor 502 to perform a method of marshland vegetation classification.
The processor 502 is used to provide computing and control capabilities to support the operation of the overall computer device 500.
The internal memory 504 provides an environment for the operation of the computer program 5032 in the non-volatile storage medium 503, and when the computer program 5032 is executed by the processor 502, the processor 502 can execute a marsh wetland vegetation classification method.
The network interface 505 is used for network communication with other devices. Those skilled in the art will appreciate that the configuration shown in fig. 8 is a block diagram of only a portion of the configuration relevant to the present teachings and does not constitute a limitation on the computer device 500 to which the present teachings may be applied, and that a particular computer device 500 may include more or less components than those shown, or combine certain components, or have a different arrangement of components.
Wherein the processor 502 is configured to run the computer program 5032 stored in the memory to implement the following steps:
acquiring a marsh wetland image from an unmanned aerial vehicle to obtain an initial image; determining the attribute of the initial image, and classifying the vegetation in the initial image according to the attribute to obtain a classification result; and segmenting the initial image according to the classification result to obtain the vegetation type.
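The three claimed steps (acquire, classify, segment) can be illustrated with a minimal sketch; the array shapes and the `classifier`/`segmenter` callables are hypothetical placeholders, not the patented implementation:

```python
import numpy as np

def classify_wetland(image, classifier, segmenter):
    """Hypothetical pipeline for the claimed steps.

    image:      H x W x B reflectance array from the UAV survey (the 'initial image')
    classifier: maps per-pixel feature rows (N x B) to class labels (N,)
    segmenter:  refines the coarse label map into final vegetation types
    """
    h, w, b = image.shape
    pixels = image.reshape(-1, b)              # flatten to per-pixel features
    labels = classifier(pixels).reshape(h, w)  # coarse classification result
    return segmenter(image, labels)            # segmentation yields vegetation types
```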
In an embodiment, when the processor 502 implements the steps of determining the attribute of the initial image and classifying vegetation in the initial image according to the attribute to obtain a classification result, the following steps are specifically implemented:
splitting the initial image into a decision tree; determining attributes of the decision tree by using a Gini coefficient; and classifying the vegetation in the initial image according to the attributes to obtain a classification result.
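The Gini coefficient (impurity) used to select decision-tree split attributes can be computed as follows; this is a generic sketch of the standard formula, not code taken from the patent:

```python
import numpy as np

def gini_impurity(labels):
    """Gini impurity 1 - sum_k p_k^2 of a label set; 0 means the set is pure."""
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return 1.0 - np.sum(p ** 2)

def split_gini(left_labels, right_labels):
    """Weighted impurity of a candidate split; the tree selects the
    attribute/threshold that minimises this value."""
    n = len(left_labels) + len(right_labels)
    return (len(left_labels) / n) * gini_impurity(left_labels) \
         + (len(right_labels) / n) * gini_impurity(right_labels)
```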
In an embodiment, when the step of segmenting the initial image according to the classification result to obtain the vegetation type is implemented by the processor 502, the following steps are specifically implemented:
and segmenting the initial image by adopting a maximum flow/minimum cut algorithm according to the classification result to obtain the vegetation type.
In an embodiment, when the step of segmenting the initial image by using the maximum flow/minimum cut algorithm according to the classification result to obtain the vegetation type is implemented by the processor 502, the following steps are specifically implemented:
determining a vegetation region in the initial image according to the classification result; segmenting the vegetation region by adopting a maximum flow/minimum cut algorithm to obtain images of different vegetation; and determining the vegetation types according to the images of different vegetation to obtain the vegetation types.
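The maximum flow/minimum cut step can be illustrated with a small augmenting-path (Edmonds-Karp) implementation; the toy graph and function name below are hypothetical — an actual image segmentation would build a graph whose nodes are pixels linked to source/sink terminals by data and smoothness costs:

```python
from collections import defaultdict, deque

def min_cut_labels(edges, source, sink):
    """Tiny Edmonds-Karp max-flow / min-cut.

    edges: list of (u, v, capacity) directed edges.
    Returns the set of nodes on the source side of the minimum cut
    (the 'foreground' label in a two-class segmentation).
    """
    cap = defaultdict(lambda: defaultdict(int))  # residual capacities
    for u, v, c in edges:
        cap[u][v] += c
        cap[v][u] += 0  # ensure the reverse residual edge exists

    def bfs_path():
        # shortest augmenting path in the residual graph
        parent = {source: None}
        q = deque([source])
        while q:
            u = q.popleft()
            for v, c in cap[u].items():
                if v not in parent and c > 0:
                    parent[v] = u
                    if v == sink:
                        return parent
                    q.append(v)
        return None

    while True:
        parent = bfs_path()
        if parent is None:
            break
        # bottleneck capacity along the path
        v, b = sink, float("inf")
        while parent[v] is not None:
            b = min(b, cap[parent[v]][v])
            v = parent[v]
        # push flow: shrink forward residuals, grow reverse residuals
        v = sink
        while parent[v] is not None:
            u = parent[v]
            cap[u][v] -= b
            cap[v][u] += b
            v = u

    # min cut: nodes still reachable from the source in the residual graph
    side, q = {source}, deque([source])
    while q:
        u = q.popleft()
        for v, c in cap[u].items():
            if v not in side and c > 0:
                side.add(v)
                q.append(v)
    return side
```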
In an embodiment, when the step of determining the vegetation types according to the images of different vegetation to obtain the vegetation types is implemented by the processor 502, the following steps are specifically implemented:
performing category determination on the images of different vegetation by using a pixel-based random forest (RF) classifier to obtain the vegetation types.
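A pixel-based RF (random forest) classification of the kind referenced here can be sketched with scikit-learn (an assumption — the patent does not name a library); the band reflectances, sample counts, and class labels below are invented for illustration:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
# synthetic "pixels": 3-band reflectances for two hypothetical vegetation classes
sedge = rng.normal(loc=[0.1, 0.5, 0.8], scale=0.02, size=(200, 3))
reed  = rng.normal(loc=[0.3, 0.2, 0.4], scale=0.02, size=(200, 3))
X = np.vstack([sedge, reed])
y = np.array([0] * 200 + [1] * 200)  # 0 = sedge, 1 = reed (hypothetical labels)

# the classifier operates per pixel, splitting on the Gini criterion by default
clf = RandomForestClassifier(n_estimators=50, random_state=0)
clf.fit(X, y)

# classify new pixels drawn from each class centre
new_pixels = np.array([[0.1, 0.5, 0.8], [0.3, 0.2, 0.4]])
pred = clf.predict(new_pixels)
```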
It should be understood that in the embodiments of the present application, the processor 502 may be a central processing unit (CPU), another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or another programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. The general-purpose processor may be a microprocessor, or any conventional processor.
It will be understood by those skilled in the art that all or part of the flows of the methods in the above embodiments may be implemented by a computer program instructing related hardware. The computer program comprises program instructions and may be stored in a storage medium, which is a computer-readable storage medium. The program instructions are executed by at least one processor in the computer system to implement the flow steps of the method embodiments described above.
Accordingly, the present invention also provides a storage medium. The storage medium may be a computer-readable storage medium. The storage medium stores a computer program, wherein the computer program, when executed by a processor, causes the processor to perform the steps of:
acquiring a marsh wetland image from an unmanned aerial vehicle to obtain an initial image; determining the attribute of the initial image, and classifying the vegetation in the initial image according to the attribute to obtain a classification result; and segmenting the initial image according to the classification result to obtain the vegetation type.
In an embodiment, when the processor executes the computer program to implement the steps of determining the attribute of the initial image and classifying vegetation in the initial image according to the attribute to obtain a classification result, the following steps are specifically implemented:
splitting the initial image into a decision tree; determining attributes of the decision tree by using a Gini coefficient; and classifying the vegetation in the initial image according to the attributes to obtain a classification result.
In an embodiment, when the processor executes the computer program to implement the step of segmenting the initial image according to the classification result to obtain the vegetation type, the processor specifically implements the following steps:
and segmenting the initial image by adopting a maximum flow/minimum cut algorithm according to the classification result to obtain the vegetation type.
In an embodiment, when the processor executes the computer program to implement the step of segmenting the initial image by using a maximum flow/minimum cut algorithm according to the classification result to obtain the vegetation type, the following steps are specifically implemented:
determining a vegetation region in the initial image according to the classification result; segmenting the vegetation region by adopting a maximum flow/minimum cut algorithm to obtain images of different vegetation; and determining the vegetation types according to the images of different vegetation to obtain the vegetation types.
In an embodiment, when the processor executes the computer program to implement the step of determining the vegetation types according to the images of different vegetation to obtain the vegetation types, the following steps are specifically implemented:
performing category determination on the images of different vegetation by using a pixel-based random forest (RF) classifier to obtain the vegetation types.
The storage medium may be any of various computer-readable storage media that can store program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a magnetic disk, or an optical disk.
Those of ordinary skill in the art will appreciate that the units and algorithm steps of the examples described in connection with the embodiments disclosed herein can be implemented in electronic hardware, computer software, or a combination of both. To clearly illustrate this interchangeability of hardware and software, the components and steps of the examples have been described above in general terms of their functions. Whether these functions are performed in hardware or software depends on the particular application and the design constraints of the technical solution. Skilled artisans may implement the described functionality in different ways for each particular application, but such implementations should not be considered to depart from the scope of the present invention.
In the embodiments provided in the present invention, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative. For example, the division of each unit is only one logic function division, and there may be another division manner in actual implementation. For example, various elements or components may be combined or may be integrated into another system, or some features may be omitted, or not implemented.
The steps in the method of the embodiment of the invention can be sequentially adjusted, combined and deleted according to actual needs. The units in the device of the embodiment of the invention can be merged, divided and deleted according to actual needs. In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit.
If the integrated unit is implemented in the form of a software functional unit and sold or used as a stand-alone product, it may be stored in a storage medium. Based on this understanding, the technical solution of the present invention, in essence, or the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product. The software product is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a terminal, or a network device) to execute all or part of the steps of the methods of the embodiments of the present invention.
While the invention has been described with reference to specific embodiments, the invention is not limited thereto; equivalent modifications and substitutions that can readily be conceived by those skilled in the art within the technical scope disclosed herein shall fall within the protection scope of the invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.
Claims (10)
1. A marsh wetland vegetation classification method, characterized by comprising the following steps:
acquiring a marsh wetland image from an unmanned aerial vehicle to obtain an initial image;
determining the attribute of the initial image, and classifying the vegetation in the initial image according to the attribute to obtain a classification result;
and segmenting the initial image according to the classification result to obtain the vegetation type.
2. The marsh wetland vegetation classification method according to claim 1, wherein the determining attributes of the initial image and classifying vegetation in the initial image according to the attributes to obtain a classification result comprises:
splitting the initial image into a decision tree;
determining attributes of the decision tree by using a Gini coefficient;
and classifying the vegetation in the initial image according to the attributes to obtain a classification result.
3. The marsh wetland vegetation classification method according to claim 1, wherein the segmenting the initial image according to the classification result to obtain the vegetation type comprises:
and segmenting the initial image by adopting a maximum flow/minimum cut algorithm according to the classification result to obtain the vegetation type.
4. The marsh wetland vegetation classification method according to claim 3, wherein the segmenting the initial image by adopting a maximum flow/minimum cut algorithm according to the classification result to obtain the vegetation type comprises:
determining a vegetation region in the initial image according to the classification result;
segmenting the vegetation region by adopting a maximum flow/minimum cut algorithm to obtain images of different vegetation;
and determining the vegetation types according to the images of different vegetation to obtain the vegetation types.
5. The marsh wetland vegetation classification method according to claim 4, wherein the determining the vegetation types according to the images of different vegetation to obtain the vegetation types comprises:
performing category determination on the images of different vegetation by using a pixel-based random forest (RF) classifier to obtain the vegetation types.
6. A wetland vegetation classification device, characterized by comprising:
the image acquisition unit is used for acquiring a marsh wetland image from the unmanned aerial vehicle to obtain an initial image;
the attribute determining unit is used for determining the attribute of the initial image and classifying the vegetation in the initial image according to the attribute to obtain a classification result;
and the segmentation unit is used for segmenting the initial image according to the classification result so as to obtain the vegetation type.
7. The wetland vegetation classification device of claim 6, wherein the attribute determination unit comprises:
a splitting subunit, configured to split the initial image into a decision tree;
a determining subunit, configured to determine an attribute of the decision tree by using a Gini coefficient;
and the classification subunit is used for classifying the vegetation in the initial image according to the attributes to obtain a classification result.
8. The wetland vegetation classification device of claim 7, wherein the segmentation unit is configured to segment the initial image according to the classification result by using a max-flow/min-cut algorithm to obtain a vegetation classification.
9. A computer device, characterized in that the computer device comprises a memory and a processor, the memory storing a computer program, and the processor, when executing the computer program, implementing the method according to any one of claims 1 to 5.
10. A storage medium, characterized in that the storage medium stores a computer program which, when executed by a processor, implements the method according to any one of claims 1 to 5.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202011131909.2A CN112215186B (en) | 2020-10-21 | 2020-10-21 | Classification method, device, computer equipment and storage medium for marsh wetland vegetation |
Publications (2)
Publication Number | Publication Date |
---|---|
CN112215186A true CN112215186A (en) | 2021-01-12 |
CN112215186B CN112215186B (en) | 2024-04-05 |
Family
ID=74056226
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202011131909.2A Active CN112215186B (en) | 2020-10-21 | 2020-10-21 | Classification method, device, computer equipment and storage medium for marsh wetland vegetation |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN112215186B (en) |
Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20160334276A1 (en) * | 2015-05-12 | 2016-11-17 | BioSensing Systems, LLC | Apparatuses and methods for bio-sensing using unmanned aerial vehicles |
US20170076438A1 (en) * | 2015-08-31 | 2017-03-16 | Cape Analytics, Inc. | Systems and methods for analyzing remote sensing imagery |
CN107273868A (en) * | 2017-06-28 | 2017-10-20 | 电子科技大学 | A kind of method that the dump and coal gangue area of coal field are distinguished in remote sensing images |
CN107636678A (en) * | 2015-06-29 | 2018-01-26 | 北京市商汤科技开发有限公司 | Method and apparatus for the attribute of prognostic chart picture sample |
CN109272522A (en) * | 2018-10-19 | 2019-01-25 | 山东大学 | A kind of image thinning dividing method based on local feature |
CN109635811A (en) * | 2018-11-09 | 2019-04-16 | 中国科学院空间应用工程与技术中心 | The image analysis method of spatial plant |
CN109829425A (en) * | 2019-01-31 | 2019-05-31 | 沈阳农业大学 | A kind of small scale terrain classification method and system of Farmland Landscape |
CN110084205A (en) * | 2019-04-30 | 2019-08-02 | 合肥工业大学 | One kind being based on improved object-oriented Classification of High Resolution Satellite Images method |
US10410092B1 (en) * | 2015-12-16 | 2019-09-10 | Hrl Laboratories, Llc | Automated classification of rock types and analyst-centric visualizations—front end |
US20200074014A1 (en) * | 2018-08-28 | 2020-03-05 | Google Llc | Analysis for results of textual image queries |
- 2020-10-21: CN CN202011131909.2A, patent CN112215186B (Active)
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114612787A (en) * | 2022-03-21 | 2022-06-10 | 南京市测绘勘察研究院股份有限公司 | Urban green land deep learning extraction method supported by scale change strategy |
CN114612787B (en) * | 2022-03-21 | 2024-05-10 | 南京市测绘勘察研究院股份有限公司 | Urban green space deep learning extraction method supported by scale change strategy |
CN115965812A (en) * | 2022-12-13 | 2023-04-14 | 桂林理工大学 | Evaluation method for wetland vegetation species and ground feature classification by unmanned aerial vehicle image |
CN115965812B (en) * | 2022-12-13 | 2024-01-19 | 桂林理工大学 | Evaluation method for classification of unmanned aerial vehicle images on wetland vegetation species and land features |
Also Published As
Publication number | Publication date |
---|---|
CN112215186B (en) | 2024-04-05 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11151378B2 (en) | Systems and methods for analyzing remote sensing imagery | |
Berman et al. | Non-local image dehazing | |
CN111080526B (en) | Method, device, equipment and medium for measuring and calculating farmland area of aerial image | |
CN104915949B (en) | A kind of image matching method of combination point feature and line feature | |
CN103824267B (en) | Super-pixel life cycle management method, device and readable storage medium storing program for executing | |
Marais et al. | An optimal image transform for threshold-based cloud detection using heteroscedastic discriminant analysis | |
CN112215186B (en) | Classification method, device, computer equipment and storage medium for marsh wetland vegetation | |
CN108566513A (en) | A kind of image pickup method of unmanned plane to moving target | |
CN113160053B (en) | Pose information-based underwater video image restoration and splicing method | |
CN114694043B (en) | Ground wounded person identification method, device and medium for airborne multispectral and multispectral optimal characteristics under complex scene | |
CN116543325A (en) | Unmanned aerial vehicle image-based crop artificial intelligent automatic identification method and system | |
CN111062341B (en) | Video image area classification method, device, equipment and storage medium | |
JP2023544473A (en) | Image expansion device, control method, and program | |
JP5352435B2 (en) | Classification image creation device | |
CN112016484B (en) | Plant invasion evaluation method, plant invasion evaluation device, computer equipment and storage medium | |
Larsen | Individual tree top position estimation by template voting | |
CN114821353A (en) | Radar feature matching-based crop planting area rapid extraction method and system | |
CN113566815A (en) | Construction method and device of star map recognition navigation triangle library | |
CN112669346A (en) | Method and device for determining road surface emergency | |
CN116523884B (en) | Remote sensing image data intelligent interpretation method | |
Song et al. | Thin Cloud Removal for Single RGB Aerial Image | |
CN118097562B (en) | Remote monitoring method for seaweed proliferation condition | |
CN118446938B (en) | Shadow area restoration method and device for remote sensing image and electronic equipment | |
CN118196214B (en) | Outdoor camera distribution control method and equipment based on three-dimensional scene simulation | |
CN115115542B (en) | Quick restoration method for color difference strip after cloud platform remote sensing image mosaic |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||