CN104851113B - Urban vegetation extraction method for multi-spatial-resolution remote sensing images - Google Patents


Info

Publication number
CN104851113B
CN104851113B (application CN201510186167.6A)
Authority
CN
China
Prior art keywords
vegetation
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN201510186167.6A
Other languages
Chinese (zh)
Other versions
CN104851113A (en)
Inventor
佃袁勇 (Dian Yuanyong)
姚崇怀 (Yao Chonghuai)
徐永荣 (Xu Yongrong)
周志翔 (Zhou Zhixiang)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huazhong Agricultural University
Original Assignee
Huazhong Agricultural University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huazhong Agricultural University filed Critical Huazhong Agricultural University
Priority to CN201510186167.6A priority Critical patent/CN104851113B/en
Publication of CN104851113A publication Critical patent/CN104851113A/en
Application granted granted Critical
Publication of CN104851113B publication Critical patent/CN104851113B/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical


Landscapes

  • Other Investigation Or Analysis Of Materials By Electrical Means (AREA)
  • Image Processing (AREA)

Abstract

The invention discloses an urban vegetation extraction method for multi-spatial-resolution remote sensing images, comprising the steps of extracting initial vegetation patches from the high-spatial-resolution multispectral image, computing visual perception parameters from the high-spatial-resolution multispectral image, computing texture feature parameters from the high-spatial-resolution panchromatic-band image, and automatically growing the vegetation areas by combining these features to obtain the vegetation area distribution map. Block-wise image processing is adopted, which accelerates processing. Based on the NDVI feature, an adaptive dynamic threshold is selected automatically by the expectation-maximization algorithm. The panchromatic and multispectral bands of the remote sensing image are fully exploited: the NDVI features, visual features, and texture features extracted from the panchromatic and multispectral bands are combined to judge vegetation areas, improving accuracy.

Description

Urban vegetation extraction method for multi-spatial-resolution remote sensing images
Technical field
The present invention relates to an urban vegetation extraction method for multi-spatial-resolution remote sensing images, and belongs to the field of remote sensing imagery.
Background technology
Urban vegetation sequesters carbon and releases oxygen, and plays an important role in maintaining the urban ecological environment. Urban vegetation cover extracted from high-resolution remote sensing images is an important indicator for assessing ecological cities and garden cities. At present, the methods for extracting urban vegetation from high-resolution remote sensing images are mainly the following:
(1) Manual visual interpretation. This method relies on interpreting vegetation areas manually and then delineating vegetation boundaries. Visual interpretation is based mainly on the features vegetation exhibits in the image, including shape, size, color and tone, shadow, position, and texture relations. It uses manual on-screen digitization, so its degree of automation is low and its subjectivity strong; it demands rich practical experience in image interpretation and a good understanding of the thematic information to be extracted. Nevertheless, because it can make full use of all kinds of information in the remote sensing image, it remains an important method of remote sensing image processing.
(2) Object-oriented supervised classification. This method mainly exploits the color and spatial texture characteristics of vegetation areas in the image. On the basis of object-oriented image segmentation, various features of each object are computed, including spectral features (spectral mean, brightness, mean square deviation, band ratio, spectral features related to the neighborhood, spectral features related to the parent object, etc.), shape features (area, perimeter, side length, compactness, aspect ratio, and shape features related to parent and child objects), texture features (angular second moment (energy), contrast, correlation, entropy, and inverse difference moment defined from the gray-level co-occurrence matrix), and topological features. Appropriate features are finally selected to carry out object-based classification and extraction.
The above extraction methods all have defects and problems, including:
(1) When extracting vegetation areas, samples must be selected in advance to generate prior knowledge, rather than relying on the characteristics of the image itself. This process involves human-computer interaction and many subjective factors.
(2) The degree of automation is not high. The segmentation scale, the thresholds, and the features used for vegetation classification all require interactive manual selection, and these intermediate steps severely limit the automation of vegetation extraction. Moreover, during segmentation a single threshold is applied to the whole image, which does not fully account for the feature differences that different vegetation types (e.g., grassland, coniferous forest, broad-leaved forest) exhibit in the image.
(3) Only the multispectral data of the high-spatial-resolution image are used, while the panchromatic band, which has an even higher spatial resolution, is not exploited; yet in practice the panchromatic and multispectral images of high-spatial-resolution sensors such as WorldView and QuickBird are distributed together as a bundle.
The content of the invention
To solve the above problems, the present invention proposes an urban vegetation extraction method for multi-spatial-resolution remote sensing images that fully automatically extracts urban vegetation areas from high-resolution remote sensing images carrying both panchromatic and multispectral bands, such as IKONOS, QuickBird, and WorldView.
The present invention provides an urban vegetation extraction method for multi-spatial-resolution remote sensing images, mainly comprising the following steps:
Step 1: extract initial vegetation patches from the high-spatial-resolution multispectral image;
Step 2: compute visual perception parameters from the high-spatial-resolution multispectral image;
Step 3: compute texture feature parameters from the high-spatial-resolution panchromatic-band image;
Step 4: automatically grow the vegetation areas by combining the features, obtaining the vegetation area distribution map.
Preferably, step 1 specifically comprises the following steps:
Step 1.1: compute the NDVI vegetation spectral index, using the formula

$$NDVI = \frac{NIR - R}{NIR + R}$$

where NIR is the pixel value of the near-infrared band and R is the pixel value of the red band;
Step 1.2: divide the NDVI image into blocks of 300*300 pixels; at the image edges, where less than 300*300 pixels remain, the actual size applies;
Step 1.3: for each 300*300-pixel NDVI block, obtain a segmentation threshold automatically to distinguish vegetation from non-vegetation, yielding the initial vegetation area;
Step 1.4: repeat step 1.3 to obtain the initial vegetation areas of all blocks; merge the vegetation areas of all blocks into the initial vegetation area of the whole image, perform connected-component labeling on the vegetation areas of the whole image, and obtain the pixel coordinates of each vegetation patch.
Preferably, the segmentation threshold in step 1.3 is obtained by the following method:
(1) Each NDVI block covers several land-cover types, denoted ω_i. Assuming the NDVI distribution of each land-cover class obeys a Gaussian density function, the overall NDVI probability distribution function p(x) in each block can be expressed as:

$$p(x) = \sum_{i=1}^{n} p(\omega_i)\, p(x \mid \omega_i)$$
where n is the total number of classes and ω_i is the i-th land-cover class, with mean m_i and variance σ_i²; its probability density function is

$$p(x \mid \omega_i) = \frac{1}{\sqrt{2\pi\sigma_i^2}}\exp\left\{-\frac{(x - m_i)^2}{2\sigma_i^2}\right\}$$
where p(ω_i) is the prior probability of each land-cover class, which satisfies the condition:

$$\sum_{i=1}^{n} p(\omega_i) = 1$$
Under these conditions, solving the vegetation/non-vegetation threshold can be converted into estimating the number of land-cover classes n and the mean m_i and variance σ_i of each class ω_i. The specific steps are as follows:
(2) Compute the frequency histogram f(x) of NDVI in the block image: search for the maximum NDVI_max and minimum NDVI_min of NDVI in the block, divide the range from NDVI_min to NDVI_max into 256 equal intervals, and count the number of pixels whose NDVI value falls in each interval, obtaining the NDVI histogram f(x);
(3) Compute the first derivative of the NDVI histogram f(x) and, at the positions where the first derivative is 0, find the maxima and minima of the NDVI frequency histogram;
(4) Count the number of maxima to determine the number of land-cover classes n. Take the NDVI value at each maximum as the initial mean m_i of each land-cover class, and compute the initial σ_i from the two adjacent minima as follows:

$$\sigma_i = (NDVI_2 - NDVI_1)/4$$

where NDVI_2 and NDVI_1 are the NDVI values at the two adjacent minima;

$$p(\omega_i) = \frac{ncount}{n\_total}$$

where ncount is the number of pixels whose NDVI value lies between the two adjacent minima, and n_total is the total number of pixels in the block;
(5) Based on the expectation-maximization (EM) algorithm, estimate p(ω_i), m_i, and σ_i by loop iteration. Each iteration consists of an expectation step and a maximization step: the former directly estimates the expected value of the probability density from the observed data according to the current values of the parameters to be estimated, and the latter updates the parameter estimates by maximizing this expectation; the two steps alternate throughout the iterative process until it converges. The update formulas are:

$$p^{t+1}(\omega_k) = \frac{1}{I \times I}\sum_{X(i,j)\in M_D} \frac{p^t(\omega_k)\, p^t(X(i,j) \mid \omega_k)}{p^t(X(i,j))}$$

$$m_k^{t+1} = \frac{\sum\limits_{X(i,j)\in M_D} \dfrac{p^t(\omega_k)\, p^t(X(i,j) \mid \omega_k)}{p^t(X(i,j))}\, X(i,j)}{\sum\limits_{X(i,j)\in M_D} \dfrac{p^t(\omega_k)\, p^t(X(i,j) \mid \omega_k)}{p^t(X(i,j))}}$$

$$(\sigma_k^2)^{t+1} = \frac{\sum\limits_{X(i,j)\in M_D} \dfrac{p^t(\omega_k)\, p^t(X(i,j) \mid \omega_k)}{p^t(X(i,j))}\left[X(i,j) - m_k^t\right]^2}{\sum\limits_{X(i,j)\in M_D} \dfrac{p^t(\omega_k)\, p^t(X(i,j) \mid \omega_k)}{p^t(X(i,j))}}$$

The three formulas above estimate the prior probability, the mean, and the standard deviation, respectively. In the formulas, k denotes the k-th land-cover class; t and t+1 denote the estimates used in the current and the next iteration; i and j denote the row and column of the NDVI image; X(i,j) denotes the NDVI value at row i, column j; M_D denotes the set of pixels of the block and I × I its total number of pixels; the conditional probability p(X(i,j)|ω_i) and the total probability p(X(i,j)) are obtained from the corresponding formulas in step (1). The iteration terminates when the change in p(ω_i), m_i, and σ_i between two successive iterations is below a given threshold ε (ε = 10⁻⁸);
(6) Determine the initial vegetation area of the block. Assume the NDVI means of the classes obtained in the previous step, sorted in descending order, are m_1, m_2, ..., m_n, with corresponding variances σ_i. The initial vegetation distribution map is then denoted VI(i, j), where 1 denotes a vegetation area and 0 a non-vegetation area:

$$VI(i,j) = \begin{cases} 1, & NDVI(i,j) > T \\ 0, & NDVI(i,j) \le T \end{cases}$$

where the threshold T is obtained by the following formula:

$$T = \begin{cases} m_1 - 1.5\sigma_1, & m_1 - m_2 > 1.5(\sigma_1 + \sigma_2) \\ m_2 + 1.5\sigma_2, & m_1 - m_2 \le 1.5(\sigma_1 + \sigma_2) \end{cases}$$
Preferably, the connected-component labeling in step 1.4 uses 4-neighborhoods or 8-neighborhoods.
Preferably, step 2 specifically comprises the following steps:
Step 2.1: transform the multispectral image from the RGB color space to the HSI color space, computing the intensity, hue, and saturation of vegetation by the following formulas:

$$H = \begin{cases} \theta, & \text{if } G \ge B \\ 2\pi - \theta, & \text{otherwise} \end{cases}, \qquad \theta = \arccos\frac{(R-G) + (R-B)}{2\sqrt{(R-G)^2 + (R-B)(G-B)}}$$

$$S = 1 - \frac{3}{R+G+B}\min(R, G, B)$$

$$I = (R+G+B)/3$$
Step 2.2: compute the visual perception feature parameters. Taking the initial vegetation patches as samples, compute the mean μ(H, S, I) and variance σ(H, S, I) of the intensity, hue, and saturation of the vegetation in each patch, and compute the normalized hue parameter

$$H'(i,j) = \begin{cases} 1 - \dfrac{|H(i,j) - \mu_k|}{\mu_k}, & \text{if } H_L < H(i,j) < H_U \\ 0, & \text{otherwise} \end{cases}$$

where H(i,j) is the hue, (i,j) denotes the pixel coordinates, μ_k is the hue mean of the k-th vegetation patch with corresponding variance σ_k, and H_L and H_U are the lower and upper limits of the vegetation hue:

$$H_L = \mu_k - 2\sigma_k, \qquad H_U = \mu_k + 2\sigma_k$$
Preferably, step 3 is specifically:
Generate the gray-level co-occurrence matrix from the panchromatic-band image and compute its six texture parameters: contrast, dissimilarity, angular second moment (energy), entropy, homogeneity, and correlation, using the following formulas:

$$CON = \sum_{i,j}(i-j)^2\, p(i,j)$$

$$DIS = \sum_{i,j} p(i,j)\,|i-j|$$

$$ASM = \sum_{i,j} p(i,j)^2$$

$$ENT = \sum_{i,j} p(i,j)\log\left(p(i,j)\right)$$

$$HOM = \sum_{i,j} \frac{p(i,j)}{1 + (i-j)^2}$$

$$COR = \sum_{i,j} p(i,j)\,\frac{(i - \mu_i)(j - \mu_j)}{\sigma_x \sigma_y}$$

The choice of the texture computation window is related to the resolution ratio of the panchromatic and multispectral images: assuming the spatial resolution of the panchromatic band is n times that of the multispectral image, the texture window is 2n-1 and the texture is computed once every n pixels.
Preferably, step 4 is specifically: taking each initial vegetation patch as a "seed", attach the pixels adjacent to the "seed" patch to the seed region by region growing.
Preferably, the region-growing criterion is determined by multi-feature voting that combines the texture features and the visual perception features.
Preferably, the region-growing criterion is as follows:

(1) For each initial vegetation patch, compute the mean μ and variance σ of each of the nine feature parameters: intensity (I), normalized hue (H), saturation (S), contrast, dissimilarity, angular second moment (energy), entropy, homogeneity, and correlation. For the 8-neighborhood pixels of the initial vegetation patch, judge for each feature whether the pixel is vegetation, with the discriminant function defined as:

$$T_2 = \mu - \sqrt{2}\,\sigma$$

$$V(i,j) = \begin{cases} 1, & \text{if } M(i,j) > T_2 \\ 0, & \text{otherwise} \end{cases}$$

where M(i,j) is the computed feature parameter, T_2 is the judgment threshold, i and j are the row and column of the pixel, 1 denotes a vegetation area, and 0 a non-vegetation area; a potential vegetation area map is thus obtained for each of the nine feature parameters;

(2) Multi-feature voting determines the final vegetation area: the nine potential vegetation area maps are voted, i.e., a pixel of the vegetation area map is regarded as final vegetation when six or more features jointly label it 1.
Compared with the prior art, the urban vegetation extraction method for multi-spatial-resolution remote sensing images provided by the present invention has the following advantages:
(1) The present invention adopts an automatic vegetation threshold selection method based on expectation maximization, improving the degree of automation in data processing. In addition, block-wise processing avoids judging the whole image with a single threshold and thus accommodates the feature differences that different vegetation types (e.g., grassland, coniferous forest, broad-leaved forest) exhibit in remote sensing data, enhancing the adaptability of the vegetation threshold.
(2) The present invention extracts the initial vegetation area with NDVI and, on this basis, combines the visual perception features of the high-resolution multispectral data with the texture features of the panchromatic band, determining the final vegetation area by multi-feature voting. The method takes the visual and texture features of high spatial resolution into account and improves the accuracy of vegetation extraction.
Brief description of the drawings
Fig. 1 is a schematic flow chart of the present invention.
Embodiment
To help those of ordinary skill in the art understand and implement the present invention, it is described in further detail below with reference to the accompanying drawing and embodiments.
This scheme fully automatically extracts urban vegetation areas from high-resolution remote sensing images carrying both panchromatic and multispectral bands, such as IKONOS, QuickBird, and WorldView. The specific steps are as follows:
Step 1: extract initial vegetation patches from the high-spatial-resolution multispectral image.
Compute the NDVI vegetation spectral index according to formula (1):

$$NDVI = \frac{NIR - R}{NIR + R} \qquad (1)$$

where NIR is the pixel value of the near-infrared band and R is the pixel value of the red band.
Divide the NDVI image into blocks of 300*300 pixels; at the image edges, where less than 300*300 pixels remain, the actual size applies.
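A minimal sketch of the NDVI computation and block partitioning above, in Python with NumPy; the float conversion and the `eps` guard against division by zero are implementation assumptions, not part of the method:

```python
import numpy as np

def compute_ndvi(nir, red, eps=1e-10):
    """NDVI = (NIR - R) / (NIR + R), formula (1)."""
    nir = nir.astype(np.float64)
    red = red.astype(np.float64)
    return (nir - red) / (nir + red + eps)  # eps guards against division by zero

def iter_blocks(image, size=300):
    """Yield (row0, col0, block) tiles of size*size; edge blocks keep their actual size."""
    rows, cols = image.shape
    for r in range(0, rows, size):
        for c in range(0, cols, size):
            yield r, c, image[r:r + size, c:c + size]
```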
For each 300*300-pixel NDVI block, obtain the segmentation threshold automatically to distinguish vegetation from non-vegetation, yielding the initial vegetation area. The threshold is obtained as follows:
Vegetation absorbs strongly in the red band and reflects strongly in the near infrared, so the NDVI index computed from the red and near-infrared bands enhances this characteristic of vegetation and better reflects the difference between vegetation and other land-cover types. The value range of NDVI is [-1, 1], and for vegetation NDVI is generally larger.
Each NDVI block covers several land-cover types (denoted ω_i). Assuming the NDVI distribution of each land-cover class obeys a Gaussian density function, the overall NDVI probability distribution function p(x) in each block can be expressed as

$$p(x) = \sum_{i=1}^{n} p(\omega_i)\, p(x \mid \omega_i) \qquad (2)$$

where n is the total number of classes and ω_i is the i-th land-cover class, with mean m_i and variance σ_i²; its probability density function is expressed by formula (3), and p(ω_i) is the prior probability of each land-cover class, which satisfies the condition of formula (4):

$$p(x \mid \omega_i) = \frac{1}{\sqrt{2\pi\sigma_i^2}}\exp\left\{-\frac{(x - m_i)^2}{2\sigma_i^2}\right\} \qquad (3)$$

$$\sum_{i=1}^{n} p(\omega_i) = 1 \qquad (4)$$
Under these conditions, solving the vegetation/non-vegetation threshold can be converted into estimating the number of land-cover classes n and the mean m_i and variance σ_i of each class ω_i. The detailed process is as follows:
Compute the frequency histogram f(x) of NDVI in the block image: search for the maximum NDVI_max and minimum NDVI_min of NDVI in the block, divide the range from NDVI_min to NDVI_max into 256 equal intervals, and count the number of pixels whose NDVI value falls in each interval, obtaining the NDVI histogram f(x).
Compute the first derivative of the NDVI histogram f(x) and, at the positions where the first derivative is 0, find the maxima and minima of the NDVI frequency histogram.
Count the number of maxima to determine the number of land-cover classes n. Take the NDVI value at each maximum as the initial mean m_i of each land-cover class, and compute the initial σ_i from the two adjacent minima by formula (5); the prior of each class follows from formula (6):

$$\sigma_i = (NDVI_2 - NDVI_1)/4 \qquad (5)$$

where NDVI_2 and NDVI_1 are the NDVI values at the two adjacent minima;

$$p(\omega_i) = \frac{ncount}{n\_total} \qquad (6)$$

where ncount is the number of pixels whose NDVI value lies between the two adjacent minima, and n_total is the total number of pixels in the block.
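A minimal sketch of this histogram-based initialization; locating the zero crossings of the first derivative by sign changes is one plausible reading of the text, and the helper name is hypothetical:

```python
import numpy as np

def init_from_histogram(ndvi_block):
    """Initial n, means, sigmas, priors from the NDVI histogram, per formulas (5)-(6)."""
    hist, edges = np.histogram(ndvi_block, bins=256)
    centers = (edges[:-1] + edges[1:]) / 2
    d = np.diff(hist)  # first derivative of f(x)
    # sign changes of the derivative: + to - is a maximum, - to + is a minimum
    maxima = [i + 1 for i in range(len(d) - 1) if d[i] > 0 and d[i + 1] <= 0]
    minima = [0] + [i + 1 for i in range(len(d) - 1) if d[i] < 0 and d[i + 1] >= 0] + [255]
    means, sigmas, priors = [], [], []
    n_total = ndvi_block.size
    for m in maxima:
        lo = max(v for v in minima if v <= m)  # adjacent minimum below the peak
        hi = min(v for v in minima if v >= m)  # adjacent minimum above the peak
        means.append(centers[m])
        sigmas.append((centers[hi] - centers[lo]) / 4)    # formula (5)
        priors.append(hist[lo:hi + 1].sum() / n_total)    # formula (6)
    return np.array(means), np.array(sigmas), np.array(priors)
```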
Based on the expectation-maximization (EM) algorithm, estimate p(ω_i), m_i, and σ_i by loop iteration. Each iteration consists of an expectation step and a maximization step: the former directly estimates the expected value of the probability density from the observed data according to the current values of the parameters to be estimated, and the latter updates the parameter estimates by maximizing this expectation; the two steps alternate throughout the iterative process until it converges. The update formulas are (7)-(9):

$$p^{t+1}(\omega_k) = \frac{1}{I \times I}\sum_{X(i,j)\in M_D} \frac{p^t(\omega_k)\, p^t(X(i,j) \mid \omega_k)}{p^t(X(i,j))} \qquad (7)$$

$$m_k^{t+1} = \frac{\sum\limits_{X(i,j)\in M_D} \dfrac{p^t(\omega_k)\, p^t(X(i,j) \mid \omega_k)}{p^t(X(i,j))}\, X(i,j)}{\sum\limits_{X(i,j)\in M_D} \dfrac{p^t(\omega_k)\, p^t(X(i,j) \mid \omega_k)}{p^t(X(i,j))}} \qquad (8)$$

$$(\sigma_k^2)^{t+1} = \frac{\sum\limits_{X(i,j)\in M_D} \dfrac{p^t(\omega_k)\, p^t(X(i,j) \mid \omega_k)}{p^t(X(i,j))}\left[X(i,j) - m_k^t\right]^2}{\sum\limits_{X(i,j)\in M_D} \dfrac{p^t(\omega_k)\, p^t(X(i,j) \mid \omega_k)}{p^t(X(i,j))}} \qquad (9)$$

These three formulas estimate the prior probability, the mean, and the standard deviation, respectively. In the formulas, k denotes the k-th land-cover class; t and t+1 denote the estimates used in the current and the next iteration; i and j denote the row and column of the NDVI image; X(i,j) denotes the NDVI value at row i, column j; M_D denotes the set of pixels of the block and I × I its total number of pixels; the conditional probability p(X(i,j)|ω_k) is computed by formula (3) and the total probability p(X(i,j)) by formula (2). The iteration terminates when the change in p(ω_i), m_i, and σ_i between two successive iterations is below the given threshold ε (ε = 10⁻⁸).
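A minimal sketch of the EM refinement implementing formulas (7)-(9) with per-pixel responsibilities; the `max_iter` cap and the numerical floors are implementation assumptions:

```python
import numpy as np

def em_refine(ndvi_block, means, sigmas, priors, eps=1e-8, max_iter=100):
    """EM updates of p(w_k), m_k, sigma_k for the NDVI Gaussian mixture."""
    x = ndvi_block.ravel().astype(np.float64)
    for _ in range(max_iter):
        # E-step: p(w_k) * p(x | w_k) / p(x) for every pixel and class, per (3) and (2)
        like = np.exp(-(x[:, None] - means) ** 2 / (2 * sigmas ** 2)) \
               / np.sqrt(2 * np.pi * sigmas ** 2)
        joint = priors * like
        resp = joint / (joint.sum(axis=1, keepdims=True) + 1e-300)  # responsibilities
        # M-step: formulas (7)-(9); note (9) uses the previous means m_k^t
        new_priors = resp.mean(axis=0)
        new_means = (resp * x[:, None]).sum(axis=0) / resp.sum(axis=0)
        new_sigmas = np.maximum(np.sqrt((resp * (x[:, None] - means) ** 2).sum(axis=0)
                                        / resp.sum(axis=0)), 1e-6)  # floor (assumption)
        delta = max(np.abs(new_priors - priors).max(),
                    np.abs(new_means - means).max(),
                    np.abs(new_sigmas - sigmas).max())
        priors, means, sigmas = new_priors, new_means, new_sigmas
        if delta < eps:  # terminate when the change falls below epsilon = 1e-8
            break
    return means, sigmas, priors
```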
Determine the initial vegetation area of the block. Assume the NDVI means of the classes obtained in the previous step, sorted in descending order, are m_1, m_2, ..., m_n, with corresponding variances σ_i. The initial vegetation distribution map is then denoted VI(i, j), where 1 denotes a vegetation area and 0 a non-vegetation area:

$$VI(i,j) = \begin{cases} 1, & NDVI(i,j) > T \\ 0, & NDVI(i,j) \le T \end{cases}$$

where the threshold T is given by

$$T = \begin{cases} m_1 - 1.5\sigma_1, & m_1 - m_2 > 1.5(\sigma_1 + \sigma_2) \\ m_2 + 1.5\sigma_2, & m_1 - m_2 \le 1.5(\sigma_1 + \sigma_2) \end{cases}$$
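The block threshold then follows directly; a sketch assuming at least two classes were found:

```python
import numpy as np

def block_threshold(means, sigmas):
    """Threshold T between vegetation and non-vegetation, per the two-case formula."""
    order = np.argsort(means)[::-1]  # sort class means in descending order
    m1, m2 = means[order[0]], means[order[1]]
    s1, s2 = sigmas[order[0]], sigmas[order[1]]
    if m1 - m2 > 1.5 * (s1 + s2):
        return m1 - 1.5 * s1
    return m2 + 1.5 * s2

# Initial vegetation map of the block: VI = 1 where NDVI > T, e.g.
# vi = (ndvi_block > block_threshold(means, sigmas)).astype(np.uint8)
```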
Repeat step 1.3 to obtain the initial vegetation areas of all blocks. Merge the vegetation areas of all blocks to obtain the initial vegetation area. Then perform connected-component labeling on the vegetation areas of the whole image and obtain the pixel coordinates of each vegetation patch. The connected-component labeling may use 4-neighborhoods or 8-neighborhoods.
Step 2: compute visual perception parameters from the high-spatial-resolution multispectral image.
2.1 Transform the multispectral image from the RGB color space to the HSI color space. From the red, green, and blue bands of the high-resolution multispectral image, compute the intensity, hue, and saturation of vegetation according to formula (10):

$$H = \begin{cases} \theta, & \text{if } G \ge B \\ 2\pi - \theta, & \text{otherwise} \end{cases}, \qquad \theta = \arccos\frac{(R-G) + (R-B)}{2\sqrt{(R-G)^2 + (R-B)(G-B)}} \qquad (10)$$

$$S = 1 - \frac{3}{R+G+B}\min(R, G, B)$$

$$I = (R+G+B)/3$$
2.2 Compute the visual perception feature parameters. Taking the initial vegetation patches as samples, compute statistical parameters such as the mean μ(H, S, I) and variance σ(H, S, I) of the intensity, hue, and saturation of the vegetation in each patch, and compute the normalized hue parameter

$$H'(i,j) = \begin{cases} 1 - \dfrac{|H(i,j) - \mu_k|}{\mu_k}, & \text{if } H_L < H(i,j) < H_U \\ 0, & \text{otherwise} \end{cases}$$

where (i, j) denotes the pixel coordinates, μ_k is the hue mean of the k-th vegetation patch with corresponding variance σ_k, and H_L and H_U are the lower and upper limits of the vegetation hue, see formula (11):

$$H_L = \mu_k - 2\sigma_k, \qquad H_U = \mu_k + 2\sigma_k \qquad (11)$$
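A minimal sketch of formulas (10) and (11), assuming float R, G, B band arrays and a boolean `patch_mask` selecting one initial vegetation patch; the `eps` guards, the `clip` call, and the assumption μ_k > 0 are implementation details:

```python
import numpy as np

def rgb_to_hsi(R, G, B, eps=1e-10):
    """RGB -> HSI per formula (10); H in [0, 2*pi)."""
    num = (R - G) + (R - B)
    den = 2 * np.sqrt((R - G) ** 2 + (R - B) * (G - B)) + eps
    theta = np.arccos(np.clip(num / den, -1, 1))
    H = np.where(G >= B, theta, 2 * np.pi - theta)
    S = 1 - 3 * np.minimum(np.minimum(R, G), B) / (R + G + B + eps)
    I = (R + G + B) / 3
    return H, S, I

def normalized_hue(H, patch_mask):
    """Normalized hue parameter H' for one vegetation patch k; assumes mu > 0."""
    mu = H[patch_mask].mean()
    sigma = H[patch_mask].std()
    lo, hi = mu - 2 * sigma, mu + 2 * sigma  # formula (11): H_L, H_U
    inside = (H > lo) & (H < hi)
    return np.where(inside, 1 - np.abs(H - mu) / mu, 0.0)
```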
Step 3: compute texture feature parameters from the high-spatial-resolution panchromatic-band image.

Generate the gray-level co-occurrence matrix from the panchromatic-band image and compute its six texture parameters: contrast (Con), dissimilarity (Dis), angular second moment (ASM), entropy (Ent), homogeneity (Hom), and correlation (Cor); see formulas (12)-(17):

$$CON = \sum_{i,j}(i-j)^2\, p(i,j) \qquad (12)$$

$$DIS = \sum_{i,j} p(i,j)\,|i-j| \qquad (13)$$

$$ASM = \sum_{i,j} p(i,j)^2 \qquad (14)$$

$$ENT = \sum_{i,j} p(i,j)\log\left(p(i,j)\right) \qquad (15)$$

$$HOM = \sum_{i,j} \frac{p(i,j)}{1 + (i-j)^2} \qquad (16)$$

$$COR = \sum_{i,j} p(i,j)\,\frac{(i - \mu_i)(j - \mu_j)}{\sigma_x \sigma_y} \qquad (17)$$

The choice of the texture computation window is related to the resolution ratio of the panchromatic and multispectral images: assuming the spatial resolution of the panchromatic band is n times that of the multispectral image, the texture window is 2n-1 and the texture is computed once every n pixels. For example, the resolution of the QuickBird panchromatic-band image is 0.61 m and that of the multispectral image is 2.44 m, so the window for computing the panchromatic-band texture is 7 × 7 and a texture feature is computed every 4 pixels. In this way the texture features of the panchromatic image correspond to the visual perception features and NDVI of the multispectral image.
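A minimal sketch of the texture computation using scikit-image's gray-level co-occurrence matrix as an assumed, equivalent stand-in for formulas (12)-(17); entropy is computed by hand, with the conventional negative sign, since `graycoprops` does not provide it:

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops

def glcm_features(window, levels=256):
    """Six GLCM texture parameters of one panchromatic window (2-D uint8 array)."""
    glcm = graycomatrix(window, distances=[1], angles=[0],
                        levels=levels, symmetric=True, normed=True)
    p = glcm[:, :, 0, 0]
    ent = -np.sum(p[p > 0] * np.log(p[p > 0]))  # entropy, conventional negative sign
    return {
        'contrast':      graycoprops(glcm, 'contrast')[0, 0],
        'dissimilarity': graycoprops(glcm, 'dissimilarity')[0, 0],
        'ASM':           graycoprops(glcm, 'ASM')[0, 0],
        'entropy':       ent,
        'homogeneity':   graycoprops(glcm, 'homogeneity')[0, 0],
        'correlation':   graycoprops(glcm, 'correlation')[0, 0],
    }

# For QuickBird (n = 4): 7x7 windows centered on a stride-4 grid of the panchromatic
# image, so one texture vector is obtained per multispectral pixel, e.g.
# feats = glcm_features(pan[r - 3:r + 4, c - 3:c + 4])
```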
Step 4: automatically grow the vegetation areas by combining the features, obtaining the vegetation area distribution map.

Taking each initial vegetation patch as a "seed", attach the pixels adjacent to the "seed" patch to the seed region by region growing. The region-growing criterion is determined by multi-feature voting that combines the texture features and the visual perception features.
The region-growing criterion is as follows.

For each initial vegetation patch, compute the mean μ and variance σ of each of the nine feature parameters: intensity (I), normalized hue (H), saturation (S), contrast, dissimilarity, angular second moment (energy), entropy, homogeneity, and correlation. For the 8-neighborhood pixels of the initial vegetation patch, judge for each feature whether the pixel is vegetation, with the discriminant function defined as

$$T_2 = \mu - \sqrt{2}\,\sigma$$

$$V(i,j) = \begin{cases} 1, & \text{if } M(i,j) > T_2 \\ 0, & \text{otherwise} \end{cases}$$

where M(i, j) is the computed feature parameter, i.e., any one of the three visual feature parameters and six texture feature parameters listed above; M stands for any one of these nine variables. T_2 is the judgment threshold, i and j are the row and column of the pixel, 1 denotes a vegetation area, and 0 a non-vegetation area.
In this way, each of the nine feature parameters (intensity, normalized hue, saturation, contrast, dissimilarity, angular second moment (energy), entropy, homogeneity, and correlation) yields one potential vegetation area map.

Multi-feature voting determines the final vegetation area: the nine potential vegetation area maps are voted, i.e., a pixel of the vegetation area map is regarded as final vegetation when six or more features jointly label it 1.
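A minimal sketch of the region growing with multi-feature voting; `features` is assumed to be a dict of the nine per-pixel feature maps resampled to a common grid, and `seed_mask` a boolean mask of one initial vegetation patch:

```python
import numpy as np
from scipy.ndimage import binary_dilation

def grow_patch(seed_mask, features, min_votes=6):
    """Grow one seed patch: attach 8-neighborhood pixels by multi-feature voting."""
    region = seed_mask.copy()
    # per-feature thresholds T2 = mu - sqrt(2) * sigma over the seed patch
    t2 = {name: f[seed_mask].mean() - np.sqrt(2) * f[seed_mask].std()
          for name, f in features.items()}
    # one vote per feature exceeding its threshold; vegetation needs >= 6 of 9 votes
    vote_map = sum((f > t2[name]).astype(np.uint8) for name, f in features.items())
    struct = np.ones((3, 3), dtype=bool)  # 8-neighborhood
    while True:
        candidates = binary_dilation(region, structure=struct) & ~region
        grown = candidates & (vote_map >= min_votes)
        if not grown.any():  # nothing left to attach
            break
        region |= grown
    return region
```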
The urban vegetation extraction method for multi-spatial-resolution remote sensing images provided by the present invention adopts block-wise image processing, accelerating image processing. Based on the NDVI feature, the adaptive dynamic threshold is selected automatically by the expectation-maximization algorithm. The panchromatic and multispectral bands of the remote sensing image are fully exploited: the NDVI features, visual features, and texture features extracted from the panchromatic and multispectral bands are combined to judge vegetation areas, improving accuracy. The method provided by the present invention also has some generality; for example, the NDVI vegetation index in this method can be replaced by the EVI or SR index.
The above is only an illustrative embodiment of the present invention and does not limit the applicable scope of the present invention; any equivalent change or modification completed by those skilled in the art without departing from the spirit and principle indicated by the present invention shall fall within the scope of the claims of the present invention.

Claims (7)

1. An urban vegetation extraction method for multi-spatial-resolution remote sensing images, characterized in that the method mainly comprises the following steps:
Step 1: extract initial vegetation patches from the high-spatial-resolution multispectral image, specifically comprising the following steps:
Step 1.1: compute the NDVI vegetation spectral index, using the formula
$$NDVI = \frac{NIR - R}{NIR + R}$$
where NIR is the pixel value of the near-infrared band and R is the pixel value of the red band;
Step 1.2: divide the NDVI image into blocks of 300*300 pixels; at the image edges, where less than 300*300 pixels remain, the actual size applies;
Step 1.3: for each 300*300-pixel NDVI block, obtain a segmentation threshold automatically to distinguish vegetation from non-vegetation, yielding the initial vegetation area;
Step 1.4: repeat step 1.3 to obtain the initial vegetation areas of all blocks; merge the vegetation areas of all blocks into the initial vegetation area, perform connected-component labeling on the vegetation areas of the whole image, and obtain the pixel coordinates of each vegetation patch;
The segmentation threshold in step 1.3 is obtained by the following method:
(1) Each NDVI block covers several land-cover types, denoted ω_i. Assuming the NDVI distribution of each land-cover class obeys a Gaussian density function, the overall NDVI probability distribution function p(x) in each block can be expressed as:
$$p(x) = \sum_{i=1}^{n} p(\omega_i)\, p(x \mid \omega_i)$$
where n is the total number of classes and ω_i is the i-th land-cover class, with mean m_i and variance σ_i²; its probability density function is expressed as
$$p(x \mid \omega_i) = \frac{1}{\sqrt{2\pi\sigma_i^2}}\exp\left\{-\frac{(x - m_i)^2}{2\sigma_i^2}\right\}$$
where p(ω_i) is the prior probability of each land-cover class, which satisfies the condition:
$$\sum_{i=1}^{n} p(\omega_i) = 1$$
Under these conditions, solving the vegetation/non-vegetation threshold can be converted into estimating the number of land-cover classes n and the mean m_i and variance σ_i of each class ω_i; the specific steps are as follows:
(2) Compute the frequency histogram f(x) of NDVI in the block image: search for the maximum NDVI_max and minimum NDVI_min of NDVI in the block, divide the range from NDVI_min to NDVI_max into 256 equal intervals, and count the number of pixels whose NDVI value falls in each interval, obtaining the NDVI histogram f(x);
(3) Compute the first derivative of the NDVI histogram f(x) and, at the positions where the first derivative is 0, find the maxima and minima of the NDVI frequency histogram;
(4) Count the number of maxima to determine the number of land-cover classes n. Take the NDVI value at each maximum as the initial mean m_i of each land-cover class, and compute the initial σ_i from the two adjacent minima as follows:
$$\sigma_i = (NDVI_2 - NDVI_1)/4$$
where NDVI_2 and NDVI_1 are the NDVI values at the two adjacent minima;
$$p(\omega_i) = \frac{ncount}{n\_total}$$
where ncount is the number of pixels whose NDVI value lies between the two adjacent minima, and n_total is the total number of pixels in the block;
(5) Based on the expectation-maximization (EM) algorithm, estimate p(ω_i), m_i, and σ_i by loop iteration. Each iteration consists of an expectation step and a maximization step: the expectation step directly estimates the expected value of the probability density from the observed data according to the current values of the parameters to be estimated, and the maximization step updates the parameter estimates by maximizing this expectation; the two steps alternate throughout the iterative process until it converges. The specific formulas are as follows:
$$p^{t+1}(\omega_k) = \frac{1}{I \times I}\sum_{X(i,j)\in M_D} \frac{p^t(\omega_k)\, p^t(X(i,j) \mid \omega_k)}{p^t(X(i,j))}$$

$$m_k^{t+1} = \frac{\sum\limits_{X(i,j)\in M_D} \dfrac{p^t(\omega_k)\, p^t(X(i,j) \mid \omega_k)}{p^t(X(i,j))}\, X(i,j)}{\sum\limits_{X(i,j)\in M_D} \dfrac{p^t(\omega_k)\, p^t(X(i,j) \mid \omega_k)}{p^t(X(i,j))}}$$

$$(\sigma_k^2)^{t+1} = \frac{\sum\limits_{X(i,j)\in M_D} \dfrac{p^t(\omega_k)\, p^t(X(i,j) \mid \omega_k)}{p^t(X(i,j))}\left[X(i,j) - m_k^t\right]^2}{\sum\limits_{X(i,j)\in M_D} \dfrac{p^t(\omega_k)\, p^t(X(i,j) \mid \omega_k)}{p^t(X(i,j))}}$$
The three formulas above estimate the prior probability, the mean, and the standard deviation, respectively. In the formulas, k denotes the k-th land-cover class; t and t+1 denote the estimates used in the current and the next iteration, respectively; i and j denote the row and column of the NDVI image; X(i,j) denotes the NDVI value at row i, column j of the NDVI image; M_D denotes the set of pixels of the block and I × I its total number of pixels; the conditional probability p(X(i,j)|ω_i) and the total probability p(X(i,j)) are obtained from the corresponding formulas in step (1); the iteration terminates when the change in p(ω_i), m_i, and σ_i between two successive iterations is below a given threshold ε, with ε = 10⁻⁸;
(6) Determine the initial vegetation area of the block. Assume the NDVI means of the classes obtained in the previous step, sorted in descending order, are m_1, m_2, ..., m_n, with corresponding variances σ_i; the initial vegetation distribution map is then denoted VI(i, j), where 1 denotes a vegetation area and 0 a non-vegetation area; then
$$VI(i,j) = \begin{cases} 1, & NDVI(i,j) > T \\ 0, & NDVI(i,j) \le T \end{cases}$$
where the threshold T is obtained by the following formula:
$$T = \begin{cases} m_1 - 1.5\sigma_1, & m_1 - m_2 > 1.5(\sigma_1 + \sigma_2) \\ m_2 + 1.5\sigma_2, & m_1 - m_2 \le 1.5(\sigma_1 + \sigma_2) \end{cases}$$
Step 2: compute visual perception parameters from the high-spatial-resolution multispectral image;
Step 3: compute texture feature parameters from the high-spatial-resolution panchromatic-band image;
Step 4: automatically grow the vegetation areas by combining the features, obtaining the vegetation area distribution map.
2. The urban vegetation extraction method for multi-spatial-resolution remote sensing images according to claim 1, characterized in that: the connected-component labeling in step 1.4 uses 4-neighborhoods or 8-neighborhoods.
3. The urban vegetation extraction method for multi-spatial-resolution remote sensing images according to claim 1, characterized in that step 2 specifically comprises the following steps:
Step 2.1: transform the multispectral image from the RGB color space to the HSI color space, computing the intensity, hue, and saturation of vegetation by the following formulas:
$$H = \begin{cases} \theta, & \text{if } G \ge B \\ 2\pi - \theta, & \text{otherwise} \end{cases}, \qquad \theta = \arccos\frac{(R-G) + (R-B)}{2\sqrt{(R-G)^2 + (R-B)(G-B)}}$$

$$S = 1 - \frac{3}{R+G+B}\min(R, G, B)$$

$$I = (R+G+B)/3$$
Step 2.2: compute the visual perception feature parameters. Taking the initial vegetation patches as samples, compute the mean μ(H, S, I) and variance σ(H, S, I) of the intensity, hue, and saturation of the vegetation in each patch, and compute the normalized hue parameter
$$H'(i,j) = \begin{cases} 1 - \dfrac{|H(i,j) - \mu_k|}{\mu_k}, & \text{if } H_L < H(i,j) < H_U \\ 0, & \text{otherwise} \end{cases}$$
where H(i, j) is the hue, (i, j) denotes the pixel coordinates, μ_k is the hue mean of the k-th vegetation patch with corresponding variance σ_k, and H_L and H_U are the lower and upper limits of the vegetation hue:
$$H_L = \mu_k - 2\sigma_k, \qquad H_U = \mu_k + 2\sigma_k$$
4. The urban vegetation extraction method for multi-spatial-resolution remote sensing images according to claim 1, characterized in that step 3 is specifically:
Generate the gray-level co-occurrence matrix from the panchromatic-band image and compute its six texture parameters: contrast, dissimilarity, angular second moment (energy), entropy, homogeneity, and correlation, using the following formulas:
$$CON = \sum_{i,j}(i-j)^2\, p(i,j)$$

$$DIS = \sum_{i,j} p(i,j)\,|i-j|$$

$$ASM = \sum_{i,j} p(i,j)^2$$

$$ENT = \sum_{i,j} p(i,j)\log\left(p(i,j)\right)$$

$$HOM = \sum_{i,j} \frac{p(i,j)}{1 + (i-j)^2}$$

$$COR = \sum_{i,j} p(i,j)\,\frac{(i - \mu_i)(j - \mu_j)}{\sigma_x \sigma_y}$$
The choice of the texture computation window is related to the resolution ratio of the panchromatic and multispectral images: assuming the spatial resolution of the panchromatic band is n times that of the multispectral image, the texture window is 2n-1 and the texture is computed once every n pixels.
5. The urban vegetation extraction method for multi-spatial-resolution remote sensing images according to claim 1, characterized in that step 4 is specifically: taking each initial vegetation patch as a "seed", attach the pixels adjacent to the "seed" patch to the seed region by region growing.
6. The urban vegetation extraction method for multi-spatial-resolution remote sensing images according to claim 5, characterized in that the region-growing criterion is determined by multi-feature voting that combines the texture features and the visual perception features.
7. The urban vegetation extraction method for multi-spatial-resolution remote sensing images according to claim 6, characterized in that the region-growing criterion is as follows:
(1) For each initial vegetation patch, compute the mean μ and variance σ of each of the nine feature parameters: intensity, normalized hue, saturation, contrast, dissimilarity, angular second moment (energy), entropy, homogeneity, and correlation. For the 8-neighborhood pixels of the initial vegetation patch, judge for each feature whether the pixel is vegetation, with the discriminant function defined as
$$T_2 = \mu - \sqrt{2}\,\sigma$$

$$V(i,j) = \begin{cases} 1, & \text{if } M(i,j) > T_2 \\ 0, & \text{otherwise} \end{cases}$$
where M(i, j) is the computed feature parameter, T_2 is the judgment threshold, i and j are the row and column of the pixel, 1 denotes a vegetation area, and 0 a non-vegetation area; a potential vegetation area map is thus obtained for each of the nine feature parameters;
(2) Multi-feature voting determines the final vegetation area: the nine potential vegetation area maps are voted, i.e., a pixel of the vegetation area map is regarded as final vegetation when six or more features jointly label it 1.
CN201510186167.6A 2015-04-17 2015-04-17 Urban vegetation extraction method for multi-spatial-resolution remote sensing images Expired - Fee Related CN104851113B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510186167.6A CN104851113B (en) 2015-04-17 2015-04-17 Urban vegetation extraction method for multi-spatial-resolution remote sensing images

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201510186167.6A CN104851113B (en) 2015-04-17 2015-04-17 Urban vegetation extraction method for multi-spatial-resolution remote sensing images

Publications (2)

Publication Number Publication Date
CN104851113A CN104851113A (en) 2015-08-19
CN104851113B true CN104851113B (en) 2017-11-03

Family

ID=53850740

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510186167.6A Expired - Fee Related CN104851113B (en) 2015-04-17 2015-04-17 Urban vegetation extraction method for multi-spatial-resolution remote sensing images

Country Status (1)

Country Link
CN (1) CN104851113B (en)

Families Citing this family (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102623403B1 (en) * 2015-11-08 2024-01-12 어그로잉 리미티드 Methods for aerial image acquisition and analysis
CN105405148B (en) * 2015-11-17 2017-11-14 中国科学院遥感与数字地球研究所 A kind of remote sensing image Chinese white poplar recognition methods of combination trees shadow character
CN105651384B (en) * 2016-01-13 2018-01-16 南京大学 A kind of all-optical information acquisition system
CN107657631B (en) * 2016-12-23 2018-09-25 航天星图科技(北京)有限公司 A kind of generation method of the microwave spectrum mixed image based on vegetation distribution
CN107194187B (en) * 2017-06-06 2019-08-06 国家基础地理信息中心 Take the ground mulching sample size calculation method of space composition and configuration into account
CN108009542B (en) * 2017-11-01 2021-06-15 华中农业大学 Weed image segmentation method in rape field environment
CN108089850B (en) * 2018-01-02 2020-11-24 北京建筑大学 Method for updating earth surface coverage product increment based on image collaborative segmentation and ecological geographical partitioning rule base
CN109033543B (en) * 2018-06-29 2020-06-30 北京师范大学 Method, device and equipment for estimating vegetation coverage of heterogeneous land areas
CN108986113A (en) * 2018-07-06 2018-12-11 航天星图科技(北京)有限公司 A kind of block parallel multi-scale division algorithm based on LLTS frame
CN109583378A (en) * 2018-11-30 2019-04-05 东北大学 A kind of vegetation coverage extracting method and system
CN110826480A (en) * 2019-11-04 2020-02-21 湖南城市学院 Water body automatic extraction method based on ant colony algorithm
CN111402182B (en) * 2020-03-18 2023-04-28 中国资源卫星应用中心 Middle-divided image synthesis method based on land coverage information
CN111814809B (en) * 2020-07-17 2024-07-02 国家基础地理信息中心 Remote sensing image vegetation information extraction method and device
CN114273252A (en) * 2021-11-26 2022-04-05 山东安信种苗股份有限公司 Intelligent vegetable seedling grading method
CN116188809B (en) * 2023-05-04 2023-08-04 中国海洋大学 Texture similarity judging method based on visual perception and sequencing driving
CN116740580A (en) * 2023-08-16 2023-09-12 山东绿博园市政工程有限公司 Garden engineering data processing method based on remote sensing technology
CN117132423B (en) * 2023-08-22 2024-04-12 深圳云创友翼科技有限公司 Park management system
CN117152634B (en) * 2023-09-12 2024-06-04 广东省水利水电科学研究院 Multi-source satellite image floating plant identification method and system based on chromaticity index

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101832769A (en) * 2010-03-30 2010-09-15 中国农业大学 Method and system for estimating vegetation coverage degree in diggings based on close-up photography
CN103093233A (en) * 2012-12-03 2013-05-08 中国环境科学研究院 Forest classification method based on object-oriented high-resolution remote sensing image
CN103593858A (en) * 2013-11-13 2014-02-19 北京林业大学 Method for filtering out green vegetation data in ground laser radar scanning data
EP2801951A1 (en) * 2013-05-08 2014-11-12 Honeywell International Inc. Aerial image segmentation for refineries

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7038694B1 (en) * 2002-03-11 2006-05-02 Microsoft Corporation Automatic scenery object generation
JP5481140B2 (en) * 2009-09-28 2014-04-23 株式会社日立ソリューションズ Spectrum analyzer

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101832769A (en) * 2010-03-30 2010-09-15 中国农业大学 Method and system for estimating vegetation coverage degree in diggings based on close-up photography
CN103093233A (en) * 2012-12-03 2013-05-08 中国环境科学研究院 Forest classification method based on object-oriented high-resolution remote sensing image
EP2801951A1 (en) * 2013-05-08 2014-11-12 Honeywell International Inc. Aerial image segmentation for refineries
CN103593858A (en) * 2013-11-13 2014-02-19 北京林业大学 Method for filtering out green vegetation data in ground laser radar scanning data

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Li Chengfan et al., "An object-oriented method for extracting urban green space from remote sensing images" (《一种面向对象的遥感影像城市绿地提取方法》), Science of Surveying and Mapping (《测绘科学》), Vol. 36, No. 5, September 2011; entire document *
Xiong Yiqun et al., "Research on object-oriented methods for extracting urban green space information" (《面向对象的城市绿地信息提取方法研究》), Journal of East China Normal University (Natural Science) (《华东师范大学学报(自然科学版)》), No. 4, July 2006; page 85, paragraphs 1-2 and 5-6, page 86, paragraph 8, Fig. 3, and the last paragraph of page 85 *

Also Published As

Publication number Publication date
CN104851113A (en) 2015-08-19

Similar Documents

Publication Publication Date Title
CN104851113B (en) Urban vegetation extraction method for multi-spatial-resolution remote sensing images
CN108573276B (en) Change detection method based on high-resolution remote sensing image
CN103971115B (en) Automatic extraction method for newly-increased construction land image spots based on NDVI and PanTex index
CN101840581B (en) Method for extracting profile of building from satellite remote sensing image
CN102902956B (en) A kind of ground visible cloud image identifying processing method
CN107862667A (en) A kind of city shadow Detection and minimizing technology based on high-resolution remote sensing image
CN108009542A (en) Weed images dividing method under rape field environment
CN105550685B (en) The large format remote sensing image area-of-interest exacting method of view-based access control model attention mechanism
CN104951799B (en) A kind of SAR remote sensing image oil spilling detection recognition method
CN103077515B (en) Multi-spectral image building change detection method
CN106611421B (en) The SAR image segmentation method constrained based on feature learning and sketch line segment
Zheng et al. Large-scale oil palm tree detection from high-resolution remote sensing images using faster-rcnn
CN107832797B (en) Multispectral image classification method based on depth fusion residual error network
CN104217440B (en) A kind of method extracting built-up areas from remote sensing images
CN112488050A (en) Color and texture combined aerial image scene classification method and system
CN104102928B (en) A kind of Classifying Method in Remote Sensing Image based on texture primitive
CN104835196B (en) A kind of vehicle mounted infrared image colorization three-dimensional rebuilding method
CN107818303A (en) Unmanned plane oil-gas pipeline image automatic comparative analysis method, system and software memory
CN107292328A (en) The remote sensing image shadow Detection extracting method and system of multiple dimensioned multiple features fusion
CN110879992A (en) Grassland surface covering object classification method and system based on transfer learning
CN107992856A (en) High score remote sensing building effects detection method under City scenarios
JP6358862B2 (en) Forest phase analysis apparatus and program
CN110458208A (en) Hyperspectral image classification method based on information measure
CN107564016A (en) A kind of Multi-Band Remote Sensing Images segmentation of integrally object light spectrum information and labeling method
CN107169946A (en) Image interfusion method based on non-negative sparse matrix Yu hypersphere color transformation

Legal Events

Date Code Title Description
DD01 Delivery of document by public notice

Addressee: Dian Yuanyong

Document name: Notification of Acceptance of Patent Application

C06 Publication
PB01 Publication
EXSB Decision made by sipo to initiate substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20171103

Termination date: 20180417