CN104851113A - Urban vegetation automatic extraction method for multi-spatial-resolution remote sensing images - Google Patents

Info

Publication number: CN104851113A (application CN201510186167.6A); granted as CN104851113B
Authority: CN (China)
Other languages: Chinese (zh)
Inventors: 佃袁勇, 姚崇怀, 徐永荣, 周志翔
Applicant and current assignee: Huazhong Agricultural University
Application filed by Huazhong Agricultural University; priority to CN201510186167.6A; publication of CN104851113A; application granted; publication of CN104851113B
Legal status: Granted; Expired - Fee Related

Landscapes

  • Other Investigation Or Analysis Of Materials By Electrical Means (AREA)
  • Image Processing (AREA)

Abstract

The present invention discloses an automatic urban-vegetation extraction method for multi-spatial-resolution remote sensing images. The method comprises: extracting initial vegetation patches from a high-spatial-resolution multispectral image; computing visual-perception parameters from the multispectral image; computing texture-feature parameters from the high-spatial-resolution panchromatic image; and automatically growing vegetation regions using the combined features to obtain a vegetation-region distribution map. Block-wise processing of the image accelerates computation. Based on the normalized difference vegetation index (NDVI), an adaptive dynamic threshold is selected automatically with an expectation-maximization (EM) algorithm. Both the panchromatic and multispectral bands of the imagery are fully exploited: the NDVI, visual, and texture features derived from them are combined to classify vegetation regions, which raises extraction accuracy.

Description

Urban vegetation extraction method for multi-spatial-resolution remote sensing images
Technical field
The present invention relates to an urban-vegetation extraction method for multi-spatial-resolution remote sensing images, and belongs to the field of remote sensing image processing.
Background technology
Urban vegetation sequesters carbon, releases oxygen, and plays a vital role in maintaining the urban ecological environment. Urban vegetation cover extracted from high-resolution remote sensing imagery is an important indicator for assessing ecological and garden cities. Current methods for extracting urban vegetation from high-resolution imagery mainly fall into the following categories:
(1) Manual visual interpretation. An interpreter identifies vegetation areas and then delineates the vegetation boundaries by hand. Interpretation relies mainly on the features vegetation exhibits in the image: shape, size, color and tone, shadow, position, texture, and spatial relations. Because it digitizes on screen manually, visual interpretation has a low degree of automation, is subjective, and requires extensive image-interpretation experience together with a good understanding of the thematic information being extracted. Nevertheless, because it can make full use of all the information in a remote sensing image, it remains an important image-processing method.
(2) Object-oriented supervised classification. This approach exploits the color and spatial-texture characteristics of vegetation areas in the image. On top of an object-oriented image segmentation, it computes a variety of features for each object: spectral features (band means, brightness, standard deviation, band ratios, features relative to the neighborhood and to the parent object), shape features (area, perimeter, edge length, compactness, length-width ratio, and shape relations to child and parent objects), texture features (angular second moment/energy, contrast, correlation, entropy, and inverse difference moment defined on the grey-level co-occurrence matrix), and topological features. Appropriate features are then selected for object-based classification and extraction.
The methods above all have defects and problems, including:
(1) Extracting vegetation requires selecting training samples in advance to build prior knowledge, rather than relying on the image's own characteristics. The process involves human interaction and many subjective factors.
(2) The degree of automation is low. The segmentation scale, the vegetation-segmentation threshold, and the features used in classification all require interactive selection, and these intermediate steps severely limit the automation of vegetation extraction. Moreover, a single threshold is applied to the whole image during segmentation, without fully accounting for the differences that distinct vegetation types (e.g. grassland, coniferous forest, broad-leaved forest) exhibit in the imagery.
(3) Only the multispectral data of the high-spatial-resolution image are used; the even higher spatial resolution of the panchromatic band is not exploited, even though in practice high-resolution sensors such as WorldView and QuickBird deliver panchromatic and multispectral images bundled together.
Summary of the invention
To solve these problems, the present invention targets high-resolution imagery with both panchromatic and multispectral bands (IKONOS, QuickBird, WorldView, etc.), extracts urban vegetation regions fully automatically, and proposes an urban-vegetation extraction method for multi-spatial-resolution remote sensing images.
The invention provides an urban-vegetation extraction method for multi-spatial-resolution remote sensing images, mainly comprising the following steps:
Step 1: extract initial vegetation patches from the high-spatial-resolution multispectral image;
Step 2: compute visual-perception parameters from the high-spatial-resolution multispectral image;
Step 3: compute texture-feature parameters from the high-spatial-resolution panchromatic image;
Step 4: automatically grow vegetation regions using the combined features, obtaining a vegetation-region distribution map.
Preferably, step 1 specifically comprises the following steps:
Step 1.1: compute the NDVI spectral vegetation index:

NDVI = \frac{NIR - R}{NIR + R}

where NIR is the pixel value of the near-infrared band and R the pixel value of the red band;
Step 1.2: partition the NDVI image into blocks of 300×300 pixels; blocks at the image edge smaller than 300×300 keep their actual size;
Step 1.3: for each 300×300 NDVI block, automatically determine a segmentation threshold separating vegetation from non-vegetation, obtaining the initial vegetation areas;
Step 1.4: repeat step 1.3 to obtain the initial vegetation areas of all blocks, merge them into the initial vegetation area of the whole image, and run connected-component labelling on the whole vegetation area to obtain the pixel coordinates of each vegetation patch.
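Steps 1.1 and 1.2 above can be sketched in a few lines of Python. This is a minimal sketch under stated assumptions: the band rasters are co-registered NumPy arrays of equal shape, and all function and variable names are illustrative, not from the patent.

```python
import numpy as np

def ndvi(nir, red, eps=1e-12):
    """Step 1.1: NDVI = (NIR - R) / (NIR + R), computed per pixel.

    `eps` guards against division by zero where NIR + R == 0."""
    nir = np.asarray(nir, dtype=np.float64)
    red = np.asarray(red, dtype=np.float64)
    return (nir - red) / (nir + red + eps)

def iter_blocks(image, size=300):
    """Step 1.2: yield (row, col, tile) blocks of size x size pixels;
    tiles at the image edge keep their actual (smaller) extent."""
    rows, cols = image.shape
    for r in range(0, rows, size):
        for c in range(0, cols, size):
            yield r, c, image[r:r + size, c:c + size]
```

Each yielded tile can then be thresholded independently, which is what makes the per-block adaptive threshold of step 1.3 possible.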
Preferably, in step 1.3 the segmentation threshold is obtained by the following method:
(1) Each NDVI block covers several land-cover types, denoted ω_i. Assuming the NDVI of each land-cover class follows a Gaussian distribution, the overall NDVI probability density p(x) of a block can be expressed as:

p(x) = \sum_{i=1}^{n} p(\omega_i)\, p(x \mid \omega_i)

where n is the total number of classes and ω_i is the i-th land-cover class, with mean m_i and variance σ_i². The class-conditional density is

p(x \mid \omega_i) = \frac{1}{\sqrt{2\pi\sigma_i^2}} \exp\left\{ -\frac{(x - m_i)^2}{2\sigma_i^2} \right\}

and the class priors p(ω_i) satisfy

\sum_{i=1}^{n} p(\omega_i) = 1

Under these assumptions, solving for the vegetation/non-vegetation threshold reduces to estimating the number of classes n and, for each class ω_i, its prior p(ω_i), mean m_i, and variance σ_i². The concrete steps are as follows:
(2) Build the NDVI frequency histogram f(x) of the block: find the maximum NDVI_max and minimum NDVI_min of NDVI within the block, divide [NDVI_min, NDVI_max] into 256 equal intervals, and count the pixels whose NDVI falls in each interval, obtaining the histogram f(x);
(3) Compute the first derivative of f(x); the zero crossings of the derivative locate the local maxima and minima of the NDVI frequency histogram;
(4) Count the maxima to determine the number of land-cover classes n. Take the NDVI value at each maximum as the initial mean m_i of that class, and compute the initial σ_i from the two adjacent minima:

\sigma_i = (NDVI_2 - NDVI_1)/4

where NDVI_2 and NDVI_1 are the NDVI values at the two adjacent minima. The initial prior is

p(\omega_i) = \frac{ncount}{n\_total}

where ncount is the number of pixels whose NDVI lies between the two adjacent minima and n_total is the total number of pixels in the block;
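The histogram-based initialization of steps (2)-(4) might look like the following sketch. It is one illustrative reading of the scheme, not the patented implementation; in particular, how the bracketing minima are chosen for peaks near the histogram ends is an assumption, and all names are invented here.

```python
import numpy as np

def init_mixture(ndvi_block, bins=256):
    """Initial (priors, means, sigmas) for the per-block Gaussian mixture:
    one class per histogram maximum; sigma and prior from the two minima
    that bracket each maximum."""
    hist, edges = np.histogram(np.ravel(ndvi_block), bins=bins)
    centers = 0.5 * (edges[:-1] + edges[1:])
    d = np.diff(hist.astype(np.int64))
    # a +/- sign change of the first derivative marks a local maximum,
    # a -/+ change marks a local minimum
    peaks = [i + 1 for i in range(len(d) - 1) if d[i] > 0 and d[i + 1] <= 0]
    valleys = [i + 1 for i in range(len(d) - 1) if d[i] < 0 and d[i + 1] >= 0]
    bounds = [0] + valleys + [bins - 1]          # pad with the histogram ends
    total = hist.sum()
    priors, means, sigmas = [], [], []
    for p in peaks:
        left = max(b for b in bounds if b <= p)
        right = min(b for b in bounds if b >= p)
        means.append(centers[p])                                # initial m_i
        sigmas.append((centers[right] - centers[left]) / 4.0)   # (NDVI_2 - NDVI_1)/4
        priors.append(hist[left:right + 1].sum() / total)       # ncount / n_total
    return np.array(priors), np.array(means), np.array(sigmas)
```

On a bimodal block this returns two classes whose means sit at the two histogram peaks, ready to seed the EM refinement of step (5).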
(5) Refine p(ω_i), m_i, and σ_i² by loop iteration of the expectation-maximization (EM) algorithm. Each iteration consists of an expectation step and a maximization step: the former estimates the expected probability densities directly from the observed data using the current values of the parameters to be estimated; the latter updates the parameter estimates by maximizing that expectation. The two steps alternate throughout the iterative process until the iteration converges. The update formulas are:

p^{t+1}(\omega_k) = \frac{1}{I \cdot J} \sum_{X(i,j) \in MD} \frac{p^t(\omega_k)\, p^t(X(i,j) \mid \omega_k)}{p^t(X(i,j))}

m_k^{t+1} = \frac{\sum_{X(i,j) \in MD} \dfrac{p^t(\omega_k)\, p^t(X(i,j) \mid \omega_k)}{p^t(X(i,j))}\, X(i,j)}{\sum_{X(i,j) \in MD} \dfrac{p^t(\omega_k)\, p^t(X(i,j) \mid \omega_k)}{p^t(X(i,j))}}

(\sigma_k^2)^{t+1} = \frac{\sum_{X(i,j) \in MD} \dfrac{p^t(\omega_k)\, p^t(X(i,j) \mid \omega_k)}{p^t(X(i,j))}\, \left[X(i,j) - m_k^t\right]^2}{\sum_{X(i,j) \in MD} \dfrac{p^t(\omega_k)\, p^t(X(i,j) \mid \omega_k)}{p^t(X(i,j))}}

These three formulas estimate the prior probability, mean, and variance respectively. Here k indexes the k-th land-cover class; t and t+1 denote the current estimate and the one used in the next iteration; i and j are the row and column of the NDVI image; X(i,j) is the NDVI value at row i, column j; the sums run over the set MD of pixels in the block, whose total count is I·J. The conditional probability p(X(i,j) | ω_k) and the total probability p(X(i,j)) are computed with the corresponding formulas of step (1). The iteration terminates when the changes in p(ω_i), m_i, and σ_i between two successive iterations are all smaller than a given threshold ε (ε = 10⁻⁸);
(6) Determine the initial vegetation area of the block. Sort the class NDVI means obtained in the previous step in descending order, m_1 ≥ m_2 ≥ … ≥ m_n, with corresponding standard deviations σ_i. The initial vegetation map VI(i,j) (1 = vegetation, 0 = non-vegetation) is then

VI(i,j) = \begin{cases} 1 & NDVI(i,j) > T \\ 0 & NDVI(i,j) \le T \end{cases}

where the threshold T is given by

T = \begin{cases} m_1 - 1.5\sigma_1 & m_1 - m_2 > 1.5(\sigma_1 + \sigma_2) \\ m_2 + 1.5\sigma_2 & m_1 - m_2 \le 1.5(\sigma_1 + \sigma_2) \end{cases}
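A minimal NumPy sketch of the EM refinement of step (5) and the threshold rule of step (6). All names are illustrative; the variance update deliberately uses the previous mean m_k^t, as in the formulas above.

```python
import numpy as np

def _gauss(x, m, s):
    """1-D Gaussian density."""
    return np.exp(-(x - m) ** 2 / (2.0 * s ** 2)) / np.sqrt(2.0 * np.pi * s ** 2)

def em_refine(x, priors, means, sigmas, eps=1e-8, max_iter=200):
    """Loop-iterated EM for the 1-D NDVI Gaussian mixture of one block."""
    x = np.ravel(x).astype(np.float64)
    pr, mu, sg = (np.array(a, dtype=np.float64) for a in (priors, means, sigmas))
    for _ in range(max_iter):
        # E-step: per-pixel responsibilities p^t(w_k) p^t(x|w_k) / p^t(x)
        comp = pr[None, :] * _gauss(x[:, None], mu[None, :], sg[None, :])
        resp = comp / comp.sum(axis=1, keepdims=True)
        # M-step: update prior, mean, then variance (using the previous mean)
        new_pr = resp.mean(axis=0)
        new_mu = (resp * x[:, None]).sum(axis=0) / resp.sum(axis=0)
        new_sg = np.sqrt((resp * (x[:, None] - mu[None, :]) ** 2).sum(axis=0)
                         / resp.sum(axis=0))
        done = (np.abs(new_pr - pr).max() < eps and np.abs(new_mu - mu).max() < eps
                and np.abs(new_sg - sg).max() < eps)
        pr, mu, sg = new_pr, new_mu, new_sg
        if done:
            break
    return pr, mu, sg

def vegetation_threshold(means, sigmas):
    """Step (6): T from the two classes with the largest NDVI means."""
    order = np.argsort(means)[::-1]
    m1, m2 = means[order[0]], means[order[1]]
    s1, s2 = sigmas[order[0]], sigmas[order[1]]
    if m1 - m2 > 1.5 * (s1 + s2):
        return m1 - 1.5 * s1
    return m2 + 1.5 * s2
```

The `m1 - 1.5*s1` branch fires when the highest-NDVI class is well separated from the next one; otherwise the threshold sits above the second class instead.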
Preferably, the connected-component labelling in step 1.4 uses 4-neighborhood or 8-neighborhood connectivity.
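Connected-component labelling with selectable 4- or 8-neighborhood connectivity can be sketched with a plain breadth-first flood fill (illustrative code, not from the patent):

```python
import numpy as np
from collections import deque

def label_components(mask, connectivity=4):
    """Label connected vegetation patches in a binary mask.
    Returns (labels, n): labels is 0 for background, 1..n per patch."""
    if connectivity == 4:
        nbrs = [(-1, 0), (1, 0), (0, -1), (0, 1)]
    else:  # 8-neighborhood adds the diagonals
        nbrs = [(dr, dc) for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                if (dr, dc) != (0, 0)]
    mask = np.asarray(mask, dtype=bool)
    rows, cols = mask.shape
    labels = np.zeros((rows, cols), dtype=np.int32)
    n = 0
    for r in range(rows):
        for c in range(cols):
            if mask[r, c] and labels[r, c] == 0:
                n += 1
                queue = deque([(r, c)])
                labels[r, c] = n
                while queue:  # breadth-first fill of one patch
                    i, j = queue.popleft()
                    for dr, dc in nbrs:
                        ni, nj = i + dr, j + dc
                        if (0 <= ni < rows and 0 <= nj < cols
                                and mask[ni, nj] and labels[ni, nj] == 0):
                            labels[ni, nj] = n
                            queue.append((ni, nj))
    return labels, n
```

The pixel coordinates of patch k, as required at the end of step 1.4, are then `np.argwhere(labels == k)`.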
Preferably, step 2 specifically comprises the following steps:
Step 2.1: transform the multispectral image from RGB color space to HSI color space, computing the hue, saturation, and intensity of the vegetation as

H = \begin{cases} \theta & G \ge B \\ 2\pi - \theta & \text{otherwise} \end{cases}, \qquad \theta = \arccos\frac{\frac{1}{2}\left[(R-G)+(R-B)\right]}{\sqrt{(R-G)^2+(R-B)(G-B)}}

S = 1 - \frac{3}{R+G+B}\min(R,G,B), \qquad I = \frac{R+G+B}{3};

Step 2.2: compute the visual-perception parameters. Using the initial vegetation patches as samples, compute for each patch the means μ_k and variances σ_k of the vegetation's hue, saturation, and intensity (H, S, I), then compute the normalized hue parameter

H'(i,j) = \begin{cases} 1 - \dfrac{|H(i,j) - \mu_k|}{\mu_k} & H_L < H(i,j) < H_U \\ 0 & \text{otherwise} \end{cases}

where H(i,j) is the hue at pixel (i,j), μ_k is the hue mean of the k-th vegetation patch, σ_k the corresponding standard deviation, and H_L, H_U the lower and upper bounds of vegetation hue:

H_L = \mu_k - 2\sigma_k, \qquad H_U = \mu_k + 2\sigma_k.
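Step 2's color-space transform and normalized hue can be sketched as follows. Names are illustrative; the inputs are assumed to be float band arrays, and a small epsilon guards the degenerate grey case where the arccos denominator vanishes.

```python
import numpy as np

def rgb_to_hsi(r, g, b, eps=1e-12):
    """Step 2.1: RGB -> HSI with the arccos hue formula above."""
    r, g, b = (np.asarray(a, dtype=np.float64) for a in (r, g, b))
    num = 0.5 * ((r - g) + (r - b))
    den = np.sqrt((r - g) ** 2 + (r - b) * (g - b)) + eps
    theta = np.arccos(np.clip(num / den, -1.0, 1.0))
    h = np.where(g >= b, theta, 2.0 * np.pi - theta)
    i = (r + g + b) / 3.0
    s = 1.0 - np.minimum(np.minimum(r, g), b) / (i + eps)  # 1 - 3*min/(R+G+B)
    return h, s, i

def normalized_hue(h, mu_k, sigma_k):
    """Step 2.2: H' = 1 - |H - mu_k| / mu_k inside (mu_k - 2*sigma_k,
    mu_k + 2*sigma_k), and 0 outside that band."""
    lo, hi = mu_k - 2.0 * sigma_k, mu_k + 2.0 * sigma_k
    return np.where((h > lo) & (h < hi), 1.0 - np.abs(h - mu_k) / mu_k, 0.0)
```

Pure red maps to hue 0 and pure green to hue 2π/3, which is a quick sanity check on the arccos branch logic.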
Preferably, step 3 is specifically:
Generate the grey-level co-occurrence matrix (GLCM) of the panchromatic image and compute six texture parameters of the GLCM: contrast, dissimilarity, angular second moment (energy), entropy, homogeneity, and correlation, as follows:

CON = \sum_{i,j} (i-j)^2\, p(i,j)

DIS = \sum_{i,j} |i-j|\, p(i,j)

ASM = \sum_{i,j} p(i,j)^2

ENT = -\sum_{i,j} p(i,j) \log p(i,j)

HOM = \sum_{i,j} \frac{p(i,j)}{1+(i-j)^2}

COR = \sum_{i,j} \frac{(i-\mu_i)(j-\mu_j)\, p(i,j)}{\sigma_i \sigma_j}

The texture-computation window depends on the relative resolution of the panchromatic and multispectral images: if the spatial resolution of the panchromatic band is n times that of the multispectral image, the texture window is 2n−1 and a texture value is computed every n pixels.
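Assuming the panchromatic image has already been quantized to a small number of grey levels (an assumption; the patent does not fix the quantization), the GLCM and the six texture parameters can be sketched as below. The entropy carries the conventional minus sign so that it is non-negative.

```python
import numpy as np

def glcm(gray, dy=0, dx=1, levels=8):
    """Normalized grey-level co-occurrence matrix for pixel offset (dy, dx);
    `gray` is assumed already quantized to integers in [0, levels)."""
    g = np.asarray(gray, dtype=int)
    rows, cols = g.shape
    m = np.zeros((levels, levels), dtype=np.float64)
    for r in range(rows - dy):
        for c in range(cols - dx):
            m[g[r, c], g[r + dy, c + dx]] += 1.0
    return m / m.sum()

def glcm_features(p, eps=1e-12):
    """The six texture parameters of step 3, from a normalized GLCM p."""
    i, j = np.indices(p.shape)
    mu_i, mu_j = (i * p).sum(), (j * p).sum()
    si = np.sqrt(((i - mu_i) ** 2 * p).sum())
    sj = np.sqrt(((j - mu_j) ** 2 * p).sum())
    return {
        "CON": ((i - j) ** 2 * p).sum(),                  # contrast
        "DIS": (np.abs(i - j) * p).sum(),                 # dissimilarity
        "ASM": (p ** 2).sum(),                            # angular second moment
        "ENT": -(p * np.log(p + eps)).sum(),              # entropy
        "HOM": (p / (1.0 + (i - j) ** 2)).sum(),          # homogeneity
        "COR": ((i - mu_i) * (j - mu_j) * p).sum() / (si * sj + eps),  # correlation
    }
```

A flat tile gives ASM = 1 and CON = 0, while a checkerboard maximizes contrast; both are handy sanity checks for the sliding-window computation.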
Preferably, step 4 is specifically: take each initial vegetation patch as a "seed" and attach pixels adjacent to the "seed" patch to the seed region by region growing.
Preferably, the region-growing criterion combines texture features and visual-perception features through multi-feature voting.
Preferably, the region-growing criterion is as follows:
(1) For each initial vegetation patch, compute the mean μ and variance σ of each of nine characteristic parameters: intensity (I), normalized hue (H), saturation (S), contrast, dissimilarity, angular energy, entropy, homogeneity, and correlation. For each 8-neighborhood pixel of the initial vegetation patch, judge per feature whether it is vegetation with the discriminant

T_2 = \mu - 2\sigma

V(i,j) = \begin{cases} 1 & M(i,j) > T_2 \\ 0 & \text{otherwise} \end{cases}

where M(i,j) is the computed feature value, T_2 is the judgment threshold, i and j are the row and column of the pixel, 1 marks vegetation, and 0 non-vegetation; this yields one potential-vegetation map for each of the nine characteristic parameters;
(2) Determine the final vegetation area by multi-feature voting: the nine potential-vegetation maps vote, and a pixel on the vegetation map is accepted as final vegetation when at least six of the nine features mark it as 1.
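The grow-and-vote logic of the region-growing criterion can be sketched as below, under the simplifying assumption of one global threshold per feature (the patent computes T2 per seed patch); all names are illustrative.

```python
import numpy as np

def vote_vegetation(feature_maps, thresholds, min_votes=6):
    """Each feature map casts a vote where its value exceeds its T2 threshold;
    a pixel is vegetation when at least `min_votes` of the maps vote 1."""
    votes = sum((f > t).astype(np.int32) for f, t in zip(feature_maps, thresholds))
    return (votes >= min_votes).astype(np.uint8)

def dilate8(mask):
    """8-neighborhood binary dilation via shifted, zero-padded copies."""
    padded = np.pad(mask, 1)
    out = np.zeros_like(mask)
    for dr in range(3):
        for dc in range(3):
            out |= padded[dr:dr + mask.shape[0], dc:dc + mask.shape[1]]
    return out

def grow_region(seed, accepted):
    """Attach accepted 8-neighborhood pixels to the seed patches until stable."""
    mask = np.asarray(seed, dtype=bool).copy()
    accepted = np.asarray(accepted, dtype=bool)
    while True:
        new = dilate8(mask) & accepted & ~mask
        if not new.any():
            return mask
        mask |= new
```

In a full pipeline, `accepted` would be the output of `vote_vegetation` over the nine feature maps, so growth only ever crosses pixels that won the vote.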
Compared with the prior art, the urban-vegetation extraction method for multi-spatial-resolution remote sensing images provided by the invention has the following advantages:
(1) The invention selects the vegetation threshold automatically with the expectation-maximization algorithm, raising the degree of automation of data processing. Block-wise processing additionally avoids judging the whole image with a single threshold and accommodates the differing appearance of vegetation types (grassland, coniferous forest, broad-leaved forest, etc.) in the remote sensing data, making the vegetation threshold adaptive.
(2) The invention extracts initial vegetation areas with NDVI, then combines the visual-perception features of the high-resolution multispectral data with the texture features of the panchromatic band, and determines the final vegetation area by multi-feature voting. Taking the visual and texture features of the high-spatial-resolution data into account improves the accuracy of vegetation extraction.
Description of the drawings
Fig. 1 is a flow chart of the present invention.
Detailed description
To help those of ordinary skill in the art understand and implement the present invention, it is described in further detail below with reference to the drawing and specific embodiments.
This scheme targets high-resolution remote sensing imagery with both panchromatic and multispectral bands (IKONOS, QuickBird, WorldView, etc.) and extracts urban vegetation regions fully automatically. The concrete steps are:
Step 1: extract initial vegetation patches from the high-spatial-resolution multispectral image.
Compute the NDVI spectral vegetation index according to formula (1):

NDVI = \frac{NIR - R}{NIR + R} \qquad (1)

where NIR is the pixel value of the near-infrared band and R the pixel value of the red band.
Partition the NDVI image into blocks of 300×300 pixels; blocks at the image edge smaller than 300×300 keep their actual size.
For each 300×300 NDVI block, automatically determine a segmentation threshold separating vegetation from non-vegetation and obtain the initial vegetation area. The threshold is obtained as follows:
Vegetation absorbs strongly in the red band and reflects strongly in the near infrared; the NDVI computed from these two bands enhances this characteristic of vegetation and captures the difference between vegetation and other land-cover types well. NDVI lies in [−1, 1] and is generally large for vegetation.
Each NDVI block covers several land-cover types (denoted ω_i). Assuming the NDVI of each land-cover class follows a Gaussian distribution, the overall NDVI probability density p(x) of a block can be expressed as

p(x) = \sum_{i=1}^{n} p(\omega_i)\, p(x \mid \omega_i) \qquad (2)

where n is the total number of classes and ω_i is the i-th land-cover class, with mean m_i and variance σ_i². The class-conditional density is given by formula (3), and the class priors p(ω_i) satisfy formula (4):

p(x \mid \omega_i) = \frac{1}{\sqrt{2\pi\sigma_i^2}} \exp\left\{ -\frac{(x - m_i)^2}{2\sigma_i^2} \right\} \qquad (3)

\sum_{i=1}^{n} p(\omega_i) = 1 \qquad (4)

Under these assumptions, solving for the vegetation/non-vegetation threshold reduces to estimating the number of classes n and, for each class ω_i, its prior p(ω_i), mean m_i, and variance σ_i². The detailed process is as follows:
Build the NDVI frequency histogram f(x) of the block: find the maximum NDVI_max and minimum NDVI_min of NDVI within the block, divide [NDVI_min, NDVI_max] into 256 equal intervals, and count the pixels whose NDVI falls in each interval, obtaining the histogram f(x).
Compute the first derivative of f(x); the zero crossings of the derivative locate the local maxima and minima of the NDVI frequency histogram.
Count the maxima to determine the number of land-cover classes n. Take the NDVI value at each maximum as the initial mean m_i of that class, and compute the initial σ_i from the two adjacent minima by formula (5):

\sigma_i = (NDVI_2 - NDVI_1)/4 \qquad (5)

where NDVI_2 and NDVI_1 are the NDVI values at the two adjacent minima. The initial prior is

p(\omega_i) = \frac{ncount}{n\_total}

where ncount is the number of pixels whose NDVI lies between the two adjacent minima and n_total is the total number of pixels in the block.
Refine p(ω_i), m_i, and σ_i² by loop iteration of the expectation-maximization (EM) algorithm. Each iteration consists of an expectation step and a maximization step: the former estimates the expected probability densities directly from the observed data using the current values of the parameters to be estimated; the latter updates the parameter estimates by maximizing that expectation. The two steps alternate throughout the iterative process until the iteration converges. The update formulas are:

p^{t+1}(\omega_k) = \frac{1}{I \cdot J} \sum_{X(i,j) \in MD} \frac{p^t(\omega_k)\, p^t(X(i,j) \mid \omega_k)}{p^t(X(i,j))}

m_k^{t+1} = \frac{\sum_{X(i,j) \in MD} \dfrac{p^t(\omega_k)\, p^t(X(i,j) \mid \omega_k)}{p^t(X(i,j))}\, X(i,j)}{\sum_{X(i,j) \in MD} \dfrac{p^t(\omega_k)\, p^t(X(i,j) \mid \omega_k)}{p^t(X(i,j))}}

(\sigma_k^2)^{t+1} = \frac{\sum_{X(i,j) \in MD} \dfrac{p^t(\omega_k)\, p^t(X(i,j) \mid \omega_k)}{p^t(X(i,j))}\, \left[X(i,j) - m_k^t\right]^2}{\sum_{X(i,j) \in MD} \dfrac{p^t(\omega_k)\, p^t(X(i,j) \mid \omega_k)}{p^t(X(i,j))}}

These three formulas estimate the prior probability, mean, and variance respectively. Here k indexes the k-th land-cover class; t and t+1 denote the current estimate and the one used in the next iteration; i and j are the row and column of the NDVI image; X(i,j) is the NDVI value at row i, column j; the sums run over the set MD of pixels in the block, whose total count is I·J. The conditional probability p(X(i,j) | ω_k) is computed with formula (3) and the total probability p(X(i,j)) with formula (2). The iteration terminates when the changes in p(ω_i), m_i, and σ_i between two successive iterations are all smaller than a given threshold ε (ε = 10⁻⁸).
Determine the initial vegetation area of the block. Sort the class NDVI means obtained in the previous step in descending order, m_1 ≥ m_2 ≥ … ≥ m_n, with corresponding standard deviations σ_i. The initial vegetation map VI(i,j) (1 = vegetation, 0 = non-vegetation) is then

VI(i,j) = \begin{cases} 1 & NDVI(i,j) > T \\ 0 & NDVI(i,j) \le T \end{cases}

where the threshold T is given by

T = \begin{cases} m_1 - 1.5\sigma_1 & m_1 - m_2 > 1.5(\sigma_1 + \sigma_2) \\ m_2 + 1.5\sigma_2 & m_1 - m_2 \le 1.5(\sigma_1 + \sigma_2) \end{cases}

Repeat step 1.3 to obtain the initial vegetation areas of all blocks, and merge them into the initial vegetation area of the whole image. Then run connected-component labelling on the whole vegetation area to obtain the pixel coordinates of each vegetation patch; the labelling may use 4-neighborhood or 8-neighborhood connectivity.
Step 2: compute visual-perception parameters from the high-spatial-resolution multispectral image.
2.1 Transform the multispectral image from RGB color space to HSI color space.
From the red, green, and blue bands of the high-resolution multispectral image, compute the hue, saturation, and intensity of the vegetation:

H = \begin{cases} \theta & G \ge B \\ 2\pi - \theta & \text{otherwise} \end{cases}, \qquad \theta = \arccos\frac{\frac{1}{2}\left[(R-G)+(R-B)\right]}{\sqrt{(R-G)^2+(R-B)(G-B)}}

S = 1 - \frac{3}{R+G+B}\min(R,G,B), \qquad I = \frac{R+G+B}{3}

2.2 Compute the visual-perception parameters.
Using the initial vegetation patches as samples, compute for each patch the means μ_k and variances σ_k of the vegetation's hue, saturation, and intensity (H, S, I). Then compute the normalized hue parameter

H'(i,j) = \begin{cases} 1 - \dfrac{|H(i,j) - \mu_k|}{\mu_k} & H_L < H(i,j) < H_U \\ 0 & \text{otherwise} \end{cases}

where (i,j) is the pixel coordinate, μ_k is the hue mean of the k-th vegetation patch, σ_k its standard deviation, and H_L, H_U the lower and upper bounds of vegetation hue:

H_L = \mu_k - 2\sigma_k, \qquad H_U = \mu_k + 2\sigma_k
Step 3: compute texture-feature parameters from the high-spatial-resolution panchromatic image.
Generate the grey-level co-occurrence matrix (GLCM) of the panchromatic image and compute six texture parameters of the GLCM: contrast (CON), dissimilarity (DIS), angular second moment (ASM), entropy (ENT), homogeneity (HOM), and correlation (COR), using the formulas below. The texture-computation window depends on the relative resolution of the panchromatic and multispectral images: if the spatial resolution of the panchromatic band is n times that of the multispectral image, the texture window is 2n−1 and a texture value is computed every n pixels. For example, the QuickBird panchromatic band has a resolution of 0.61 m and the multispectral image 2.44 m, so the window for the panchromatic texture computation is 7×7 and a texture feature is computed every 4 pixels. In this way the texture features of the panchromatic image correspond pixel-for-pixel to the visual-perception features and NDVI of the multispectral image.

CON = \sum_{i,j} (i-j)^2\, p(i,j)

DIS = \sum_{i,j} |i-j|\, p(i,j)

ASM = \sum_{i,j} p(i,j)^2

ENT = -\sum_{i,j} p(i,j) \log p(i,j)

HOM = \sum_{i,j} \frac{p(i,j)}{1+(i-j)^2}

COR = \sum_{i,j} \frac{(i-\mu_i)(j-\mu_j)\, p(i,j)}{\sigma_i \sigma_j}
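The window-size rule in step 3 (window 2n−1, one texture sample every n pixels, with n the panchromatic/multispectral resolution ratio) amounts to a two-liner; the QuickBird figures from the text serve as a check. The function name is illustrative.

```python
def texture_window(pan_res_m, ms_res_m):
    """Texture window size and sampling step so that one texture value is
    produced per multispectral pixel; n is the resolution ratio."""
    n = round(ms_res_m / pan_res_m)
    return 2 * n - 1, n

# QuickBird: 0.61 m panchromatic, 2.44 m multispectral -> 7x7 window, step 4
```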
Step 4: automatically grow vegetation regions using the combined features, obtaining a vegetation-region distribution map.
Take each initial vegetation patch as a "seed" and attach pixels adjacent to the "seed" patch to the seed region by region growing. The growing criterion combines texture features and visual-perception features through multi-feature voting, as follows.
For each initial vegetation patch, compute the mean μ and variance σ of each of nine characteristic parameters: intensity (I), normalized hue (H), saturation (S), contrast, dissimilarity, angular energy, entropy, homogeneity, and correlation. For each 8-neighborhood pixel of the initial vegetation patch, judge per feature whether it is vegetation with the discriminant

T_2 = \mu - 2\sigma

V(i,j) = \begin{cases} 1 & M(i,j) > T_2 \\ 0 & \text{otherwise} \end{cases}

Here M(i,j) stands for any one of the nine feature values computed above (the three visual-feature parameters of step 2 and the six texture-feature parameters of step 3), T_2 is the judgment threshold, i and j are the row and column of the pixel, 1 marks vegetation, and 0 non-vegetation. Each of the nine characteristic parameters thus yields one potential-vegetation map.
Multi-feature voting determines the final vegetation area: the nine potential-vegetation maps vote, and a pixel is accepted as final vegetation when at least six of the nine features mark it as 1.
The urban-vegetation extraction method for multi-spatial-resolution remote sensing images provided by the invention uses block-wise processing to accelerate image processing, and, based on the NDVI feature, selects an adaptive dynamic threshold automatically with the expectation-maximization algorithm. It makes full use of both the panchromatic and multispectral bands of the remote sensing image, combining the NDVI, visual, and texture features extracted from them to judge vegetation regions, which improves accuracy. The method is also adaptable: the NDVI vegetation index can, for example, be replaced by the EVI or SR index.
The above merely describes specific embodiments of the invention and is not intended to limit its practical scope; all equivalent changes or modifications completed by those skilled in the art without departing from the spirit and principles indicated by the present invention shall be covered by the scope of the claims of the present invention.

Claims (9)

1. An urban-vegetation extraction method for multi-spatial-resolution remote sensing images, characterized in that the method mainly comprises the following steps:
Step 1: extract initial vegetation patches from the high-spatial-resolution multispectral image;
Step 2: compute visual-perception parameters from the high-spatial-resolution multispectral image;
Step 3: compute texture-feature parameters from the high-spatial-resolution panchromatic image;
Step 4: automatically grow vegetation regions using the combined features, obtaining a vegetation-region distribution map.
2. The urban vegetation extraction method of multi-spatial-resolution remote sensing images according to claim 1, characterized in that step 1 specifically comprises the following steps:
Step 1.1: compute the NDVI vegetation spectral index as

NDVI = \frac{NIR - R}{NIR + R}

where NIR is the pixel value of the near-infrared band and R is the pixel value of the red band;
Step 1.2: partition the NDVI image into blocks of 300*300 pixels; where a block at the image edge is smaller than 300*300 pixels, its actual size is used;
Step 1.3: for each 300*300-pixel NDVI block, automatically obtain a segmentation threshold that separates vegetation from non-vegetation, giving an initial vegetation area;
Step 1.4: repeat step 1.3 to obtain the initial vegetation areas of all blocks, merge them into the initial vegetation area of the whole image, perform connected-component labeling on the whole-image vegetation area, and obtain the pixel coordinates of each vegetation patch.
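Steps 1.1 and 1.2 of claim 2 can be sketched as below. This is a minimal illustration under assumed inputs (float reflectance arrays); the function and variable names are mine, not the patent's.

```python
import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    """NDVI = (NIR - R) / (NIR + R), guarding against division by zero."""
    nir = nir.astype(np.float64)
    red = red.astype(np.float64)
    denom = nir + red
    out = np.zeros_like(denom)
    np.divide(nir - red, denom, out=out, where=denom != 0)
    return out

def split_blocks(img: np.ndarray, size: int = 300):
    """Yield (row, col, block) tiles; edge tiles keep their actual size."""
    h, w = img.shape
    for r in range(0, h, size):
        for c in range(0, w, size):
            yield r, c, img[r:r + size, c:c + size]

# Tiny usage example on a 2x2 image.
nir = np.array([[0.8, 0.6], [0.1, 0.0]])
red = np.array([[0.2, 0.2], [0.1, 0.0]])
v = ndvi(nir, red)
```

Each tile from `split_blocks` would then be thresholded independently, as claim 3 describes.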
3. The urban vegetation extraction method of multi-spatial-resolution remote sensing images according to claim 2, characterized in that in step 1.3 the segmentation threshold is obtained by the following method:
(1) Each NDVI block covers several land-cover types, denoted \omega_i. Assuming the NDVI values of each land-cover class obey a Gaussian density function, the overall NDVI probability distribution function p(x) within a block can be expressed as:

p(x) = \sum_{i=1}^{n} p(\omega_i) \, p(x \mid \omega_i)

where n is the total number of classes and \omega_i is the i-th land-cover class, with mean m_i and variance \sigma_i^2; its probability density function is expressed as

p(x \mid \omega_i) = \frac{1}{\sqrt{2\pi\sigma_i^2}} \exp\left\{ -\frac{(x - m_i)^2}{2\sigma_i^2} \right\}

where p(\omega_i) is the prior probability of each land-cover class, satisfying the condition

\sum_{i=1}^{n} p(\omega_i) = 1

Under these conditions, solving for the vegetation/non-vegetation threshold reduces to estimating the number of land-cover classes n and the mean m_i and variance \sigma_i of each class \omega_i; the concrete steps are as follows:
(2) Compute the frequency histogram f(x) of NDVI within the block: search the block for the maximum NDVI_{max} and minimum NDVI_{min}, divide the interval [NDVI_{min}, NDVI_{max}] into 256 equal bins, and count the number of pixels whose NDVI value falls in each bin, obtaining the histogram f(x);
(3) Compute the first derivative of the histogram f(x), and locate the maxima and minima of the NDVI frequency histogram at the positions where the first derivative is 0;
(4) Count the maxima to determine the number of land-cover classes n; take the NDVI value at each maximum as the initial mean m_i of that class, and compute the initial \sigma_i from the two adjacent minima as

\sigma_i = (NDVI_2 - NDVI_1)/4

where NDVI_2 and NDVI_1 are the NDVI values at the two adjacent minima; the initial prior is

p(\omega_i) = \frac{ncount}{n\_total}

where ncount is the number of pixels whose NDVI value lies between the two adjacent minima and n_total is the total number of pixels in the block;
(5) Based on the expectation-maximization (EM) algorithm, estimate p(\omega_i), m_i, and \sigma_i by loop iteration. Each iteration consists of an expectation step and a maximization step: the expectation step estimates the expected value of the probability density directly from the observed data using the current values of the parameters to be estimated, and the maximization step updates the parameter estimates by maximizing this expectation; the two steps alternate throughout the iterative process until convergence. The update formulas are:

p^{t+1}(\omega_k) = \frac{1}{I \cdot I} \sum_{X(i,j) \in M_D} \frac{p^t(\omega_k) \, p^t(X(i,j) \mid \omega_k)}{p^t(X(i,j))}

m_k^{t+1} = \frac{\sum_{X(i,j) \in M_D} \frac{p^t(\omega_k) \, p^t(X(i,j) \mid \omega_k)}{p^t(X(i,j))} \, X(i,j)}{\sum_{X(i,j) \in M_D} \frac{p^t(\omega_k) \, p^t(X(i,j) \mid \omega_k)}{p^t(X(i,j))}}

(\sigma_k^2)^{t+1} = \frac{\sum_{X(i,j) \in M_D} \frac{p^t(\omega_k) \, p^t(X(i,j) \mid \omega_k)}{p^t(X(i,j))} \, [X(i,j) - m_k^t]^2}{\sum_{X(i,j) \in M_D} \frac{p^t(\omega_k) \, p^t(X(i,j) \mid \omega_k)}{p^t(X(i,j))}}

These three formulas estimate the prior probability, mean, and standard deviation respectively. Here k indexes the k-th land-cover class; t and t+1 denote the current estimate and the estimate used in the next iteration; i and j are the row and column numbers of the NDVI image, X(i,j) is the NDVI value at row i, column j of the NDVI image, and I \cdot I is the number of pixels in the block M_D. The conditional probability p(X(i,j) \mid \omega_k) and the total probability p(X(i,j)) are computed from the corresponding formulas in step (1). The iteration terminates when the changes in p(\omega_i), m_i, and \sigma_i between two successive iterations are all smaller than a given threshold \varepsilon (\varepsilon = 10^{-8});
(6) Determine the initial vegetation area of the block. Sort the class NDVI means obtained in the previous step in descending order as m_1, m_2, \ldots, m_n, with corresponding standard deviations \sigma_1, \sigma_2, \ldots, \sigma_n. The initial vegetation distribution map VI(i,j), where 1 denotes vegetation area and 0 denotes non-vegetation area, is then

VI(i,j) = \begin{cases} 1 & NDVI(i,j) > T \\ 0 & NDVI(i,j) \le T \end{cases}

where the threshold T is obtained from

T = \begin{cases} m_1 - 1.5\sigma_1 & m_1 - m_2 > 1.5(\sigma_1 + \sigma_2) \\ m_2 + 1.5\sigma_2 & m_1 - m_2 \le 1.5(\sigma_1 + \sigma_2) \end{cases}
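The EM estimation and threshold rule of claim 3 can be sketched on a 1-D NDVI sample. This is a simplified illustration: the initial means, sigmas, and priors are assumed here rather than taken from histogram peaks as in step (4), and all names are mine, not the patent's.

```python
import numpy as np

def em_1d(x, means, sigmas, priors, iters=200, eps=1e-8):
    """EM for a 1-D Gaussian mixture; returns updated (priors, means, sigmas)."""
    x = np.asarray(x, dtype=np.float64)
    m, s, p = np.array(means, float), np.array(sigmas, float), np.array(priors, float)
    for _ in range(iters):
        # E-step: responsibilities p(w_k | x) for every sample and class.
        dens = p * np.exp(-(x[:, None] - m) ** 2 / (2 * s ** 2)) / np.sqrt(2 * np.pi * s ** 2)
        resp = dens / dens.sum(axis=1, keepdims=True)
        # M-step: update priors, means, and standard deviations.
        nk = resp.sum(axis=0)
        p_new = nk / len(x)
        m_new = (resp * x[:, None]).sum(axis=0) / nk
        s_new = np.sqrt((resp * (x[:, None] - m_new) ** 2).sum(axis=0) / nk)
        done = max(np.abs(p_new - p).max(), np.abs(m_new - m).max(),
                   np.abs(s_new - s).max()) < eps
        p, m, s = p_new, m_new, s_new
        if done:
            break
    return p, m, s

def vegetation_threshold(m, s):
    """Threshold T from the two largest-mean classes, per the claim's rule."""
    order = np.argsort(m)[::-1]
    m1, m2 = m[order[0]], m[order[1]]
    s1, s2 = s[order[0]], s[order[1]]
    return m1 - 1.5 * s1 if m1 - m2 > 1.5 * (s1 + s2) else m2 + 1.5 * s2

# Synthetic two-class NDVI sample: vegetation-like and non-vegetation values.
rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(0.7, 0.05, 500),
                    rng.normal(0.1, 0.05, 500)])
p, m, s = em_1d(x, means=[0.6, 0.2], sigmas=[0.1, 0.1], priors=[0.5, 0.5])
T = vegetation_threshold(m, s)
```

Pixels with NDVI above T would then be marked 1 in the initial vegetation map VI.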
4. The urban vegetation extraction method of multi-spatial-resolution remote sensing images according to claim 2, characterized in that the connected-component labeling in step 1.4 uses 4-neighborhood or 8-neighborhood connectivity.
5. The urban vegetation extraction method of multi-spatial-resolution remote sensing images according to claim 1, characterized in that step 2 specifically comprises the following steps:
Step 2.1: transform the multispectral image from RGB color space to HSI color space, computing the hue, saturation, and intensity of the vegetation by the following formulas:

H = \begin{cases} \theta & G \ge B \\ 2\pi - \theta & \text{otherwise} \end{cases}, \quad \theta = \arccos\left( \frac{\tfrac{1}{2}[(R - G) + (R - B)]}{\sqrt{(R - G)^2 + (R - B)(G - B)}} \right)

S = 1 - \frac{3}{R + G + B}\min(R, G, B), \quad I = (R + G + B)/3;

Step 2.2: compute the visual perception feature parameters. Taking the initial vegetation patches as samples, compute for each patch the mean \mu_x(H, S, I) and variance \sigma_x(H, S, I) of the vegetation's hue, saturation, and intensity, then compute the normalized hue parameter

H'(i,j) = \begin{cases} 1 - \frac{|H(i,j) - \mu_k|}{\mu_k} & H_L < H(i,j) < H_U \\ 0 & \text{otherwise} \end{cases}

where H(i,j) is the hue, (i,j) is the pixel coordinate, \mu_k is the mean of the k-th vegetation patch with corresponding variance \sigma_k, and H_L and H_U are the lower and upper limits of vegetation hue:

H_L = \mu_k - 2\sigma_k, \quad H_U = \mu_k + 2\sigma_k.
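The per-pixel RGB-to-HSI conversion and normalized hue parameter of claim 5 can be sketched as below, assuming RGB values in [0, 1]; the clamping of the arccos argument is my defensive addition against floating-point drift.

```python
import math

def rgb_to_hsi(r, g, b):
    """Standard HSI conversion for one RGB pixel (values in [0, 1])."""
    num = 0.5 * ((r - g) + (r - b))
    den = math.sqrt((r - g) ** 2 + (r - b) * (g - b))
    # Clamp to [-1, 1] so floating-point noise cannot break acos.
    theta = math.acos(max(-1.0, min(1.0, num / den))) if den > 0 else 0.0
    h = theta if g >= b else 2 * math.pi - theta
    s = 1 - 3 * min(r, g, b) / (r + g + b) if (r + g + b) > 0 else 0.0
    i = (r + g + b) / 3
    return h, s, i

def normalized_hue(h, mu, sigma):
    """H' from claim 5: 1 - |H - mu| / mu inside mu +/- 2*sigma, else 0."""
    if mu - 2 * sigma < h < mu + 2 * sigma:
        return 1 - abs(h - mu) / mu
    return 0.0
```

For example, a pure green pixel (0, 1, 0) maps to a hue of 2π/3 radians (120 degrees) with full saturation.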
6. The urban vegetation extraction method of multi-spatial-resolution remote sensing images according to claim 1, characterized in that step 3 specifically is:
Generate the gray-level co-occurrence matrix for the panchromatic band image and compute its six texture parameters: contrast, dissimilarity, angular second moment (energy), entropy, homogeneity, and autocorrelation coefficient, with the following formulas:

CON = \sum_{i,j} (i - j)^2 \, p(i,j)

DIS = \sum_{i,j} p(i,j) \, |i - j|

ASM = \sum_{i,j} p(i,j)^2

ENT = -\sum_{i,j} p(i,j) \log(p(i,j))

HOM = \sum_{i,j} \frac{p(i,j)}{1 + (i - j)^2}

COR = \sum_{i,j} \frac{p(i,j)(i - \mu_i)(j - \mu_j)}{\sigma_x \sigma_y}

The choice of texture computation window depends on the resolutions of the panchromatic and multispectral images: if the spatial resolution of the panchromatic band is n times that of the multispectral image, the texture window is 2n-1 pixels and the texture is computed once every n pixels.
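The GLCM construction and texture statistics of claim 6 can be sketched for a single offset. This is a simplified illustration (one direction, no windowing, no correlation term); names and the toy image are my assumptions.

```python
import numpy as np

def glcm(img, levels, dr=0, dc=1):
    """Normalized gray-level co-occurrence matrix for offset (dr, dc)."""
    p = np.zeros((levels, levels), dtype=np.float64)
    h, w = img.shape
    for r in range(h):
        for c in range(w):
            r2, c2 = r + dr, c + dc
            if 0 <= r2 < h and 0 <= c2 < w:
                p[img[r, c], img[r2, c2]] += 1
    return p / p.sum()

def glcm_stats(p):
    """Contrast, dissimilarity, energy, entropy, homogeneity of a GLCM."""
    i, j = np.indices(p.shape)
    nz = p[p > 0]  # skip zero cells so log is defined
    return {
        "contrast": float(((i - j) ** 2 * p).sum()),
        "dissimilarity": float((np.abs(i - j) * p).sum()),
        "energy": float((p ** 2).sum()),  # angular second moment
        "entropy": float(-(nz * np.log(nz)).sum()),
        "homogeneity": float((p / (1 + (i - j) ** 2)).sum()),
    }

# Toy 2-level image; horizontal offset (0, 1).
img = np.array([[0, 0, 1], [0, 1, 1]])
stats = glcm_stats(glcm(img, levels=2))
```

In practice a library such as scikit-image provides equivalent GLCM routines; the sliding-window evaluation every n pixels would wrap these calls.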
7. The urban vegetation extraction method of multi-spatial-resolution remote sensing images according to claim 1, characterized in that step 4 specifically is: taking each initial vegetation patch as a "seed", a region-growing method attaches the pixels adjacent to the "seed" patch to the seed region.
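The seed-based region growing of claim 7 can be sketched as below. The acceptance predicate is a stand-in for the multi-feature vote of claims 8 and 9; the toy NDVI array and all names are illustrative assumptions.

```python
import numpy as np
from collections import deque

def grow_region(seed_mask, accept):
    """Grow seed_mask (bool array) into 8-neighbors where accept(r, c) is True."""
    mask = seed_mask.copy()
    h, w = mask.shape
    frontier = deque(zip(*np.nonzero(mask)))
    while frontier:
        r, c = frontier.popleft()
        for dr in (-1, 0, 1):
            for dc in (-1, 0, 1):
                r2, c2 = r + dr, c + dc
                if 0 <= r2 < h and 0 <= c2 < w and not mask[r2, c2] and accept(r2, c2):
                    mask[r2, c2] = True
                    frontier.append((r2, c2))  # newly added pixels grow further
    return mask

ndvi = np.array([[0.8, 0.7, 0.1],
                 [0.6, 0.2, 0.1],
                 [0.1, 0.1, 0.1]])
seeds = np.zeros_like(ndvi, dtype=bool)
seeds[0, 0] = True
grown = grow_region(seeds, lambda r, c: ndvi[r, c] > 0.5)
```

Here the seed at (0, 0) absorbs its two high-NDVI neighbors and stops at the low-NDVI pixels.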
8. The urban vegetation extraction method of multi-spatial-resolution remote sensing images according to claim 7, characterized in that the region-growing criterion combines the texture features and visual perception features and is decided by multi-feature voting.
9. The urban vegetation extraction method of multi-spatial-resolution remote sensing images according to claim 8, characterized in that the region-growing criterion is as follows:
(1) For each initial vegetation patch, compute the mean \mu and variance \sigma of each of the feature parameters brightness, normalized hue, saturation, contrast, dissimilarity, angular energy, entropy, homogeneity, and autocorrelation coefficient. For the 8-neighborhood pixels of the initial vegetation patch, judge for each feature whether the pixel is vegetation, with the discriminant function defined as

T_2 = \mu - 2\sigma

V(i,j) = \begin{cases} 1 & M(i,j) > T_2 \\ 0 & \text{otherwise} \end{cases}

where M(i,j) is the computed feature parameter, T_2 is the judgment threshold, i and j are the row and column numbers of the pixel, 1 denotes vegetation area and 0 denotes non-vegetation area; this yields one potential vegetation area map for each of the nine feature parameters;
(2) Determine the final vegetation area by multi-feature voting: the nine potential vegetation area maps are voted pixel by pixel, and a pixel is accepted as final vegetation area when six or more of the features label it 1.
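The multi-feature vote of claim 9, step (2), can be sketched as below; the nine binary maps here are hand-built toy data, and the function name is mine.

```python
import numpy as np

def vote_vegetation(maps, min_votes=6):
    """maps: sequence of binary (0/1) maps; returns the final vegetation map."""
    stack = np.asarray(maps)
    return (stack.sum(axis=0) >= min_votes).astype(np.uint8)

# Three pixels: pixel 0 gets 9 votes, pixel 1 gets 5, pixel 2 gets 6.
maps = [np.array([1, 1, 1]),
        np.array([1, 1, 1]),
        np.array([1, 1, 1]),
        np.array([1, 1, 1]),
        np.array([1, 1, 1]),
        np.array([1, 0, 1]),
        np.array([1, 0, 0]),
        np.array([1, 0, 0]),
        np.array([1, 0, 0])]
final = vote_vegetation(maps)
```

Only pixels reaching the six-vote threshold survive into the final vegetation area map.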
CN201510186167.6A 2015-04-17 2015-04-17 The urban vegetation extraction method of multiple-spatial resolution remote sensing image Expired - Fee Related CN104851113B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510186167.6A CN104851113B (en) 2015-04-17 2015-04-17 The urban vegetation extraction method of multiple-spatial resolution remote sensing image

Publications (2)

Publication Number Publication Date
CN104851113A true CN104851113A (en) 2015-08-19
CN104851113B CN104851113B (en) 2017-11-03

Family

ID=53850740

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510186167.6A Expired - Fee Related CN104851113B (en) 2015-04-17 2015-04-17 The urban vegetation extraction method of multiple-spatial resolution remote sensing image

Country Status (1)

Country Link
CN (1) CN104851113B (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050171754A1 (en) * 2002-03-11 2005-08-04 Microsoft Corporation Automatic scenery object generation
CN101832769A (en) * 2010-03-30 2010-09-15 中国农业大学 Method and system for estimating vegetation coverage degree in diggings based on close-up photography
JP2011069757A (en) * 2009-09-28 2011-04-07 Hitachi Solutions Ltd Spectral analysis apparatus
CN103093233A (en) * 2012-12-03 2013-05-08 中国环境科学研究院 Forest classification method based on object-oriented high-resolution remote sensing image
CN103593858A (en) * 2013-11-13 2014-02-19 北京林业大学 Method for filtering out green vegetation data in ground laser radar scanning data
EP2801951A1 (en) * 2013-05-08 2014-11-12 Honeywell International Inc. Aerial image segmentation for refineries

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
LI CHENGFAN ET AL.: "An Object-Oriented Method for Extracting Urban Green Space from Remote Sensing Imagery", SCIENCE OF SURVEYING AND MAPPING *
XIONG YIQUN ET AL.: "Research on Object-Oriented Methods for Extracting Urban Green Space Information", JOURNAL OF EAST CHINA NORMAL UNIVERSITY (NATURAL SCIENCE) *

Cited By (27)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108369635A (en) * 2015-11-08 2018-08-03 阿格洛英公司 The method with analysis is obtained for aerial image
CN105405148B (en) * 2015-11-17 2017-11-14 中国科学院遥感与数字地球研究所 A kind of remote sensing image Chinese white poplar recognition methods of combination trees shadow character
CN105405148A (en) * 2015-11-17 2016-03-16 中国科学院遥感与数字地球研究所 Remote sensing tomentosa recognition method in combination with tree shadow feature
WO2017121058A1 (en) * 2016-01-13 2017-07-20 南京大学 All-optical information acquisition system
CN107657631B (en) * 2016-12-23 2018-09-25 航天星图科技(北京)有限公司 A kind of generation method of the microwave spectrum mixed image based on vegetation distribution
CN107657631A (en) * 2016-12-23 2018-02-02 航天星图科技(北京)有限公司 A kind of generation method of the microwave spectrum mixed image based on vegetation distribution
CN107194187A (en) * 2017-06-06 2017-09-22 国家基础地理信息中心 Take the ground mulching sample size computational methods of space composition and configuration into account
CN107194187B (en) * 2017-06-06 2019-08-06 国家基础地理信息中心 Take the ground mulching sample size calculation method of space composition and configuration into account
CN108009542B (en) * 2017-11-01 2021-06-15 华中农业大学 Weed image segmentation method in rape field environment
CN108009542A (en) * 2017-11-01 2018-05-08 华中农业大学 Weed images dividing method under rape field environment
CN108089850B (en) * 2018-01-02 2020-11-24 北京建筑大学 Method for updating earth surface coverage product increment based on image collaborative segmentation and ecological geographical partitioning rule base
CN108089850A (en) * 2018-01-02 2018-05-29 北京建筑大学 A kind of ground mulching product increment updating method based on image collaboration segmentation with ecogeography zoning ordinance storehouse
CN109033543A (en) * 2018-06-29 2018-12-18 北京师范大学 A kind of heterogeneous area's vegetation fraction estimation method, device and equipment of earth's surface
CN109033543B (en) * 2018-06-29 2020-06-30 北京师范大学 Method, device and equipment for estimating vegetation coverage of heterogeneous land areas
CN108986113A (en) * 2018-07-06 2018-12-11 航天星图科技(北京)有限公司 A kind of block parallel multi-scale division algorithm based on LLTS frame
CN109583378A (en) * 2018-11-30 2019-04-05 东北大学 A kind of vegetation coverage extracting method and system
CN110826480A (en) * 2019-11-04 2020-02-21 湖南城市学院 Water body automatic extraction method based on ant colony algorithm
CN111402182A (en) * 2020-03-18 2020-07-10 中国资源卫星应用中心 Land-coverage-information-based midsplit image synthesis method
CN111402182B (en) * 2020-03-18 2023-04-28 中国资源卫星应用中心 Middle-divided image synthesis method based on land coverage information
CN111814809A (en) * 2020-07-17 2020-10-23 国家基础地理信息中心 Remote sensing image vegetation information extraction method and device
CN114273252A (en) * 2021-11-26 2022-04-05 山东安信种苗股份有限公司 Intelligent vegetable seedling grading method
CN116188809A (en) * 2023-05-04 2023-05-30 中国海洋大学 Texture similarity judging method based on visual perception and sequencing driving
CN116188809B (en) * 2023-05-04 2023-08-04 中国海洋大学 Texture similarity judging method based on visual perception and sequencing driving
CN116740580A (en) * 2023-08-16 2023-09-12 山东绿博园市政工程有限公司 Garden engineering data processing method based on remote sensing technology
CN117132423A (en) * 2023-08-22 2023-11-28 珠海市经典电子有限公司 Park management system
CN117132423B (en) * 2023-08-22 2024-04-12 深圳云创友翼科技有限公司 Park management system
CN117152634A (en) * 2023-09-12 2023-12-01 广东省水利水电科学研究院 Multi-source satellite image floating plant identification method and system based on chromaticity index

Also Published As

Publication number Publication date
CN104851113B (en) 2017-11-03

Similar Documents

Publication Publication Date Title
CN104851113B (en) The urban vegetation extraction method of multiple-spatial resolution remote sensing image
CN108573276B (en) Change detection method based on high-resolution remote sensing image
CN106651872A (en) Prewitt operator-based pavement crack recognition method and system
CN103632363B (en) Object level high-resolution remote sensing image change detecting method based on Multiscale Fusion
CN103971115B (en) Automatic extraction method for newly-increased construction land image spots based on NDVI and PanTex index
CN103927741B (en) SAR image synthesis method for enhancing target characteristics
CN101840581B (en) Method for extracting profile of building from satellite remote sensing image
CN103559500B (en) A kind of multi-spectral remote sensing image terrain classification method based on spectrum Yu textural characteristics
CN103077515B (en) Multi-spectral image building change detection method
CN107862667A (en) A kind of city shadow Detection and minimizing technology based on high-resolution remote sensing image
CN102902956B (en) A kind of ground visible cloud image identifying processing method
CN104217440B (en) A kind of method extracting built-up areas from remote sensing images
CN105956557A (en) Object-oriented timing sequence remote sensing image cloud coverage area automatic detection method
CN107832797B (en) Multispectral image classification method based on depth fusion residual error network
CN110390255A (en) High-speed rail environmental change monitoring method based on various dimensions feature extraction
CN112488050A (en) Color and texture combined aerial image scene classification method and system
CN107818303A (en) Unmanned plane oil-gas pipeline image automatic comparative analysis method, system and software memory
CN104268590A (en) Blind image quality evaluation method based on complementarity combination characteristics and multiphase regression
CN103927558A (en) Winter wheat remote sensing recognition method based on hardness change detection
CN107292328A (en) The remote sensing image shadow Detection extracting method and system of multiple dimensioned multiple features fusion
CN104952070A (en) Near-rectangle guide based remote-sensing cornfield image segmentation method
CN107169946A (en) Image interfusion method based on non-negative sparse matrix Yu hypersphere color transformation
CN107992856A (en) High score remote sensing building effects detection method under City scenarios
CN110458208A (en) Hyperspectral image classification method based on information measure
CN104573662B (en) A kind of cloud sentences method and system

Legal Events

Date Code Title Description
DD01 Delivery of document by public notice

Addressee: Dian Yuanyong

Document name: Notification of Acceptance of Patent Application

C06 Publication
PB01 Publication
EXSB Decision made by SIPO to initiate substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20171103

Termination date: 20180417