CN110070545A - A method for automatically extracting urban built-up areas from urban texture feature density - Google Patents

A method for automatically extracting urban built-up areas from urban texture feature density

Info

Publication number
CN110070545A
CN110070545A (application CN201910212324.4A)
Authority
CN
China
Prior art keywords
image
urban
town
density
pixel
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201910212324.4A
Other languages
Chinese (zh)
Other versions
CN110070545B (en)
Inventor
胡龙飞 (Hu Longfei)
仲波 (Zhong Bo)
罗小波 (Luo Xiaobo)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Chongqing University of Post and Telecommunications
Original Assignee
Chongqing University of Post and Telecommunications
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Chongqing University of Post and Telecommunications
Priority to CN201910212324.4A
Publication of CN110070545A
Application granted
Publication of CN110070545B
Active legal status
Anticipated expiration

Classifications

    • G: Physics
    • G06: Computing; calculating or counting
    • G06T: Image data processing or generation, in general
    • G06T5/00: Image enhancement or restoration
    • G06T5/10: Image enhancement or restoration using non-spatial domain filtering
    • G: Physics
    • G06: Computing; calculating or counting
    • G06T: Image data processing or generation, in general
    • G06T7/00: Image analysis
    • G06T7/10: Segmentation; edge detection
    • G06T7/11: Region-based segmentation
    • G: Physics
    • G06: Computing; calculating or counting
    • G06T: Image data processing or generation, in general
    • G06T7/00: Image analysis
    • G06T7/10: Segmentation; edge detection
    • G06T7/136: Segmentation; edge detection involving thresholding
    • G: Physics
    • G06: Computing; calculating or counting
    • G06T: Image data processing or generation, in general
    • G06T7/00: Image analysis
    • G06T7/10: Segmentation; edge detection
    • G06T7/194: Segmentation; edge detection involving foreground-background segmentation
    • G: Physics
    • G06: Computing; calculating or counting
    • G06T: Image data processing or generation, in general
    • G06T2207/00: Indexing scheme for image analysis or image enhancement
    • G06T2207/10: Image acquisition modality
    • G06T2207/10032: Satellite or aerial image; remote sensing
    • G: Physics
    • G06: Computing; calculating or counting
    • G06T: Image data processing or generation, in general
    • G06T2207/00: Indexing scheme for image analysis or image enhancement
    • G06T2207/20: Special algorithmic details
    • G06T2207/20048: Transform domain processing
    • G06T2207/20056: Discrete and fast Fourier transform [DFT, FFT]
    • Y: General tagging of new technological developments; general tagging of cross-sectional technologies spanning over several sections of the IPC; technical subjects covered by former USPC cross-reference art collections [XRACs] and digests
    • Y02: Technologies or applications for mitigation or adaptation against climate change
    • Y02A: Technologies for adaptation to climate change
    • Y02A30/00: Adapting or protecting infrastructure or their operation
    • Y02A30/60: Planning or developing urban green infrastructure

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Image Analysis (AREA)

Abstract

A method for automatically extracting urban built-up areas from urban texture feature density is claimed by the present invention. The invention belongs to the field of remote sensing image interpretation. Aimed at high-resolution remote sensing data, the method constructs an urban texture feature density image using a multi-scale simple linear iterative clustering algorithm, so that built-up areas can be extracted automatically. First, the J-M distance is used to analyze the sensitivity of the different gray-level co-occurrence matrix texture features for urban extraction, and the texture dissimilarity feature is chosen as the classification basis. Then superpixel segmentation is performed at multiple scales with simple linear iterative clustering (SLIC), each superpixel serving as the local extent for computing feature density; the feature density images computed from the repeated segmentations are superimposed to construct the feature density image of the image to be classified. The extraction results show good completeness and accuracy; in particular, the urban boundaries extracted by this method conform closely to the actual boundaries and can reach the quality of manual interpretation.

Description

A method for automatically extracting urban built-up areas from urban texture feature density
Technical field
The invention belongs to the field of remote sensing image information extraction, and relates in particular to a method for high-resolution remote sensing imagery (10 m and finer) that, based on multi-layer superpixel segmentation, constructs an urban texture feature density image and automatically extracts the urban built-up areas in the image.
Background art
Urban built-up area extraction from remote sensing imagery means identifying and delineating built-up areas in satellite, aerial, or UAV images. The extent, distribution, and change of built-up areas are of great significance for urban planning, land-resource allocation, and environmental monitoring, and also form the data basis of many research directions, attracting wide attention from researchers. However, complicated land-cover information and varied settlement forms make urban extraction a challenging task. With the rapid development of the world aerospace industry, the quality and resolution of acquired imagery keep improving and high-resolution imagery has become easier to obtain; its clear urban texture information makes it a significant data source for extracting urban information from remote sensing imagery.
Urban built-up area extraction methods fall into two classes: spectral methods and texture methods. Spectral methods distinguish urban from non-urban areas through their differences on the spectral curves of multispectral medium/high-resolution data, and are mostly used for large-area extraction and change detection; limited by resolution, however, they can hardly yield a precise urban extent. Texture methods exploit the texture differences between urban and non-urban areas in high-resolution data, and can be further divided into edge-density methods, line-detection methods, and texture-model methods. The first two depend too much on the quality of the input data and require resolutions finer than 2 m. Texture-model methods are widely used, typically building texture features with Gabor filters or the gray-level co-occurrence matrix; Gabor filtering demands high resolution, while methods based on the gray-level co-occurrence matrix apply to data of various resolutions. Existing texture methods still place very high demands on data quality, usually requiring resolutions finer than 5 m; such high-resolution data are costly to obtain and ill-suited to large-scale built-up area extraction. Moreover, existing methods all rely too heavily on the quality of the data and the accuracy of the extracted features, while the "same spectrum, different objects" phenomenon is pervasive in remote sensing imagery: many natural surfaces share urban-like textures, causing massive false extraction.
Remote sensing satellite data with 10 m spatial resolution not only offer clear texture information, with smoother texture inside built-up areas, but can also be acquired free of charge; yet they have so far seldom been used for urban extraction. Based on the characteristics of 10 m resolution remote sensing satellite imagery, the present invention explores texture features suited to urban extraction and proposes a method of constructing feature density via superpixel segmentation, improving the precision of urban extraction.
Summary of the invention
The present invention seeks to address the above problems of the prior art. It proposes a method whose extraction results have good completeness and accuracy; in particular, the extracted urban boundaries conform closely to the actual boundaries and can reach the quality of manual interpretation. The technical scheme of the invention is as follows:
A method for automatically extracting urban built-up areas from urban texture feature density, comprising the following steps:
1) Obtain remote sensing imagery of the 10 m resolution visible bands, and perform cropping, mosaicking, and cloud removal to obtain the region to be extracted;
2) Construct a grayscale image from the preprocessed imagery and, based on the Fourier transform, filter out the low-frequency texture to obtain the high-frequency-component grayscale image;
3) From the high-frequency-component grayscale image of step 2), compute the gray-level co-occurrence matrix dissimilarity feature in several directions and extract a rotation-invariant texture feature;
4) Map the image composed of the rotation-invariant texture feature of step 3) to a 0-255 gray-level image, and extract the feature area with the Otsu threshold method (OTSU);
5) Apply the simple linear iterative clustering (SLIC) algorithm at multiple scales to the region image of step 1) to obtain multi-layer superpixel partitions; taking each superpixel as the unit, compute the feature density image at each scale, and superimpose all feature density images to construct a refined feature density image.
Further, in step 1) the acquired imagery is cropped, mosaicked, and cloud-cleaned. Specifically: a single remote sensing scene usually cannot fully cover the region to be extracted, so several adjacent scenes are downloaded and mosaicked into a fully covering image with the Map Based Mosaic function of the ENVI software; the part outside the region is then cropped with ENVI's Subset Data from ROIs function according to the extent of the region to be extracted (a vector file, .shp). Cloud cover is very common in remote sensing imagery, and cloud-covered parts of the image to be processed cannot yield good extraction results, so cloud removal is required. A common approach: manually identify the cloudy areas, find cloud-free images of the same areas from other dates, crop those areas and mosaic them into the image to be processed, obtaining a clean cloud-free image. The preprocessing can be completed with the ENVI software.
Further, step 2) filters out the low-frequency texture from the grayscale image based on the Fourier transform to obtain the high-frequency-component grayscale image. Specifically: the spatial-domain image is transformed to the frequency domain with the two-dimensional fast Fourier transform (FFT):

F(u, v) = Σ_{x=0}^{M-1} Σ_{y=0}^{N-1} f(x, y) · e^{-j2π(ux/M + vy/N)}

where M and N are the width and height of the image, (x, y) are the spatial coordinates, (u, v) are the coordinates of the frequency image, f(x, y) is the input image, and F(u, v) is the resulting Fourier frequency-domain image. The frequency-domain image is filtered with a Gaussian low-pass filter:

H(u, v) = e^{-D²(u, v) / (2σ²)},   F′(u, v) = F(u, v) · H(u, v)

where H(u, v) is the Gaussian filtering weight function, D is the Euclidean distance from point (u, v) to the center of the Gaussian window, and σ controls the spread of the Gaussian weight function; F(u, v) is the frequency-domain image and F′(u, v) the filtered frequency-domain image. The filter result is transformed back to the spatial domain with the inverse fast Fourier transform, giving the background image f′(x, y); subtracting it from the original grayscale image f(x, y) suppresses the low-frequency component:

g(x, y) = f(x, y) − f′(x, y)

where g(x, y) is the final high-frequency-component image.
Further, step 3) uses the high-frequency-component image obtained in the previous step to construct the gray-level co-occurrence matrix dissimilarity feature in several directions and extract a rotation-invariant texture feature. The construction is as follows.

The gray-level co-occurrence matrix is an n × n matrix, n being the number of gray levels of the image. It is defined as M(i, j) = [m(i, j, Δx, Δy); i, j ∈ (1, 2, …, n)], an n × n matrix whose element in row i, column j is the number of pixel pairs within a w × w window of the image whose positional offset is (Δx, Δy) and whose gray values are i and j respectively. The dissimilarity is

DIS = Σ_{i=1}^{n} Σ_{j=1}^{n} |i − j| · m(i, j)

where m(i, j) is the element of M in row i, column j. Co-occurrence matrices are built for 10 directions, yielding 10 dissimilarity features; the minimum over the 10 directions forms the rotation-invariant texture feature. The 10 directions are (Δx, Δy) ∈ {(1,0), (2,0), (2,1), (1,1), (1,2), (−2,1), (−1,1), (1,2), (0,2), (0,1)}.
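The GLCM dissimilarity and its rotation-invariant minimum can be sketched directly in NumPy for one quantized window. This is a minimal sketch, not the patent's implementation; the offset list is reproduced as printed (note it lists (1, 2) twice), and the function names are illustrative.

```python
import numpy as np

# Offsets (dx, dy) from the patent's 10 directions, reproduced as printed.
OFFSETS = [(1, 0), (2, 0), (2, 1), (1, 1), (1, 2),
           (-2, 1), (-1, 1), (1, 2), (0, 2), (0, 1)]

def glcm_dissimilarity(window, levels, dx, dy):
    """Dissimilarity DIS = sum |i - j| * m(i, j) of one GLCM offset,
    with the co-occurrence counts normalized to probabilities."""
    h, w = window.shape
    glcm = np.zeros((levels, levels), dtype=np.float64)
    for y in range(h):
        for x in range(w):
            y2, x2 = y + dy, x + dx
            if 0 <= y2 < h and 0 <= x2 < w:
                glcm[window[y, x], window[y2, x2]] += 1
    if glcm.sum() > 0:
        glcm /= glcm.sum()
    i, j = np.indices((levels, levels))
    return float(np.sum(np.abs(i - j) * glcm))

def rotation_invariant_dissimilarity(window, levels=32):
    """Minimum dissimilarity over all offsets (rotation-invariant feature)."""
    return min(glcm_dissimilarity(window, levels, dx, dy)
               for dx, dy in OFFSETS)
```

A window of vertical stripes, for instance, has dissimilarity 1 along the horizontal offset but 0 along the vertical one, so its rotation-invariant feature is 0, which is exactly why the minimum suppresses directional textures.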
Further, step 4) extracts the feature area with the Otsu threshold method (OTSU), specifically:
1) linearly map the feature image to a 0-255 gray-level image and compute the gray-level histogram, i.e. the pixel count of each gray level;
2) normalize the histogram, i.e. divide each gray level's pixel count by the total number of pixels;
3) maximize the between-class variance. Let t be a gray threshold; let w0 be the proportion of pixels with gray levels 0 to t in the whole image and u0 their mean gray value, and let w1 be the proportion of pixels with gray levels t to 255 and u1 their mean gray value. Compute g = w0 · w1 · (u0 − u1)². Sweeping t over 0-255, the t that maximizes g is the threshold; the part above t is the feature area and the part below t belongs to the other areas.
Further, in step 5) the superpixels obtained by the linear iterative clustering algorithm SLIC are arranged uniformly in a honeycomb pattern. Specifically: the visible bands of the image are segmented n times at different scales with the SLIC algorithm, the scales being {N_i, i ∈ 1, 2, 3, …, n}, where N_i is the average pixel count of a superpixel in the i-th segmentation, N_1 is the smallest segmentation scale, N_n the largest, and the scale increases from N_1 to N_n with a constant step S. The n superpixel layers are written {L_i, i ∈ 1, 2, 3, …, n}; {P_ij, j ∈ 1, 2, 3, …} denotes the j-th superpixel of L_i, and Num_ij the pixel count of P_ij.

Based on the dissimilarity feature image with window size 23 from step 3), the feature map obtained with the OTSU method is denoted L_f. From L_f the feature density of every P_ij is computed as

D_ij = Num_f / Num_ij

where Num_f is the number of feature points of L_f falling within the extent of P_ij. This gives each layer's feature density image L′_i; accumulating them gives the final feature density distribution

L_d = Σ_{i=1}^{n} L′_i

i.e. the corresponding pixels of all feature density images are added to obtain the total density L_d. A suitable threshold is then set on the density image L_d, and the denser part is the urban extraction result; the threshold is set between 0 and 1 and multiplied by the number of segmentation layers, and the part above the threshold is extracted as urban.
The advantages and beneficial effects of the present invention:
Based on the advantages of frequency-domain filtering, the present invention proposes a filtering method for remote sensing images that removes the interference of low-frequency information in the original image; consequently, in the texture feature image computed in step (3), the gap between the gray values of urban texture and natural texture widens, so the OTSU algorithm obtains a more accurate feature area.
The present invention proposes using superpixels as the computation extent of local feature density, introducing the boundaries between land covers as constraints so that different land covers do not share the same local range when computing feature density, which yields a better-behaved density distribution. Superimposing multi-layer feature densities increases the density contrast inside and outside the built-up areas while refining the urban edges. With more accurate urban information obtained, extraction with an empirical threshold gives better results.
Extraction experiments on Sentinel-2 MSI L1C visible-band remote sensing imagery show that the built-up area extraction method proposed by the present invention improves considerably on existing mainstream methods in both completeness and accuracy.
Brief description of the drawings
Fig. 1 is the Sentinel-2 MSI L1C luminance image
Fig. 2 is the image after frequency-domain filtering
Fig. 3 is the computed rotation-invariant dissimilarity feature
Fig. 4 is the feature area from OTSU binarization
Fig. 5 is the feature density computed from the multi-layer superpixels
Fig. 6 is the urban extraction result of the threshold method
Fig. 7 is the method flowchart
Specific embodiments
The technical solutions in the embodiments of the present invention will be described clearly and in detail below with reference to the drawings; the described examples are only some embodiments of the invention.
The technical solution with which the present invention solves the above technical problem is:
Targeting the land-cover characteristics of 10 m remote sensing imagery, the present invention chooses the rotation-invariant dissimilarity feature of 10 directions and obtains the feature area with the OTSU threshold method. Addressing the over-reliance of existing methods on feature extraction, the invention proposes a way of computing local feature density: exploiting the property of SLIC segmentation to fit land-cover edges, multi-layer superpixels are built and feature density is computed inside each superpixel, lowering the feature density outside towns and achieving high-precision separation of urban and non-urban areas.
The technical scheme of the method is as follows.
(1) Obtain remote sensing imagery of the 10 m resolution visible bands and perform the related preprocessing;
(2) Construct a grayscale image and filter out the low-frequency information based on the Fourier transform, obtaining the high-frequency-component grayscale image;
(3) From the high-frequency image of step (2), compute the gray-level co-occurrence matrix dissimilarity feature of multiple directions and extract the rotation-invariant texture feature;
(4) Map the rotation-invariant texture feature image of step (3) to a 0-255 gray-level image and extract the feature area with the Otsu threshold method (OTSU);
(5) Apply the simple linear iterative clustering (SLIC) algorithm at multiple scales to the true-color composite of Sentinel-2 to obtain multi-layer superpixel partitions; with superpixels as the unit, obtain the feature density image at each scale and superimpose all feature density images to construct the refined feature density image;
(6) Finally choose a suitable density threshold and extract the urban area.
The invention is further described below:
(1) Crop, mosaic, and cloud-clean the acquired imagery to obtain a clean and complete visible-band image.
(2) Construct a grayscale image from the luminance information of the preprocessed multiband image, then apply frequency-domain filtering to it. The spatial-domain image is transformed to the frequency domain with the two-dimensional fast Fourier transform (FFT): F(u, v) = Σ_{x=0}^{M-1} Σ_{y=0}^{N-1} f(x, y) · e^{-j2π(ux/M + vy/N)}, where M and N are the width and height of the image, (x, y) the spatial coordinates, (u, v) the coordinates of the frequency image, f(x, y) the input image, and F(u, v) the resulting Fourier frequency-domain image. The frequency-domain image is filtered with a Gaussian low-pass filter: H(u, v) = e^{-D²(u, v)/(2σ²)}, F′(u, v) = F(u, v) · H(u, v), where H(u, v) is the Gaussian filtering weight function, D the Euclidean distance from point (u, v) to the center of the Gaussian window, and σ the spread of the Gaussian weight function; F(u, v) is the frequency-domain image and F′(u, v) the filtered frequency-domain image. The filter result is transformed back to the spatial domain with the inverse fast Fourier transform, giving the background image f′(x, y); subtracting it from the original grayscale image f(x, y) suppresses the low-frequency component: g(x, y) = f(x, y) − f′(x, y), the final high-frequency-component image.
(3) The gray-level co-occurrence matrix is an n × n matrix, n being the number of gray levels of the image (usually 256 for a grayscale image). It is defined as M(i, j) = [m(i, j, Δx, Δy); i, j ∈ (1, 2, …, n)], an n × n matrix whose element in row i, column j is the number of pixel pairs within a w × w window of the image whose positional offset is (Δx, Δy) and whose gray values are i and j respectively. The present invention builds co-occurrence matrices for 10 positional relationships: (Δx, Δy) ∈ {(1,0), (2,0), (2,1), (1,1), (1,2), (−2,1), (−1,1), (1,2), (0,2), (0,1)}. The dissimilarity is DIS = Σ_{i=1}^{n} Σ_{j=1}^{n} |i − j| · m(i, j), where m(i, j) is the element of M in row i, column j. Ten co-occurrence matrices are built from the 10 directions, yielding the dissimilarity features of the 10 directions; the minimum over the 10 directions forms the rotation-invariant texture feature.
(4) The Otsu threshold method (Nobuyuki Otsu; OTSU) is a classical automatic thresholding algorithm that divides an image into two parts, background and target, according to its gray-level statistics. The larger the between-class variance between background and target, the larger the difference between the two parts of the image; misclassifying part of the target as background, or part of the background as target, shrinks this difference, so maximizing the between-class variance minimizes the misclassification probability. The rotation-invariant texture feature image obtained in step (3) is mapped to a 0-255 gray-level image and split into two parts with OTSU; the part with larger gray values is the feature area to be extracted.
(5) Simple linear iterative clustering (SLIC) is an efficient superpixel segmentation algorithm. The superpixels it produces are evenly distributed in a honeycomb pattern and of similar size, and each superpixel deforms adaptively to follow the boundaries of the base map, so the superpixels line up along land-cover edges. SLIC works in the CIELAB color space: the (l, a, b) color and spatial coordinates (x, y) of each pixel form a 5-dimensional vector, and the similarity between two pixels is measured with Euclidean distances:

d_c = sqrt((l_i − l_j)² + (a_i − a_j)² + (b_i − b_j)²)
d_s = sqrt((x_i − x_j)² + (y_i − y_j)²)
D = sqrt((d_c / N_c)² + (d_s / N_s)²)

where i and j are the two pixels, d_c is the color distance, d_s the spatial distance, D the similarity of the two pixels, N_c the expected maximum color distance within a class, and N_s the expected maximum spatial distance within a class. A pixel's distance D to the cluster centers determines which superpixel it belongs to. N_c and N_s are two important parameters influencing the clustering result. Normally N_s is determined by the number of cluster centers: N_s = sqrt(N / K), where N is the total pixel count of the image and K the number of cluster centers, the sole input parameter of the algorithm. In the present invention the average pixel count of a superpixel measures the superpixel segmentation scale.
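The combined SLIC distance can be shown in a few lines. A minimal sketch under the assumption of the standard SLIC form D = sqrt((d_c/N_c)² + (d_s/N_s)²) over 5-vectors (l, a, b, x, y); the function name is illustrative.

```python
import numpy as np

def slic_distance(p1, p2, Nc, Ns):
    """Combined SLIC similarity of two pixels given as 5-vectors
    (l, a, b, x, y): D = sqrt((dc/Nc)**2 + (ds/Ns)**2).
    In practice Ns is the grid spacing sqrt(N/K) for N pixels, K centers."""
    l1, a1, b1, x1, y1 = p1
    l2, a2, b2, x2, y2 = p2
    dc = np.sqrt((l1 - l2) ** 2 + (a1 - a2) ** 2 + (b1 - b2) ** 2)  # color
    ds = np.sqrt((x1 - x2) ** 2 + (y1 - y2) ** 2)                   # spatial
    return np.sqrt((dc / Nc) ** 2 + (ds / Ns) ** 2)
```

The normalizers N_c and N_s trade color fidelity against spatial compactness: a larger N_c lets color differences matter less, pulling superpixels toward regular hexagons.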
Based on the SLIC algorithm, the present invention segments the visible bands of the image n times at different scales, the scales being {N_i, i ∈ 1, 2, 3, …, n}, where N_i is the average pixel count of a superpixel in the i-th segmentation, N_1 the smallest segmentation scale and N_n the largest, increasing from N_1 to N_n with a constant step S. The n superpixel layers obtained are written {L_i, i ∈ 1, 2, 3, …, n}; {P_ij, j ∈ 1, 2, 3, …} denotes the j-th superpixel of L_i and Num_ij the pixel count of P_ij. The feature map obtained with the OTSU method from the dissimilarity feature image of step (3) (window size 23) is denoted L_f. From L_f the feature density of every P_ij is computed: D_ij = Num_f / Num_ij, where Num_f is the number of feature points of L_f within the extent of P_ij. This gives each layer's feature density L′_i; accumulating them gives the final feature density distribution L_d = Σ_{i=1}^{n} L′_i, i.e. the corresponding pixels of all feature density images are added to obtain the total density L_d. Setting N_1 = 300, S = 10, and n = 5 gives five superpixel segmentation layers; the feature density of each is obtained as above, and the layers are superimposed into the feature density image L_d.
(6) A suitable threshold is set on the feature density image L_d obtained in step (5), and the denser part is the urban extraction result. A threshold of 0.7 × n, n being the number of segmentation layers, usually gives good results.
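The final thresholding step is a one-liner; a minimal sketch with illustrative names, using the 0.7 × n rule stated above as the default:

```python
import numpy as np

def extract_urban(Ld, n_layers, ratio=0.7):
    """Binary urban mask from the stacked density image: keep pixels whose
    accumulated density exceeds ratio * n_layers (a 0-1 threshold scaled
    by the number of segmentation layers)."""
    return Ld > ratio * n_layers
```

With n = 5 layers the default threshold is 3.5, matching the value used in the worked example.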
1. The example uses a September 2017 image of northern Beijing from the Sentinel-2A satellite launched by the European Space Agency, scene ID "S2A_MSIL1C_20170922T025541_N0205_R032_T50". Since the present invention extracts the urban extent from the texture information of the image, radiometric calibration and atmospheric correction are normally unnecessary; only cloud removal and cropping are needed. The resulting image is shown in Fig. 1.
2. The acquired Sentinel-2A satellite data have multiple bands; taking the maximum of the three visible bands gives the brightness image, as shown in Fig. 2. After frequency-domain filtering, the luminance image yields the grayscale image with the low-frequency information removed, as shown in Fig. 3. In the example, the value of σ is set according to the image size, with M and N the width and height of the image.
3. The grayscale image obtained in the previous step is quantized to gray levels 0-31, the window size w is set to 32, the dissimilarity texture feature of the 10 directions is computed, and the minimum is taken to obtain the feature image, as shown in Fig. 4.
4. The feature image obtained in the previous step is mapped to gray levels 0-255; the Otsu threshold method selects a threshold dividing the image into two parts, and the part with larger gray values is taken as the feature area, as shown in Fig. 5.
5. With the simple linear iterative clustering algorithm, the smallest segmentation scale is set to 300 pixels, the step to 10, and the largest segmentation scale to 340, so five segmentations are performed, giving five superpixel segmentation layers. The feature density of each layer is computed per superpixel, and the five feature density images obtained are superimposed into the total feature density image. The result is shown in Fig. 6.
6. Finally a feature density threshold is set, and the part above the threshold is the urban area; the threshold set in the example is 3.5.
The urban extraction method proposed by the present invention can extract an accurate urban extent. Compared with the result of manual interpretation, the completeness of the above example is 0.96 and the accuracy 0.92.
The above embodiments should be understood as merely illustrating the present invention rather than limiting its scope. After reading the content recorded in the invention, those skilled in the art can make various changes or modifications to it, and these equivalent changes and modifications likewise fall within the scope of the claims of the present invention.

Claims (7)

1. A method for automatically extracting urban built-up areas from urban texture feature density, characterized by comprising the following steps:
1) acquiring remote sensing imagery of the 10 m resolution visible-light bands, and performing cropping, mosaicking and cloud-removal operations to obtain the region to be extracted;
2) constructing a grayscale image from the preprocessed remote sensing image, filtering out low-frequency texture from the grayscale image based on the Fourier transform, and obtaining the grayscale image of the high-frequency component;
3) for the high-frequency-component grayscale image obtained in step 2), computing the gray-level co-occurrence matrix dissimilarity feature in several directions, and extracting the rotation-invariant texture feature;
4) image of the rotational invariance textural characteristics composition obtained step 3), is mapped as 0 to 255 gray level image, uses Otsu threshold method OTSU extracts characteristic area;
5) performing multi-scale segmentation on the image of the region to be extracted obtained in step 1) using the simple linear iterative clustering SLIC algorithm to obtain a multi-layer superpixel distribution image; computing, with the superpixel as the unit, the feature density image at each scale; and superimposing all feature density images to construct a refined feature density image.
2. The method for automatically extracting urban built-up areas from urban texture feature density according to claim 1, characterized in that the step 1) crops, mosaics and removes cloud from the acquired imagery, specifically comprising: a single scene of remote sensing imagery usually cannot completely cover the region to be extracted, so several adjacent scenes need to be downloaded and mosaicked into an image that completely covers the region using the Map Based Mosaic function of the ENVI software; the Subset Data from ROIs function of the ENVI software is then used to crop away the part outside the region to be extracted according to the region's extent (a vector file, .shp); ground objects occluded by cloud are very common in remote sensing images, and the cloud-covered parts of the image to be processed cannot yield a good extraction result, so cloud removal is required, the cloud-removal method being: manually identifying the cloud regions, finding a cloud-free image of the same region from another period, cropping the corresponding area from it and mosaicking it into the image to be processed to obtain a clean, cloud-free image; the preprocessing is completed with the ENVI software.
3. The method for automatically extracting urban built-up areas from urban texture feature density according to claim 1, characterized in that the step 2) filters out low-frequency texture from the grayscale image based on the Fourier transform to obtain the grayscale image of the high-frequency component, specifically comprising: transforming the spatial-domain image into the frequency domain using the two-dimensional fast Fourier transform FFT; filtering the frequency-domain image with a Gaussian low-pass filter; transforming the filter result back into the spatial domain with the inverse fast Fourier transform to obtain the background image; and subtracting the background image from the original grayscale image to obtain the grayscale image with the low-frequency component suppressed.
4. The method for automatically extracting urban built-up areas from urban texture feature density according to claim 3, characterized in that the spatial-domain image is transformed into the frequency domain using the two-dimensional fast Fourier transform FFT by the formula F(u,v) = Σ_{x=0}^{M-1} Σ_{y=0}^{N-1} f(x,y)·e^{-j2π(ux/M + vy/N)}, where M and N denote the width and height of the image respectively, (x,y) denotes the spatial coordinates of the image, (u,v) denotes the coordinates of the frequency image, f(x,y) is the input image, and F(u,v) is the Fourier frequency-domain image obtained by the transform; the frequency-domain image is filtered with a Gaussian low-pass filter: F'(u,v) = F(u,v)·H(u,v), where H(u,v) = e^{-D²(u,v)/(2σ²)} is the Gaussian filtering weight function, D denotes the Euclidean distance from the point (u,v) of the Gaussian filtering window to the center of the Gaussian window, σ denotes the spread of the Gaussian weight function, F(u,v) is the frequency-domain image, and F'(u,v) is the filtered frequency-domain image; the filter result is transformed back into the spatial domain with the inverse fast Fourier transform to obtain the background image f'(x,y); the original grayscale image f(x,y) and the background image f'(x,y) are subtracted to obtain the grayscale image with the low-frequency component suppressed: G(x,y) = f(x,y) - f'(x,y), where G(x,y) is the finally obtained high-frequency-component image.
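The pipeline of claim 4 (FFT, Gaussian low-pass, inverse FFT, subtraction) can be sketched as below; the function name is ours, and we shift the DC component to the array center so that D is the distance to the center of the frequency window:

```python
import numpy as np

def gaussian_highpass(f, sigma):
    # FFT -> Gaussian low-pass -> inverse FFT gives the background f';
    # G = f - f' keeps the high-frequency component (claim 4)
    M, N = f.shape
    F = np.fft.fftshift(np.fft.fft2(f))            # frequency image, DC at center
    u = np.arange(M)[:, None] - M / 2.0
    v = np.arange(N)[None, :] - N / 2.0
    D2 = u ** 2 + v ** 2                            # squared distance to the center
    H = np.exp(-D2 / (2.0 * sigma ** 2))            # Gaussian low-pass weights
    background = np.real(np.fft.ifft2(np.fft.ifftshift(F * H)))
    return f - background
```

On a constant image the Gaussian low-pass reproduces the image exactly, so the high-frequency residue is zero, which is a quick sanity check of the filter.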
5. The method for automatically extracting urban built-up areas from urban texture feature density according to claim 1, characterized in that the step 3) constructs the gray-level co-occurrence matrix dissimilarity feature in several directions from the high-frequency-component grayscale image obtained in the previous step, and extracts the rotation-invariant texture feature, the process being as follows:
the gray-level co-occurrence matrix is a matrix of size n × n, n being the number of gray levels of the image, and is defined as M(i,j) = [m(i,j,Δx,Δy); i,j ∈ (1,2,...,n)], where the element in row i and column j of the n × n gray-level co-occurrence matrix M is the number of pixel pairs, within a window of size w × w of the image, whose positional offset is (Δx,Δy) and whose gray values are i and j respectively; the dissimilarity formula is dissimilarity = Σ_i Σ_j m(i,j)·|i - j|, where m(i,j) is the value of the element in row i, column j of the gray-level co-occurrence matrix M; n1 gray-level co-occurrence matrices are constructed for n1 directions, the dissimilarity features of the n1 directions are thereby calculated, and the minimum value over the n1 directions is taken to construct the rotation-invariant texture feature. The n1 directions are (Δx,Δy) ∈ {(1,0), (2,0), (2,1), (1,1), (1,2), (-2,1), (-1,1), (1,2), (0,2), (0,1)}.
6. The method for automatically extracting urban built-up areas from urban texture feature density according to claim 1, characterized in that the step 4) extracts the feature area using the Otsu threshold method OTSU, specifically comprising:
1) linearly mapping the feature image to a gray-level image of 0-255 and computing the gray-level histogram, i.e. the number of pixels at each gray level;
2) normalizing the histogram, i.e. dividing the number of pixels at each gray level by the total number of pixels;
3) computing the maximum between-class variance: let t be the gray threshold; compute the proportion w0 of the whole image occupied by pixels of gray levels 0 to t and their average gray value u0, and the proportion w1 occupied by pixels of gray levels t to 255 and their average gray value u1; compute g = w0 × w1 × (u0 - u1)²; set t to 0-255 in turn, and the value of t that maximizes g is the threshold; the part greater than t is the feature area and the part less than t is the other areas.
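Sub-steps 1)-3) can be sketched as follows; the function name is ours, and we split the classes as 0..t and t+1..255 (the claim's "0 to t" and "t to 255" overlap at t, so some non-overlapping convention must be chosen):

```python
import numpy as np

def otsu_threshold(gray):
    # return the gray level t (0-255) maximizing the between-class
    # variance g = w0 * w1 * (u0 - u1)**2
    hist = np.bincount(gray.ravel(), minlength=256).astype(float)
    p = hist / hist.sum()                      # normalized histogram
    best_t, best_g = 0, 0.0
    for t in range(256):
        w0 = p[:t + 1].sum()                   # weight of gray levels 0..t
        w1 = p[t + 1:].sum()                   # weight of gray levels t+1..255
        if w0 == 0.0 or w1 == 0.0:
            continue
        u0 = (np.arange(t + 1) * p[:t + 1]).sum() / w0        # class-0 mean
        u1 = (np.arange(t + 1, 256) * p[t + 1:]).sum() / w1   # class-1 mean
        g = w0 * w1 * (u0 - u1) ** 2
        if g > best_g:
            best_t, best_g = t, g
    return best_t
```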
7. The method for automatically extracting urban built-up areas from urban texture feature density according to claim 1, characterized in that the evenly distributed superpixel cells obtained by the step 5) through the simple linear iterative clustering algorithm SLIC specifically comprise: performing n segmentations of different scales on the visible-light bands of the image based on the SLIC algorithm, the scales being {N_i, i ∈ 1,2,3...n}, where N_i denotes the average number of pixels of a superpixel in the i-th segmentation, N_1 is the smallest segmentation scale, N_n is the largest segmentation scale, and the scale increases from N_1 to N_n by the same step size S; the n superpixel layers produced by the segmentations are denoted {L_i, i ∈ 1,2,3...n}, {P_ij, j ∈ 1,2,3...} denotes the j-th superpixel of L_i, and Num_ij denotes the number of pixels of P_ij;
based on the dissimilarity feature image with window size 23 in step 3), the feature map obtained with the OTSU method is denoted L_f; based on L_f, the feature density D_ij of every P_ij is calculated as D_ij = Num_f / Num_ij, where Num_f denotes the number of feature points of L_f within the range corresponding to P_ij; this gives the feature density L'_i of each layer L_i, and finally the L'_i are accumulated to obtain the final feature density distribution L_d: L_d = Σ_i L'_i, i.e. the corresponding pixels of all feature density images are added to obtain the total feature density image L_d; a suitable threshold is set for the density image L_d, and the part with larger density is the town extraction result; the set threshold lies between 0 and 1 and is multiplied by the number of segmentation layers, and the part greater than the threshold is extracted as towns.
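The final thresholding of claim 7 can be sketched as below (function name ours). Since every layer contributes a density in [0, 1], the accumulated image ranges over [0, n_layers], so the fractional threshold is scaled by the layer count; with 5 layers, a fractional threshold of 0.7 reproduces the example's absolute threshold of 3.5:

```python
import numpy as np

def extract_towns(total_density, threshold, n_layers):
    # threshold is chosen in (0, 1) and multiplied by the number of
    # segmentation layers, since each layer adds one density value in [0, 1]
    return total_density > threshold * n_layers
```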
CN201910212324.4A 2019-03-20 2019-03-20 Method for automatically extracting urban built-up area by urban texture feature density Active CN110070545B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910212324.4A CN110070545B (en) 2019-03-20 2019-03-20 Method for automatically extracting urban built-up area by urban texture feature density


Publications (2)

Publication Number Publication Date
CN110070545A true CN110070545A (en) 2019-07-30
CN110070545B CN110070545B (en) 2023-05-26

Family

ID=67366390

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910212324.4A Active CN110070545B (en) 2019-03-20 2019-03-20 Method for automatically extracting urban built-up area by urban texture feature density

Country Status (1)

Country Link
CN (1) CN110070545B (en)


Citations (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006230910A (en) * 2005-02-28 2006-09-07 Konica Minolta Medical & Graphic Inc Image processor and image processing method
US20090212199A1 (en) * 2007-12-28 2009-08-27 Fujifilm Corporation Radiation image detection apparatus and manufacturing method of radiation image detector
CN101702019A (en) * 2009-11-26 2010-05-05 上海电机学院 Method for extracting residential areas in SAR image
CN102687140A (en) * 2009-12-30 2012-09-19 诺基亚公司 Methods and apparatuses for facilitating content-based image retrieval
CN103065296A (en) * 2012-12-14 2013-04-24 中南大学 High-resolution remote sensing image residential area extraction method based on edge feature
CN103745453A (en) * 2013-12-11 2014-04-23 河海大学 Town information extraction method based on Google Earth remote sensing image
CN103824309A (en) * 2014-03-12 2014-05-28 武汉大学 Automatic extracting method of urban built-up area border
CN104182754A (en) * 2014-08-19 2014-12-03 山东临沂烟草有限公司 Rural resident point information extraction method based on high-resolution remote-sensing image
CN104376330A (en) * 2014-11-19 2015-02-25 西安电子科技大学 Polarization SAR image ship target detection method based on superpixel scattering mechanism
CN104508707A (en) * 2012-05-28 2015-04-08 奥普托斯股份有限公司 Methods and apparatus for image processing, and laser scanning ophthalmoscope having an image processing apparatus
CN104835016A (en) * 2015-05-27 2015-08-12 北京搜狐新媒体信息技术有限公司 Crowd density calculation method and device
CN105046262A (en) * 2015-06-29 2015-11-11 中国人民解放军国防科学技术大学 Robust extended local binary pattern textural feature extraction method
CN105096315A (en) * 2015-06-19 2015-11-25 西安电子科技大学 Method for segmenting heterogeneous super-pixel SAR (Synthetic Aperture Radar) image based on Gamma distribution
CN107229917A (en) * 2017-05-31 2017-10-03 北京师范大学 A kind of several remote sensing image general character well-marked target detection methods clustered based on iteration
CN107330875A (en) * 2017-05-31 2017-11-07 河海大学 Based on the forward and reverse heterogeneous water body surrounding enviroment change detecting method of remote sensing images
CN108171193A (en) * 2018-01-08 2018-06-15 西安电子科技大学 Polarization SAR Ship Target Detection method based on super-pixel local message measurement
CN109271864A (en) * 2018-08-17 2019-01-25 武汉烽火凯卓科技有限公司 A kind of crowd density estimation method based on wavelet transformation and support vector machines


Non-Patent Citations (7)

* Cited by examiner, † Cited by third party
Title
SCOTT G J: "Fusion of deep convolutional neural networks for land cover classification of high-resolution imagery", IEEE Geoscience & Remote Sensing Letters, 2017, 14(9) *
JIANG Huiwei: "Object-oriented change detection in high-resolution remote sensing imagery using multi-feature fusion", Journal of Shaanxi University of Technology (Natural Science Edition) *
ZHANG Ningxin et al.: "Research on residential area detection algorithms for high-resolution remote sensing imagery", Computer Engineering and Science *
LIN Xiangguo et al.: "Residential area extraction from high-resolution remote sensing imagery fusing right-angle corner and right-angle edge features", Acta Geodaetica et Cartographica Sinica *
ZHENG Qingqing et al.: "An improved texture image segmentation method based on region merging", Journal of Huazhong University of Science and Technology (Natural Science Edition) *
GUO Chunmei et al.: "A superpixel tracking algorithm fusing saliency and spatio-temporal context", Pattern Recognition and Artificial Intelligence *
GAO Yang et al.: "Improved ACM model based on a composite vector field and medical image segmentation", Computer Engineering and Applications *

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111739073A (en) * 2020-06-24 2020-10-02 刘秀萍 Efficient and rapid image registration optimization method for handheld device
CN114037747A (en) * 2021-11-25 2022-02-11 佛山技研智联科技有限公司 Image feature extraction method and device, computer equipment and storage medium
CN114943897A (en) * 2022-05-31 2022-08-26 南京大学 Town development boundary defining method based on superpixel segmentation
CN114943897B (en) * 2022-05-31 2023-11-24 南京大学 Town development boundary demarcating method based on super-pixel segmentation
CN116310806A (en) * 2023-02-28 2023-06-23 北京理工大学珠海学院 Intelligent agriculture integrated management system and method based on image recognition
CN116310806B (en) * 2023-02-28 2023-08-29 北京理工大学珠海学院 Intelligent agriculture integrated management system and method based on image recognition

Also Published As

Publication number Publication date
CN110070545B (en) 2023-05-26

Similar Documents

Publication Publication Date Title
CN109446992B (en) Remote sensing image building extraction method and system based on deep learning, storage medium and electronic equipment
CN108573276B (en) Change detection method based on high-resolution remote sensing image
CN107358260B (en) Multispectral image classification method based on surface wave CNN
Li et al. Object-oriented classification of high-resolution remote sensing imagery based on an improved colour structure code and a support vector machine
Li et al. A review of remote sensing image classification techniques: The role of spatio-contextual information
CN110070545A (en) A kind of method that textural characteristics density in cities and towns automatically extracts cities and towns built-up areas
CN107067405B (en) Remote sensing image segmentation method based on scale optimization
CN106709517B (en) Mangrove forest identification method and system
CN105139015B (en) A kind of remote sensing images Clean water withdraw method
CN110334762A (en) A kind of feature matching method combining ORB and SIFT based on quaternary tree
CN104318051B (en) The rule-based remote sensing of Water-Body Information on a large scale automatic extracting system and method
CN110598564B (en) OpenStreetMap-based high-spatial-resolution remote sensing image transfer learning classification method
Chen et al. Optimal segmentation of a high-resolution remote-sensing image guided by area and boundary
CN107480620A (en) Remote sensing images automatic target recognition method based on heterogeneous characteristic fusion
CN107273813A (en) Geographical space elements recognition system based on high score satellite remote sensing date
CN111709901A (en) Non-multiple multi/hyperspectral remote sensing image color homogenizing method based on FCM cluster matching and Wallis filtering
Chen et al. A new process for the segmentation of high resolution remote sensing imagery
CN106919950B (en) The brain MR image segmentation of probability density weighting geodesic distance
CN108710862A (en) A kind of high-resolution remote sensing image Clean water withdraw method
CN107506769A (en) A kind of extracting method and system of urban water-body information
CN113807437A (en) Crest line and valley line extraction method based on DBSCAN cluster analysis
CN114266947A (en) Classification method and device based on fusion of laser point cloud and visible light image
Zhou et al. Stratified Object‐Oriented Image Classification Based on Remote Sensing Image Scene Division
Manaf et al. Hybridization of SLIC and Extra Tree for Object Based Image Analysis in Extracting Shoreline from Medium Resolution Satellite Images.
Huang et al. Multi-feature combined for building shadow detection in GF-2 Images

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant