CN107423771B - Two-time-phase remote sensing image change detection method - Google Patents


Info

Publication number: CN107423771B
Application number: CN201710659362.5A
Authority: CN (China)
Prior art keywords: image, remote sensing, difference, class, points
Legal status: Active (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis)
Other languages: Chinese (zh)
Other versions: CN107423771A (en)
Inventors: 王鑫, 黄晶, 储艳丽, 黄凤辰, 高红民, 石爱业, 徐立中
Current assignee: Hohai University (HHU) (the listed assignees may be inaccurate)
Original assignee: Hohai University (HHU)
Application filed by Hohai University (HHU)
Priority: CN201710659362.5A
Publication of CN107423771A; application granted; publication of CN107423771B

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00: Pattern recognition
    • G06F 18/20: Analysing
    • G06F 18/23: Clustering techniques
    • G06F 18/232: Non-hierarchical techniques
    • G06F 18/2321: Non-hierarchical techniques using statistics or function optimisation, e.g. modelling of probability density functions
    • G06F 18/23213: Non-hierarchical techniques using statistics or function optimisation with fixed number of clusters, e.g. K-means clustering
    • G06F 18/25: Fusion techniques
    • G06F 18/253: Fusion techniques of extracted features
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00: Scenes; Scene-specific elements
    • G06V 20/10: Terrestrial scenes
    • G06V 20/13: Satellite images

Abstract

The invention discloses a two-time-phase remote sensing image change detection method. First, a method fusing the gray-level features and texture features of the two time-phase remote sensing images is designed to construct a difference image, solving the problem of insufficient information when a difference image is built from a single type of feature. Second, a fast fuzzy C-means method is proposed to partition the fused difference image into two classes (a changed class and an unchanged class): in each iteration the memberships of the pixel points nearest to the class centers are modified, and the points with the highest and lowest gray values in the difference image are adopted as the initial cluster centers of the changed and unchanged classes respectively, which speeds up the convergence of the fuzzy C-means algorithm. With these two improvements, change detection of two-time-phase remote sensing images can be realized effectively and quickly.

Description

Two-time-phase remote sensing image change detection method
Technical Field
The invention relates to an effective two-time phase remote sensing image change detection method, and belongs to the technical field of image processing.
Background
Remote sensing image change detection compares and analyzes remote sensing images of the same geographical area acquired at different time phases to obtain the ground-feature change information of that area over the period. In recent years, remote sensing image change detection technology has been widely applied in many fields, such as environmental monitoring, agricultural research, natural disaster assessment, and forest vegetation change monitoring.
Currently common remote sensing image change detection methods mainly construct the difference image from a single kind of feature of the remote sensing image, such as texture, edge, or shape features; different types of features focus on describing different aspects of image detail. Detecting changes with only a single type of feature is therefore limited, and useful information is easily lost in the process. The invention thus jointly exploits the gray-value information and the texture-feature information of the remote sensing images, constructing a difference image by superposing the gray-value difference map and the texture-feature-statistic difference map of the two time-phase remote sensing images. The superposed difference image takes both kinds of information into account, overcomes the insufficient information content of a difference image built from a single feature, and provides richer original image information for the subsequent detection steps.
Fuzzy C-means clustering is one of the most widely applied and most effective of the many fuzzy clustering algorithms. Instead of strictly assigning each object to a single class, the fuzzy C-means algorithm describes the degree to which each object belongs to the different classes. This soft partition based on fuzzy theory describes the real world more objectively and is therefore widely used.
The core of the fuzzy C-means clustering method is to minimize an objective function iteratively to obtain the optimal solution. In each iteration the memberships of all sample points must be computed and the cluster centers updated, so the convergence rate is low. Moreover, the clustering result depends on the choice of initial values: a poorly chosen initial cluster center also slows convergence, and the result easily falls into a local minimum, making the global optimum hard to obtain. Addressing these two shortcomings of the fuzzy C-means clustering method, the invention proposes an improved method, built on fuzzy C-means clustering, for detecting the changed and unchanged regions in the difference image.
Disclosure of Invention
The purpose of the invention is as follows: aiming at the problems in the prior art, the invention provides an effective two-time phase remote sensing image change detection method, which is based on feature fusion and fuzzy C-Means (FCM), effectively combines the information of different feature difference images, and improves the anti-noise performance and the change detection accuracy; meanwhile, the convergence rate of the FCM algorithm is further improved.
The technical scheme is as follows: a two-time phase remote sensing image change detection method comprises the following steps:
Step one: given the two-time-phase remote sensing images to be detected, design a method fusing gray-level features and texture features to construct a difference image.
(1) Perform preprocessing operations such as radiometric correction and geometric correction on the two-time-phase remote sensing images I1 and I2 to be detected, where I1 is the first-phase image and I2 the second-phase image, both of size M×N;
(2) Compute the gray-value difference map X0 of images I1 and I2:

X0(i,j) = |I1(i,j) − I2(i,j)|; 1 ≤ i ≤ M, 1 ≤ j ≤ N

where I1(i,j) and I2(i,j) are the gray values of corresponding pixel points of the two time-phase images I1 and I2;
(3) Extract texture features from the remote sensing images I1 and I2. Weighing factors such as computational cost and detection performance, four texture feature statistics are selected: energy, contrast, correlation, and entropy.
(4) Separately compute the difference matrices of the four texture feature statistics of images I1 and I2.
To give the gray-value difference map X0 and the texture-feature difference map D the same value interval, D must be normalized, yielding the normalized texture-feature difference map D':

D'(i,j) = D(i,j) / Dmax

where Dmax is the maximum value in the matrix D;
(5) Fuse the gray-value difference map X0 and the texture-feature difference map D' by superposition to obtain the final difference image X:

X(i,j) = X0(i,j) + D'(i,j)
The difference image X combines the gray-value information and the texture-feature information of the two time-phase remote sensing images I1 and I2, and serves as the input image of the subsequent improved fuzzy C-means method, which yields the final change detection result.
Step two: a fast fuzzy C-means method is proposed to partition the fused difference image into two classes (a changed class and an unchanged class).
(1) Initialize the fuzzy C-means parameters: fuzzy weighting exponent m = 2, maximum number of iterations T = 30, iteration-stopping tolerance ε = 0.0001, initial iteration count t = 1, and total number of clusters c = 2;
(2) Take the difference image X, with n pixel points in total, as the input image of the improved fuzzy C-means method. Let c1 and c2 denote the changed class and the unchanged class in image X, and v1 and v2 their cluster centers, initialized as:

v1^0 = Xmax
v2^0 = Xmin

where Xmax and Xmin are the maximum and minimum gray values of the difference image X. The C1 points with the smallest distance ratio, whose distance to the changed-class center is significantly smaller than their distance to the unchanged-class center, represent the number of points belonging to the changed class; the C2 points with the largest distance ratio, whose distance to the changed-class center is significantly greater than their distance to the unchanged-class center, represent the number of points belonging to the unchanged class.
(3) Compute the membership matrix u_ik (i = 1, …, c) of all pixel points:

u_ik = 1 / Σ_{j=1}^{c} ( ||x_k − v_i|| / ||x_k − v_j|| )^(2/(m−1))

where x_k is the k-th pixel point, v_i is the cluster center of the i-th class, v_j the cluster center of the j-th class, the total number of clusters is c = 2, and n is the total number of image pixels.
(4) Compute the Euclidean distances M1 and M2 from all pixel points of the difference image X to the cluster centers v1^(t−1) and v2^(t−1) of the two classes c1 and c2, and the distance ratio K of all pixel points to the two class centers:

M1 = {m11, m12, … m1n}, M2 = {m21, m22, … m2n}

K = {m11/m21, m12/m22, … m1n/m2n}
(5) Sort all values of the distance ratio K and select the C1 points with the smallest ratio; since their distance to the changed-class center is significantly smaller than their distance to the unchanged-class center, these points belong to the changed class with maximum probability, are marked x_k ∈ c1, and their memberships are modified to u_1k = 1, u_2k = 0. Select the C2 points with the largest ratio; since their distance to the changed-class center is significantly greater than their distance to the unchanged-class center, these points belong to the unchanged class with maximum probability, are marked x_k ∈ c2, and their memberships are modified to u_1k = 0, u_2k = 1;
(6) Compute the changed-class center v1^t and the unchanged-class center v2^t:

v_i^t = Σ_{k=1}^{n} (u_ik)^m x_k / Σ_{k=1}^{n} (u_ik)^m, i = 1, 2

and update the gray values of the C1 and C2 points selected in step (5) to their respective cluster-center values;
(7) if T > T (T represents a predetermined threshold for the number of iterations and T represents the number of iterations that have been performed), or
Figure BDA0001370138970000042
(wherein the minimum error ε of stopping iteration is 0.0001) and
Figure BDA0001370138970000043
stopping iteration to obtain a final membership matrix uik(i ═ 1, 2); otherwise t is t +1, C is increased1And C2And jumping to the step (3);
(8) According to the final membership matrix u_ik (i = 1, 2), decide the class attribute of every pixel point: if u_1k > u_2k (k = 1, 2, …, n), pixel point k belongs to the changed class; otherwise pixel point k belongs to the unchanged class;
(9) Output the final change detection result.
By adopting the above technical scheme, the invention has the following beneficial effects:
(1) The method constructs the difference image by fusing the gray-level features and texture features of the two time-phase images, solving the problem of insufficient information when a difference image is built from a single type of feature.
(2) During the iterations of the fuzzy C-means algorithm, the method accelerates convergence by modifying the memberships of the pixel points closest to the class centers. In addition, the points with the highest and lowest gray values in the difference image are used as the initial cluster centers of the changed and unchanged classes, further improving the convergence speed of the fuzzy C-means algorithm.
Drawings
FIG. 1 is a flow chart of a method according to an embodiment of the present invention.
Detailed Description
The present invention is further illustrated by the following examples, which are intended to be purely exemplary and are not intended to limit the scope of the invention, as various equivalent modifications of the invention will occur to those skilled in the art upon reading the present disclosure and fall within the scope of the appended claims.
As shown in fig. 1, the two-time phase remote sensing image change detection method includes the following steps:
(1) Perform preprocessing operations such as radiometric correction and geometric correction on the two-time-phase remote sensing images I1 and I2 to be detected, where I1 is the first-phase image and I2 the second-phase image, both of size M×N;
(2) Compute the gray-value difference map X0 of images I1 and I2:

X0(i,j) = |I1(i,j) − I2(i,j)|; 1 ≤ i ≤ M, 1 ≤ j ≤ N

where I1(i,j) and I2(i,j) are the gray values of corresponding pixel points of the two time-phase images I1 and I2;
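As an illustration only (the toy arrays below are not from the patent), the absolute gray-value difference map of step (2) can be sketched with NumPy; casting to a signed type before subtracting avoids uint8 wrap-around:

```python
import numpy as np

def gray_difference_map(i1: np.ndarray, i2: np.ndarray) -> np.ndarray:
    """X0(i,j) = |I1(i,j) - I2(i,j)| for two co-registered gray images."""
    # Widen to a signed type first: uint8 subtraction would wrap around.
    diff = np.abs(i1.astype(np.int32) - i2.astype(np.int32))
    return diff.astype(np.uint8)

# Toy 2x2 "time-phase images" (illustrative values only)
i1 = np.array([[10, 200], [50, 50]], dtype=np.uint8)
i2 = np.array([[30, 100], [50, 0]], dtype=np.uint8)
x0 = gray_difference_map(i1, i2)
print(x0.tolist())  # [[20, 100], [0, 50]]
```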
(3) Extract texture features from the remote sensing images I1 and I2. Weighing factors such as computational cost and detection performance, four texture feature statistics are selected, namely energy, contrast, correlation, and entropy:
Energy:

ASM = Σ_i Σ_j P(i,j)^2

where P(i,j) is the normalized gray-level co-occurrence matrix and (i,j) are the matrix pixel coordinates. Energy represents the uniformity and compactness of the texture in the image; the greater the differences among the values in the gray-level co-occurrence matrix, the larger the energy value.
Contrast:

CON = Σ_i Σ_j (i − j)^2 P(i,j)

Contrast describes the brightness contrast of pixel values in the image, expressed visually as the sharpness of the image and the vividness of its colors.
Correlation:

COR = Σ_i Σ_j (i − μ_x)(j − μ_y) P(i,j) / (σ_x σ_y)

where μ_x, μ_y and σ_x, σ_y respectively denote the means and standard deviations of the row and column marginal distributions of P(i,j). Correlation describes the similarity between the rows and columns of the matrix and reflects the consistency of the image texture.
Entropy:

ENT = −Σ_i Σ_j P(i,j) log P(i,j)

Entropy describes the average information content of the image and reflects the complexity of its texture.
Taking the first-phase image I1 as an example, slide a window of size m×n over the image and compute the four texture feature statistics inside each window as the texture feature values of the window's center pixel. The window moves one pixel at a time, scanning at 0°, 45°, 90° and 135° in turn. After the window has traversed the whole image, the texture data of the four scan directions are superposed, finally yielding the energy matrix D11, contrast matrix D12, correlation matrix D13 and entropy matrix D14 of image I1, each of size M×N. The energy matrix D21, contrast matrix D22, correlation matrix D23 and entropy matrix D24 of image I2 are obtained in the same way.
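The four statistics above can be sketched in NumPy for a single co-occurrence offset (assumed here: distance 1, 0° direction; the sliding window and four-direction superposition of the patent are omitted for brevity, and the toy image is illustrative):

```python
import numpy as np

def glcm(img: np.ndarray, levels: int, dy: int, dx: int) -> np.ndarray:
    """Normalized gray-level co-occurrence matrix P for one offset (dy, dx)."""
    h, w = img.shape
    p = np.zeros((levels, levels), dtype=np.float64)
    for y in range(h):
        for x in range(w):
            y2, x2 = y + dy, x + dx
            if 0 <= y2 < h and 0 <= x2 < w:
                p[img[y, x], img[y2, x2]] += 1
    return p / p.sum()

def glcm_stats(p: np.ndarray):
    """Energy, contrast, correlation and entropy of a normalized GLCM."""
    levels = p.shape[0]
    i, j = np.indices((levels, levels))
    energy = (p ** 2).sum()                      # ASM = sum P(i,j)^2
    contrast = ((i - j) ** 2 * p).sum()          # CON = sum (i-j)^2 P(i,j)
    mu_x, mu_y = (i * p).sum(), (j * p).sum()
    sd_x = np.sqrt(((i - mu_x) ** 2 * p).sum())
    sd_y = np.sqrt(((j - mu_y) ** 2 * p).sum())
    corr = (((i - mu_x) * (j - mu_y) * p).sum() / (sd_x * sd_y)
            if sd_x > 0 and sd_y > 0 else 1.0)
    nz = p[p > 0]                                # avoid log(0)
    entropy = -(nz * np.log(nz)).sum()
    return energy, contrast, corr, entropy

img = np.array([[0, 0, 1], [0, 1, 1], [2, 2, 2]])  # toy 3-level image
p = glcm(img, levels=3, dy=0, dx=1)                # 0-degree direction, distance 1
energy, contrast, corr, entropy = glcm_stats(p)
```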
(4) Separately compute the difference matrices of the four texture feature statistics of images I1 and I2. Taking the energy matrix as an example, compute the energy difference matrix D1 of images I1 and I2:

D1(i,j) = |D11(i,j) − D21(i,j)|; 1 ≤ i ≤ M, 1 ≤ j ≤ N
The contrast difference matrix D2, correlation difference matrix D3 and entropy difference matrix D4 are obtained in the same way, and the four statistic difference matrices are superposed to give the texture-feature difference matrix D. To give the gray-value difference map X0 and the texture-feature difference map D the same value interval, D must be normalized, yielding the normalized texture-feature difference map D':

D'(i,j) = D(i,j) / Dmax

where Dmax is the maximum value in the matrix D;
(5) Fuse the gray-value difference map X0 and the texture-feature difference map D' by superposition to obtain the final difference image X:

X(i,j) = X0(i,j) + D'(i,j)
The difference image X combines the gray-value information and the texture-feature information of the two time-phase remote sensing images I1 and I2, and serves as the input image of the subsequent improved fuzzy C-means method, which yields the final change detection result.
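The normalization and fusion of steps (4) and (5) might be sketched as follows. The patent gives the exact fusion formula only as an equation image, so plain additive superposition of the two maps, each scaled to [0, 1], is an assumption here:

```python
import numpy as np

def fuse_difference_maps(x0: np.ndarray, d: np.ndarray) -> np.ndarray:
    """Superpose the gray-value difference map X0 and the texture-feature
    difference matrix D after bringing both into the same value interval.
    Additive fusion of the normalized maps is an assumption, not the
    patent's exact formula."""
    d_norm = d / d.max() if d.max() > 0 else d        # D' = D / Dmax
    x0_norm = x0 / x0.max() if x0.max() > 0 else x0   # same interval [0, 1]
    return x0_norm + d_norm

# Toy difference maps (illustrative values only)
x0 = np.array([[0.0, 4.0], [2.0, 0.0]])
d = np.array([[0.0, 8.0], [4.0, 0.0]])
x = fuse_difference_maps(x0, d)
```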
(6) Initialize the fuzzy C-means parameters: fuzzy weighting exponent m = 2, maximum number of iterations T = 30, iteration-stopping tolerance ε = 0.0001, initial iteration count t = 1, and total number of clusters c = 2; C1 and C2 are chosen according to the input difference image;
(7) Take the difference image X, with n pixel points in total, as the input image of the improved fuzzy C-means method. Let c1 and c2 denote the changed class and the unchanged class in image X, and v1 and v2 their cluster centers, initialized as:

v1^0 = Xmax
v2^0 = Xmin

where Xmax and Xmin are respectively the maximum and minimum gray values of the difference image X;
(8) Compute the membership matrix u_ik (i = 1, 2) of all pixel points:

u_ik = 1 / Σ_{j=1}^{c} ( ||x_k − v_i|| / ||x_k − v_j|| )^(2/(m−1))
(9) Compute the Euclidean distances M1 and M2 from all pixel points of the difference image X to the cluster centers v1^(t−1) and v2^(t−1) of the two classes c1 and c2, and the distance ratio K of all pixel points to the two class centers:

M1 = {m11, m12, … m1n}, M2 = {m21, m22, … m2n}

K = {m11/m21, m12/m22, … m1n/m2n}
(10) Sort all values of the distance ratio K and select the C1 points with the smallest ratio; since their distance to the changed-class center is significantly smaller than their distance to the unchanged-class center, these points belong to the changed class with maximum probability, are marked x_k ∈ c1, and their memberships are modified to u_1k = 1, u_2k = 0. Select the C2 points with the largest ratio; since their distance to the changed-class center is significantly greater than their distance to the unchanged-class center, these points belong to the unchanged class with maximum probability, are marked x_k ∈ c2, and their memberships are modified to u_1k = 0, u_2k = 1;
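Step (10) can be sketched in NumPy for a 1-D vector of difference-image gray values (function and variable names are illustrative, not from the patent):

```python
import numpy as np

def pin_clearest_points(x, v1, v2, u, c1, c2):
    """Modify memberships of the most clear-cut points: the c1 points with
    the smallest distance ratio ||x - v1|| / ||x - v2|| are pinned to the
    changed class (u1 = 1), the c2 points with the largest ratio to the
    unchanged class (u2 = 1)."""
    m1 = np.abs(x - v1)                    # distances to changed-class center
    m2 = np.abs(x - v2)                    # distances to unchanged-class center
    k = m1 / np.maximum(m2, 1e-12)         # distance ratio K
    order = np.argsort(k)
    changed, unchanged = order[:c1], order[-c2:]
    u = u.copy()
    u[0, changed], u[1, changed] = 1.0, 0.0
    u[0, unchanged], u[1, unchanged] = 0.0, 1.0
    return u, changed, unchanged

x = np.array([9.0, 8.5, 1.0, 0.5, 5.0])   # toy difference-image gray values
u0 = np.full((2, 5), 0.5)                  # undecided memberships
u, ch, un = pin_clearest_points(x, v1=10.0, v2=0.0, u=u0, c1=2, c2=2)
```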
(11) Compute the changed-class center v1^t and the unchanged-class center v2^t:

v_i^t = Σ_{k=1}^{n} (u_ik)^m x_k / Σ_{k=1}^{n} (u_ik)^m, i = 1, 2

and update the gray values of the C1 and C2 points selected in step (10) to their respective cluster-center values;
(12) If t > T, or

||v1^t − v1^(t−1)|| ≤ ε and ||v2^t − v2^(t−1)|| ≤ ε,

stop iterating and obtain the final membership matrix u_ik (i = 1, 2); otherwise set t = t + 1, increase C1 and C2, and jump to step (8);
(13) According to the final membership matrix u_ik (i = 1, 2), decide the class attribute of every pixel point: if u_1k > u_2k (k = 1, 2, …, n), pixel point k belongs to the changed class; otherwise pixel point k belongs to the unchanged class;
(14) Output the final change detection result.
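Putting steps (6) through (14) together, a compact sketch of the accelerated two-class fuzzy C-means follows; the stopping test, tie handling, and the fixed C1/C2 counts per iteration are assumptions where the text leaves them open:

```python
import numpy as np

def fast_fcm_change_mask(x, c1, c2, m=2.0, T=30, eps=1e-4):
    """Accelerated two-class FCM sketch: the cluster centers start at the
    extreme gray values, and each iteration pins the memberships of the
    c1 / c2 most clear-cut points before updating the centers."""
    v = np.array([x.max(), x.min()], dtype=float)   # changed / unchanged centers
    for _ in range(T):
        d = np.maximum(np.abs(x[None, :] - v[:, None]), 1e-12)
        inv = d ** (-2.0 / (m - 1.0))
        u = inv / inv.sum(axis=0)                   # standard FCM membership
        order = np.argsort(d[0] / d[1])             # distance ratio K
        u[:, order[:c1]] = [[1.0], [0.0]]           # pin clearest changed points
        u[:, order[-c2:]] = [[0.0], [1.0]]          # pin clearest unchanged points
        um = u ** m
        v_new = (um * x[None, :]).sum(axis=1) / um.sum(axis=1)
        done = np.abs(v_new - v).max() < eps        # center movement below eps
        v = v_new
        if done:
            break
    return u[0] > u[1]                              # True where pixel is "changed"

# Toy bimodal difference-image gray values (illustrative only)
x = np.concatenate([np.full(20, 200.0), np.full(20, 10.0)])
mask = fast_fcm_change_mask(x, c1=5, c2=5)
```

On this clearly bimodal toy vector the loop converges after the first center update, with the high-valued pixels labeled as the changed class.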

Claims (5)

1. A two-time phase remote sensing image change detection method is characterized by comprising the following steps:
the method comprises the following steps: giving a two-time phase remote sensing image to be detected, and designing a method for fusing gray level features and texture features to construct a difference image;
step two: providing a fast fuzzy C-means method, and classifying the fused difference image into a changed class and an unchanged class;
the second step comprises the following steps:
(1) initializing the fuzzy C-means parameters; the parameters comprise a fuzzy weighting exponent m, a maximum number of iterations T, an iteration-stopping tolerance ε, an initial iteration count t, and a total number of clusters c;
(2) taking the difference image X, with n pixel points in total, as the input image of the improved fuzzy C-means method, wherein c1 and c2 respectively denote the changed class and the unchanged class in image X, and v1 and v2 respectively denote the two cluster centers, initialized as:

v1^0 = Xmax
v2^0 = Xmin

wherein Xmax and Xmin are respectively the maximum and minimum gray values of the difference image X;
(3) calculating membership degree matrix u of all pixel pointsik
(4) computing the Euclidean distances M1 and M2 from all pixel points of the difference image X to the cluster centers v1^(t−1) and v2^(t−1) of the two classes c1 and c2, and computing the distance ratio K of all pixel points to the two class centers;
(5) sorting all values of the distance ratio K, selecting the C1 points with the smallest ratio as the changed class, marked x_k ∈ c1, and modifying their memberships to u_1k = 1, u_2k = 0; selecting the C2 points with the largest ratio as the unchanged class, marked x_k ∈ c2, and modifying their memberships to u_1k = 0, u_2k = 1; x_k denotes the k-th pixel point;
(6) computing the changed-class center v1^t and the unchanged-class center v2^t:

v_i^t = Σ_{k=1}^{n} (u_ik)^m x_k / Σ_{k=1}^{n} (u_ik)^m, i = 1, 2

and updating the gray values of the C1 and C2 points selected in (5) to their respective cluster-center values;
(7) if t > T, or

||v1^t − v1^(t−1)|| ≤ ε and ||v2^t − v2^(t−1)|| ≤ ε,

stopping the iteration and obtaining the final membership matrix u_ik; otherwise setting t = t + 1, increasing C1 and C2, and jumping to step (3); i = 1, 2;
(8) according to the final membership matrix u_ik, deciding the class attribute of all pixel points: if u_1k > u_2k, pixel point k belongs to the changed class; otherwise pixel point k belongs to the unchanged class; i = 1, 2; k = 1, 2, …, n;
(9) outputting the final change detection result.
2. The two-time phase remote sensing image change detection method according to claim 1, wherein the first step comprises the following steps:
(1) respectively performing radiometric-correction and geometric-correction preprocessing operations on the two-time-phase remote sensing images I1 and I2 to be detected, wherein I1 is the first-phase image and I2 the second-phase image, both of size M×N;
(2) computing the gray-value difference map X0 of images I1 and I2:

X0(i,j) = |I1(i,j) − I2(i,j)|; 1 ≤ i ≤ M, 1 ≤ j ≤ N

wherein I1(i,j) and I2(i,j) are the gray values of corresponding pixel points of the two time-phase images I1 and I2;
(3) extracting texture features from the remote sensing images I1 and I2; weighing factors such as computational cost and detection performance, four texture feature statistics, namely energy, contrast, correlation and entropy, are selected;
(4) separately computing the difference matrices of the four texture feature statistics of images I1 and I2;
(5) fusing the gray-value difference map X0 and the texture-feature difference map D' to obtain the final difference image.
3. The two-time-phase remote sensing image change detection method according to claim 2, wherein, to give the gray-value difference map X0 and the texture-feature difference matrix D the same value interval, D is normalized to obtain the normalized texture-feature difference map D':

D'(i,j) = D(i,j) / Dmax

wherein Dmax is the maximum value in the matrix D.
4. The two-time-phase remote sensing image change detection method according to claim 1, wherein the membership matrix is

u_ik = 1 / Σ_{j=1}^{c} ( ||x_k − v_i|| / ||x_k − v_j|| )^(2/(m−1)).
5. The two-time-phase remote sensing image change detection method according to claim 4, wherein the Euclidean distances from all pixel points of the difference image X to the cluster centers v1^(t−1) and v2^(t−1) of the two classes c1 and c2 are respectively M1 = {m11, m12, … m1n} and M2 = {m21, m22, … m2n};
and the distance ratio K of all pixel points to the two class centers is:

K = {m11/m21, m12/m22, … m1n/m2n}.
CN201710659362.5A 2017-08-04 2017-08-04 Two-time-phase remote sensing image change detection method Active CN107423771B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710659362.5A CN107423771B (en) 2017-08-04 2017-08-04 Two-time-phase remote sensing image change detection method


Publications (2)

Publication Number Publication Date
CN107423771A CN107423771A (en) 2017-12-01
CN107423771B true CN107423771B (en) 2020-04-03

Family

ID=60436408

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710659362.5A Active CN107423771B (en) 2017-08-04 2017-08-04 Two-time-phase remote sensing image change detection method

Country Status (1)

Country Link
CN (1) CN107423771B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110232302B (en) * 2018-03-06 2020-08-25 香港理工大学深圳研究院 Method for detecting change of integrated gray value, spatial information and category knowledge
CN110276746B (en) * 2019-05-28 2022-08-19 河海大学 Robust remote sensing image change detection method
CN111192239B (en) * 2019-12-18 2023-04-25 星际空间(天津)科技发展有限公司 Remote sensing image change area detection method and device, storage medium and electronic equipment

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103020978A (en) * 2012-12-14 2013-04-03 西安电子科技大学 SAR (synthetic aperture radar) image change detection method combining multi-threshold segmentation with fuzzy clustering
CN103353989A (en) * 2013-06-18 2013-10-16 西安电子科技大学 SAR image change detection method based on priori, fusion gray level and textural feature
CN104751185A (en) * 2015-04-08 2015-07-01 西安电子科技大学 SAR image change detection method based on mean shift genetic clustering
CN106897679A (en) * 2017-02-13 2017-06-27 长江水利委员会长江科学院 A kind of semantic change detecting method and system based on improvement fuzzy C-means clustering


Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
Textural Features for Image Classification; Robert M. Haralick et al.; IEEE Transactions on Systems, Man, and Cybernetics; Nov. 1973; pp. 610-621 *
Unsupervised Change Detection in Satellite Images Using Fuzzy C-Means Clustering and Principal Component Analysis; M. H. Kesikoglu et al.; International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences; Nov. 2013; pp. 129-132 *
A new multi-band remote sensing image change detection method; Yang Sheng et al.; Journal of Image and Graphics; Apr. 2009; vol. 14, no. 4; pp. 572-578 *
Remote sensing image change detection based on mixed SVM kernels; Xia Chenyang et al.; Information Technology; Aug. 2014; no. 8; pp. 38-41 *

Also Published As

Publication number Publication date
CN107423771A (en) 2017-12-01


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant