CN104200471B - SAR image change detection based on adaptive weight image co-registration - Google Patents
SAR image change detection based on adaptive weight image co-registration
- Publication number
- CN104200471B (application CN201410437415.5A)
- Authority
- CN
- China
- Prior art keywords
- pixel
- image
- difference
- graph
- fusion
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Expired - Fee Related
Landscapes
- Radar Systems Or Details Thereof (AREA)
- Image Analysis (AREA)
- Image Processing (AREA)
Abstract
The invention discloses an SAR image change detection method based on adaptive weight image fusion, which mainly solves the problem of low detection accuracy when a single type of difference map is used in the prior art. The implementation steps are: 1. read in two SAR images of the same area acquired at different times, and apply PPB filtering to each of them to obtain the filtered images X1 and X2; 2. from the filtered images, obtain the difference map Ds and the log-ratio map Dl; 3. apply mean filtering to Ds and median filtering to Dl to obtain the mean-filtered difference map Ds' and the median-filtered log-ratio map Dl'; 4. calculate the fusion parameter η(i, j); 5. fuse Ds' and Dl' according to the fusion parameter to generate the difference map D; 6. cluster the difference map D into two classes to obtain the change detection result. The invention has the advantages of simple operation, good noise immunity and high detection accuracy, and can be applied to environmental monitoring and disaster prediction.
Description
Technical Field
The invention belongs to the technical field of image processing, and relates to an SAR image change detection method which can be used in the fields of urban planning, natural disaster assessment, disaster prediction, land utilization, dynamic monitoring of land coverage and the like.
Background
Synthetic aperture radar (SAR) images are a form of microwave radar imaging. Compared with visible-light and infrared remote sensing, microwave remote sensing has distinct advantages: first, microwaves penetrate cloud, fog, rain and snow, giving all-weather, day-and-night observation capability; second, microwaves have a certain ability to penetrate ground objects; third, imaging is performed in a side-looking mode, so the coverage area is large. Because of these advantages, SAR images are increasingly becoming an important data source for change detection. Remote sensing image change detection compares and analyzes two or more images of the same region acquired in different periods and derives change information of the ground objects from the differences between the images. Change detection can reveal changes in the radiometric values and local texture of remote sensing images across periods, which makes it widely applicable to resource and environment monitoring (such as land use and land cover change, forest and vegetation change, and urban expansion), geospatial data updating, crop growth monitoring, natural disaster monitoring and assessment, military research, and so on.
In change detection research, the many existing methods have been grouped into different categories, the most common being change detection based on difference image analysis. Such methods generally comprise three key steps: (1) image preprocessing; (2) construction of a difference map; (3) extraction of change information. Specifically:
The difference method and the ratio method are the two most basic ways of constructing the difference image: the two co-registered temporal remote sensing images are either subtracted pixel by pixel or divided pixel by pixel. When the difference method is used to construct the difference map, pixels whose difference is 0 or close to 0 are regarded as unchanged, and pixels whose difference deviates clearly from 0 are regarded as changed. For SAR images, because of the influence of speckle noise, the difference map is mostly constructed with a ratio method, which is also insensitive to calibration errors. At present, the difference map is mainly constructed with a ratio method in logarithmic-ratio or mean-ratio form. Dekker proposed constructing the difference map with the logarithmic-ratio method, which takes the logarithm of the ratio; this converts multiplicative noise into additive noise and compresses the dynamic range of the ratio image, so that variation in high-pixel-value areas is weakened and the gray values on the outline of part of the truly changed region become closer to those of unchanged pixels, which is unfavourable for preserving the outline of the changed region and for improving the change detection accuracy. The mean-ratio method takes the neighbourhood mean of the corresponding pixels before computing the ratio; it is more robust to noise but introduces too much spurious change information into the background, i.e. the unchanged region. Consequently, change detection based on a single type of difference map suffers from low detection accuracy and a narrow range of application.
Fusing the single-type difference maps therefore plays a crucial role in change detection. Existing fusion approaches include difference map fusion based on a parameter weight and discrete wavelet fusion. Specifically:
the method for fusing discrete wavelet transform comprises the following steps: and respectively carrying out discrete wavelet decomposition on the average ratio graph and the logarithmic ratio graph to obtain a series of subband images, namely low-frequency components, horizontal high-frequency components, vertical high-frequency components and diagonal high-frequency components, carrying out fusion processing on each decomposition layer on different scales to form wavelet decomposition representation of the fused image, and finally obtaining the fused image through inverse discrete wavelet transformation. The method has the disadvantage that the fusion process is relatively complicated.
The difference map fusion method based on a parameter weight proceeds as follows: for the two images to be fused, their difference map and logarithmic-ratio map are first obtained; the part of the difference map Ds with better detail information and the part of the logarithmic-ratio map Dl with better noise resistance are retained, so that the advantages of the two types of difference map are combined, and the maps are fused with a weight parameter α according to the formula D = α × Ds + (1 − α) × Dl. The value of α is chosen by running multiple trials over the range 0 to 1 and keeping the value that gives the best fusion effect, which yields the final fused difference map. The disadvantages of this method are: (1) without a standard reference map, the quality of the fusion result cannot be judged, so the range of application is narrow; (2) selecting the parameter α requires multiple trials to compare the fusion effect, the parameter cannot adapt itself and must be tuned manually, which makes the operation cumbersome.
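For reference, a minimal Python sketch of this fixed-weight fusion and the manual α sweep is given below. The array names, the candidate α grid and the evaluation callback are illustrative assumptions; as noted above, scoring each candidate α presupposes a reference change map.

```python
import numpy as np

def fuse_fixed_weight(d_s, d_l, alpha):
    """Fixed-weight fusion of a difference map d_s and a log-ratio map d_l
    (both assumed to be numpy arrays normalized to [0, 1])."""
    return alpha * d_s + (1.0 - alpha) * d_l

def sweep_alpha(d_s, d_l, evaluate, alphas=np.linspace(0.0, 1.0, 11)):
    """Manual parameter sweep: every candidate alpha is scored by an external
    'evaluate' callback (e.g. Kappa against a reference map), which is exactly
    the dependence on a reference map criticized above."""
    scores = {a: evaluate(fuse_fixed_weight(d_s, d_l, a)) for a in alphas}
    return max(scores, key=scores.get)  # alpha with the best score
```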
Disclosure of Invention
The invention aims to provide an SAR image change detection method based on self-adaptive weight image fusion so as to improve the detection precision and solve the problems that parameters are difficult to select and cannot be self-adaptive in image fusion in the prior art.
In order to achieve the above object, the implementation of the present invention comprises the following steps:
(1) reading in two SAR images I1 and I2 acquired over the same area at different times, and applying patch-based weighted probability (PPB) filtering to I1 and I2 respectively to obtain two filtered images X1 and X2;
(2) calculating the difference of the corresponding pixel gray values of the two filtered images X1 and X2 and normalizing it to obtain the difference map Ds;
(3) calculating the quotient of the corresponding pixel gray values of the two filtered images X1 and X2 and normalizing it to obtain the log-ratio map Dl;
(4) applying 11 × 11 window mean filtering to the difference map Ds to eliminate the isolated pixel points in Ds, obtaining the mean-filtered difference map Ds';
(5) applying 3 × 3 window median filtering to the log-ratio map Dl to suppress the isolated pixel points in Dl, obtaining the median-filtered log-ratio map Dl';
(6) calculating, for each pixel point x(i, j) of the mean-filtered difference map Ds', the mean μx(i, j) and variance σx(i, j) of the pixels in its 3 × 3 neighbourhood Ωx, and obtaining the fusion parameter η(i, j) = σx(i, j) / μx(i, j);
the fusion parameter η(i, j) varies from pixel to pixel and measures the heterogeneity of the 3 × 3 neighbourhood Ωx of the pixel point x(i, j), where a homogeneous region is a smooth region of the image and a heterogeneous region is a noise or edge part of the image; the fusion parameter η(i, j) of a pixel point x(i, j) in a heterogeneous region is larger than that of a pixel point x(i, j) in a homogeneous region;
(7) according to the obtained fusion parameter η(i, j), fusing the mean-filtered difference map Ds' with the median-filtered log-ratio map Dl' to obtain the fused difference map D, whose pixel at coordinates (i, j) is:
D(i, j) = η(i, j) × Ds'(i, j) + (1 − η(i, j)) × Dl'(i, j),
wherein Ds'(i, j) is the point with coordinates (i, j) in the mean-filtered difference map and Dl'(i, j) is the point with coordinates (i, j) in the median-filtered log-ratio map;
(8) clustering the fused difference map D into two different classes with the k-means clustering method, calculating the mean of each class, defining the class with the larger mean as the changed class and the class with the smaller mean as the unchanged class, and obtaining the final change detection result.
Compared with the prior art, the invention has the following advantages:
1. When fusing the mean-filtered difference map Ds'(i, j) with the median-filtered log-ratio map Dl'(i, j), the invention uses as fusion parameter, for each pixel, the ratio η(i, j) of the variance σx(i, j) to the mean μx(i, j) over its 3 × 3 neighbourhood Ωx in Ds'(i, j), so that the fusion parameter adjusts itself automatically from pixel to pixel; there is no need to tune it manually through repeated trials, and the operation is simple.
2. The fusion parameter η(i, j) adopted by the invention takes both the gray information and the spatial information of the neighbourhood pixels into account, which improves the noise resistance and the ability to retain image detail, and thereby further improves the change detection accuracy.
Simulation experiments show that the method can obtain satisfactory change detection results for a plurality of SAR images and has better robustness.
Drawings
FIG. 1 is a flow chart of an implementation of the present invention;
FIG. 2 is a diagram of the Yellow River SAR image dataset used in the simulation of the present invention;
FIG. 3 is a graph of the results of a simulation experiment conducted on FIG. 2 using the present invention and a comparative method;
FIG. 4 is a diagram of the Ottawa region SAR image dataset used in the simulation of the present invention;
FIG. 5 is a graph of the results of a simulation experiment of FIG. 4 with the present invention and comparative method;
FIG. 6 is a diagram of the Bern SAR image dataset used in the simulation of the present invention;
FIG. 7 is a graph of the results of a simulation experiment of FIG. 6 with the present invention and comparative method.
Detailed Description
Referring to fig. 1, the method comprises the following specific steps:
Step 1: read in two SAR images I1 and I2 acquired over the same area at different times.
Step 2: perform PPB filtering on the two SAR images I1 and I2 respectively to obtain the filtered images X1 and X2.
For the first SAR image I1, patch-based weighted probability filtering, namely PPB filtering, is carried out to obtain the filtered image X1;
For the second SAR image I2, PPB filtering is carried out to obtain the filtered image X2;
PPB filtering is a filtering method built on a noise distribution model with a more general statistical similarity criterion. In PPB filtering, noise removal is expressed as a weighted maximum likelihood estimation problem; the weights are updated in an iterative process based on the similarity of noisy patches and of previously estimated patches, and this iteration improves the denoising performance.
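PPB itself relies on a speckle-specific similarity measure and iterative weight refinement, so a faithful implementation is beyond a short example. The sketch below is only a single-pass, patch-similarity-weighted average (a non-local-means-style filter) meant to illustrate the patch-weighting idea; the patch size, search window and smoothing parameter h are illustrative assumptions, and this is not the full PPB algorithm.

```python
import numpy as np

def patch_weighted_filter(img, patch=3, search=7, h=0.1):
    """One-pass patch-similarity weighted average (non-local-means style).
    Illustrates only the weighting idea behind PPB; the real PPB filter uses a
    noise-model-specific similarity and iteratively refined weights."""
    img = img.astype(np.float64)
    pr, sr = patch // 2, search // 2
    pad = np.pad(img, pr + sr, mode='reflect')
    out = np.zeros_like(img)
    rows, cols = img.shape
    for i in range(rows):
        for j in range(cols):
            ci, cj = i + pr + sr, j + pr + sr                    # centre in padded image
            ref = pad[ci - pr:ci + pr + 1, cj - pr:cj + pr + 1]  # reference patch
            num = den = 0.0
            for di in range(-sr, sr + 1):
                for dj in range(-sr, sr + 1):
                    cand = pad[ci + di - pr:ci + di + pr + 1,
                               cj + dj - pr:cj + dj + pr + 1]
                    w = np.exp(-np.sum((ref - cand) ** 2) / (h ** 2))
                    num += w * pad[ci + di, cj + dj]
                    den += w
            out[i, j] = num / den
    return out
```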
Step 3: obtain the difference map Ds of the two filtered images.
3a) For the gray value X1(i, j) of the first filtered image X1 at coordinate (i, j) and the gray value X2(i, j) of the second filtered image X2 at coordinate (i, j), take the absolute difference to obtain the value of the difference matrix d at coordinate (i, j): d(i, j) = |X1(i, j) − X2(i, j)|, and then obtain the difference matrix d = {d(i, j)};
3b) Normalize the difference matrix to obtain the value Ds(i, j) of the difference map at coordinate (i, j), and hence the difference map Ds = {Ds(i, j)}.
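A minimal sketch of step 3, assuming the filtered images are numpy arrays and that the normalization (whose exact formula is not reproduced above) is a simple min–max scaling to [0, 1]; both assumptions are illustrative.

```python
import numpy as np

def difference_map(x1, x2):
    """Step 3: absolute difference of the filtered images, then min-max
    normalization. The min-max scaling is an assumption; the text states only
    that the difference matrix is normalized."""
    d = np.abs(x1.astype(np.float64) - x2.astype(np.float64))
    return (d - d.min()) / (d.max() - d.min() + 1e-12)
```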
Step 4: calculate the quotient of the corresponding pixel gray values of the two filtered images X1 and X2 and normalize it to obtain the log-ratio map Dl.
4a) Calculate the quotient R(i, j) of the pixel gray values of the two filtered images X1 and X2 by the following formula:
wherein X1(i, j) is the pixel of the filtered image X1 at coordinates (i, j) and X2(i, j) is the pixel of the filtered image X2 at coordinates (i, j);
4b) Normalize the quotient of the pixel gray values by the following formula to obtain the normalized quotient of each pixel gray value:
wherein R = {R(i, j)};
4c) From the normalized quotient of each pixel gray value, obtain the log-ratio map Dl = {Dl(i, j)}.
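A corresponding sketch of step 4, assuming the quotient is log-compressed (absolute log-ratio) and then min–max normalized, with a small constant guarding against division by zero; these choices are assumptions, since the exact formulas are not reproduced above.

```python
import numpy as np

def log_ratio_map(x1, x2, eps=1e-6):
    """Step 4: pixel-wise quotient of the filtered images, log-compressed and
    min-max normalized. The absolute log and the min-max scaling are assumed
    forms of the normalization described in the text."""
    x1 = x1.astype(np.float64) + eps
    x2 = x2.astype(np.float64) + eps
    r = np.abs(np.log(x1 / x2))
    return (r - r.min()) / (r.max() - r.min() + 1e-12)
```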
And 5: difference map DsCarrying out mean value filtering of 11 × 11 window to eliminate difference value image DsThe isolated pixel points in the image are more complete in area, the local area keeps better continuity, the image is smoother, and finally the difference image D after mean filtering is obtaineds'。
Step 6: will log ratio chart DlPerforming median filtering in 3 × 3 window to suppress the log ratio diagram DlThe isolated pixel points in the middle are used for reserving the edge information to finally obtain a logarithmic ratio chart D after median filteringl'。
Step 7: calculate the fusion parameter η(i, j).
7a) Calculate the mean μx(i, j) of the pixels in the 3 × 3 neighbourhood Ωx of each pixel point x(i, j) of the difference map Ds' by the following formula:
μx(i, j) = (1/9) · Σ(p,q)∈Ωx Ds'(p, q),
where the summation is the sum of the gray values of the pixels in the 3 × 3 neighbourhood Ωx;
7b) Using the mean μx(i, j), calculate the variance σx(i, j) of the pixels in the 3 × 3 neighbourhood Ωx of each pixel point x(i, j) of the difference map Ds' by the following formula:
σx(i, j) = (1/9) · Σ(p,q)∈Ωx (Ds'(p, q) − μx(i, j))²,
where the summation is the sum of the squares of the differences between the gray value of each pixel point in the neighbourhood Ωx and the neighbourhood mean μx(i, j);
7c) From the mean μx(i, j) and the variance σx(i, j), calculate the fusion parameter:
η(i, j) = σx(i, j) / μx(i, j).
The fusion parameter η(i, j) varies from pixel to pixel, its value lies between 0 and 1, and it measures the heterogeneity of the 3 × 3 neighbourhood Ωx of the pixel point x(i, j): a homogeneous region is a smooth region of the image, a heterogeneous region is a noise or edge part of the image, and the fusion parameter η(i, j) of a pixel point x(i, j) in a heterogeneous region is larger than that of a pixel point x(i, j) in a homogeneous region.
Step 8: fuse the mean-filtered difference map Ds' with the median-filtered log-ratio map Dl' to obtain the fused difference map D.
According to the fusion parameter η(i, j) obtained in step 7, the mean-filtered difference map Ds' and the median-filtered log-ratio map Dl' are fused so that the pixel of the fused difference map D at coordinates (i, j) is:
D(i, j) = η(i, j) × Ds'(i, j) + (1 − η(i, j)) × Dl'(i, j),
giving the fused difference map D = {D(i, j)},
where Ds'(i, j) is the point with coordinates (i, j) in the mean-filtered difference map and Dl'(i, j) is the point with coordinates (i, j) in the median-filtered log-ratio map. Since the coefficient η(i, j) measures the heterogeneity of the neighbourhood of x(i, j), when η(i, j) is small, x(i, j) lies in a homogeneous region and Dl' dominates the generated difference map, so speckle noise is better suppressed; when η(i, j) is large, x(i, j) lies in a heterogeneous region and Ds' dominates, so image detail information is better preserved.
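Steps 7 and 8 can be sketched together as follows: the local mean and variance are taken over each pixel's 3 × 3 neighbourhood of the mean-filtered difference map, the fusion weight is their ratio, and the two maps are combined pixel by pixel. Clipping η to [0, 1] is an added assumption made so that the weight remains a convex combination, consistent with the stated range of η.

```python
import numpy as np
from scipy import ndimage

def adaptive_fusion(d_s_filtered, d_l_filtered, eps=1e-12):
    """Steps 7-8: per-pixel weight eta = local variance / local mean over the
    3x3 neighbourhood of the mean-filtered difference map, then
    D = eta * Ds' + (1 - eta) * Dl'. Clipping eta to [0, 1] is an assumption
    made here so the fusion stays a convex combination."""
    mu = ndimage.uniform_filter(d_s_filtered, size=3, mode='reflect')          # local mean
    mu_sq = ndimage.uniform_filter(d_s_filtered ** 2, size=3, mode='reflect')  # local E[x^2]
    var = np.maximum(mu_sq - mu ** 2, 0.0)                                     # local variance
    eta = np.clip(var / (mu + eps), 0.0, 1.0)                                  # fusion parameter
    return eta * d_s_filtered + (1.0 - eta) * d_l_filtered
```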
Step 9: and carrying out k-means clustering on the difference graph D to obtain a final change detection result.
9a) Randomly selecting 2 pixel points from the difference graph D as an initial clustering center;
9b) respectively calculating the distance from each pixel point x(i, j) of the difference map D to the two cluster centres, and reassigning each pixel point according to the minimum distance so that x(i, j) is placed in the class whose centre is nearest;
9c) recalculating the mean value of each class as a new clustering center;
9d) repeating steps 9b)–9c) until the clusters no longer change, and obtaining the mean values of the two classes;
9e) setting the pixels of the class with the smaller mean to 0, i.e. the unchanged class, and the pixels of the class with the larger mean to 255, i.e. the changed class, and outputting the resulting map as the final change detection result.
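Step 9 is a two-class k-means on the scalar values of the fused difference map; a self-contained sketch (a simple 1-D k-means written out directly rather than taken from any particular library) follows.

```python
import numpy as np

def kmeans_change_map(d, max_iter=100, seed=0):
    """Step 9: cluster the fused difference map into two classes with k-means;
    the class with the larger mean becomes the changed class (255) and the
    other the unchanged class (0)."""
    rng = np.random.default_rng(seed)
    values = d.ravel().astype(np.float64)
    centers = rng.choice(values, size=2, replace=False)          # 9a) random initial centres
    for _ in range(max_iter):
        labels = (np.abs(values - centers[0]) >
                  np.abs(values - centers[1])).astype(int)       # 9b) assign to nearest centre
        new_centers = np.array(
            [values[labels == k].mean() if np.any(labels == k) else centers[k]
             for k in range(2)])                                 # 9c) recompute class means
        if np.allclose(new_centers, centers):                    # 9d) stop when clusters settle
            break
        centers = new_centers
    changed = int(centers[1] > centers[0])                       # 9e) larger mean = changed class
    return np.where(labels.reshape(d.shape) == changed, 255, 0).astype(np.uint8)
```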
The effect of the present invention can be further illustrated by the following simulation results:
1. conditions of the experiment
The experimental environment is: Windows XP SP1, a Pentium(R) 4 CPU with a 2.4 GHz clock frequency, and Matlab R2010a as the software platform.
The first data set is a partial cut-out of a Yellow River SAR image data set, shown in Figs. 2(a) and 2(b), acquired over Shandong Province with a resolution of 8 m × 8 m. Fig. 2(a) is an image from June 2008 and Fig. 2(b) is an image from June 2009. Each image is 257 × 289 pixels with 256 gray levels. Fig. 2(c) is the standard change detection result map, which contains 13423 changed pixels and 60841 unchanged pixels.
The second data set is the Ottawa region SAR image data set, composed of two Radarsat SAR images acquired at different times; the changed area is mainly caused by flooding. As shown in Figs. 4(a) and 4(b), Fig. 4(a) is an image from May 1997 and Fig. 4(b) is an image from August 1997; both images are 290 × 350 pixels with 256 gray levels. Fig. 4(c) is the standard change detection map of the data set, containing 85451 unchanged pixels and 16049 changed pixels.
The third data set is the Bern SAR image data set, composed of two Radarsat SAR images acquired at different times. As shown in Figs. 6(a) and 6(b), Fig. 6(a) is an image from April 1999 and Fig. 6(b) is an image from May 1999; both images are 301 × 301 pixels with 256 gray levels. Fig. 6(c) is the standard change detection result map of the data set, which contains 1269 changed pixels and 89332 unchanged pixels.
2. Evaluation index of experiment
The evaluation indices used in the experiments are the number of missed detections FN, the number of false detections FP, the overall error OE, the correct classification rate PCC and the Kappa coefficient. The number of missed detections is the number of pixels that actually changed but were not detected as changed; the number of false detections is the number of pixels that did not change but were detected as changed; the overall error is the sum of the missed and false detections; and the correct classification rate is PCC = 1 − OE / N, where N is the total number of pixels. Let Nc be the total number of changed pixels and Nu the total number of unchanged pixels; the correctly detected changed pixels are TP = Nc − FN and the correctly detected unchanged pixels are TN = Nu − FP. Setting PRE = ((TP + FP) · Nc + (FN + TN) · Nu) / N², the Kappa coefficient is obtained as Kappa = (PCC − PRE) / (1 − PRE). If the change map is completely consistent with the reference map the Kappa coefficient equals 1, and the lower the agreement the smaller the Kappa coefficient. Because the Kappa coefficient incorporates more detailed classification information, it is a more accurate evaluation criterion than PCC.
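A sketch of these indices, assuming the result map and the reference map are arrays in which nonzero (e.g. 255) marks changed pixels; the PRE and Kappa expressions follow the standard two-class agreement formulas given above.

```python
import numpy as np

def change_detection_metrics(result, reference):
    """Computes FN, FP, OE, PCC and the Kappa coefficient for a binary change
    map against a reference map (nonzero = changed in both inputs)."""
    res = np.asarray(result) > 0
    ref = np.asarray(reference) > 0
    n = ref.size
    nc = int(ref.sum())                       # changed pixels in the reference
    nu = n - nc                               # unchanged pixels in the reference
    fn = int(np.sum(ref & ~res))              # missed detections
    fp = int(np.sum(~ref & res))              # false detections
    oe = fn + fp                              # overall error
    pcc = 1.0 - oe / n                        # correct classification rate
    tp, tn = nc - fn, nu - fp                 # correctly detected changed / unchanged
    pre = ((tp + fp) * nc + (fn + tn) * nu) / float(n) ** 2   # chance agreement
    kappa = (pcc - pre) / (1.0 - pre)
    return {"FN": fn, "FP": fp, "OE": oe, "PCC": pcc, "Kappa": kappa}
```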
3. Content of the experiment
Change detection was performed on the 3 image data sets using the method of the invention and 4 existing change detection methods. The 4 comparison methods are: constructing the difference map by a difference operation on the PPB-filtered images X1 and X2 and then clustering it with the K-means method, denoted the difference method; constructing the difference map by a logarithmic-ratio operation on the PPB-filtered images X1 and X2 and then clustering it with the K-means method, denoted the logarithmic-ratio method; constructing the difference map by a mean-ratio operation on the PPB-filtered images X1 and X2 and then clustering it with the K-means method, denoted the mean-ratio method; and the method proposed by Yaoguozheng in the article "Using Combined Difference Image and K-means Cluster for SAR Image Change Detection", denoted the CDI-K method.
Experiment 1
Change detection was performed on the first data set, Figs. 2(a) and 2(b), using the method of the invention and the 4 existing change detection methods; Fig. 2(c) is the reference map. The experimental results are shown in Fig. 3, where Fig. 3(a) is the change detection result of the difference method, 3(b) that of the logarithmic-ratio method, 3(c) that of the mean-ratio method, 3(d) that of the CDI-K method and 3(e) that of the present invention.
Experiment 2
Change detection was performed on the second data set, Figs. 4(a) and 4(b), using the method of the invention and the 4 existing change detection methods; Fig. 4(c) is the reference map. The experimental results are shown in Fig. 5, where Fig. 5(a) is the change detection result of the difference method, 5(b) that of the logarithmic-ratio method, 5(c) that of the mean-ratio method, 5(d) that of the CDI-K method and 5(e) that of the present invention.
Experiment 3
Change detection was performed on the third data set, Figs. 6(a) and 6(b), using the method of the invention and the 4 existing change detection methods; Fig. 6(c) is the reference map. The experimental results are shown in Fig. 7, where Fig. 7(a) is the change detection result of the difference method, 7(b) that of the logarithmic-ratio method, 7(c) that of the mean-ratio method, 7(d) that of the CDI-K method and 7(e) that of the present invention.
4. Analysis of Experimental results
As can be seen from Figs. 3, 5 and 7, the conventional single-type difference maps have poor noise immunity, a large overall error, low detection accuracy and a poor visual effect. The CDI-K method uses the parameter α to fuse the mean-filtered difference map and the median-filtered logarithmic-ratio map into a difference map, but the parameter is not adaptive during fusion: the value of α must be determined by selecting, through repeated experiments over the range 0 to 1, the value that gives the best change detection result, a process that requires manual tuning and is cumbersome; moreover, only the gray information of the pixels is considered and the spatial information of the neighbourhood is ignored, so the noise resistance is poor and the detection accuracy is low. The present invention makes the parameter adaptive when generating the difference map, so no manual parameter tuning is needed and the experimental procedure is greatly simplified, making the method simple and efficient; both the pixel gray information and the neighbourhood spatial information are considered when calculating the parameter, which enhances the noise resistance, reduces the overall error and improves the change detection accuracy.
The change detection statistics of the method of the invention and the existing 4 change detection methods on the 3 data sets are shown in Table 1.
TABLE 1 comparison of Change detection values for the present invention and the existing four methods
As can be seen from Table 1, the present invention gives a smaller overall error on all three SAR images, and its PCC and Kappa coefficient are better than those of the other methods.
Claims (4)
1. An SAR image change detection method based on self-adaptive weight image fusion, comprising the following steps:
(1) reading in two SAR images I1 and I2 acquired over the same area at different times, and applying patch-based weighted probability filtering to I1 and I2 respectively to obtain two filtered images X1 and X2;
(2) calculating the difference of the corresponding pixel gray values of the two filtered images X1 and X2 and normalizing it to obtain the difference map Ds;
(3) calculating the quotient of the corresponding pixel gray values of the two filtered images X1 and X2 and normalizing it to obtain the log-ratio map Dl;
(4) applying 11 × 11 window mean filtering to the difference map Ds to eliminate the noise pixel points in Ds, obtaining the mean-filtered difference map Ds';
(5) applying 3 × 3 window median filtering to the log-ratio map Dl to suppress the isolated pixel points in Dl, obtaining the median-filtered log-ratio map Dl';
(6) calculating, for each pixel point x(i, j) of the mean-filtered difference map Ds', the mean μx(i, j) and variance σx(i, j) of the pixel gray values in its 3 × 3 neighbourhood Ωx, and obtaining the fusion parameter η(i, j) = σx(i, j) / μx(i, j);
the fusion parameter η(i, j) varies from pixel to pixel, its value lies between 0 and 1, and it measures the heterogeneity of the 3 × 3 neighbourhood Ωx of the pixel point x(i, j), where a homogeneous region is a smooth region of the image and a heterogeneous region is a noise or edge part of the image; the fusion parameter η(i, j) of a pixel point x(i, j) in a heterogeneous region is larger than that of a pixel point x(i, j) in a homogeneous region;
(7) according to the obtained fusion parameter η(i, j), fusing the mean-filtered difference map Ds' with the median-filtered log-ratio map Dl' to obtain the fused difference map D, whose pixel at coordinates (i, j) is:
D(i, j) = η(i, j) × Ds'(i, j) + (1 − η(i, j)) × Dl'(i, j),
wherein Ds'(i, j) is the point with coordinates (i, j) in the mean-filtered difference map and Dl'(i, j) is the point with coordinates (i, j) in the median-filtered log-ratio map;
(8) clustering the fused difference map D into two different classes with the k-means clustering method, calculating the mean of each class, defining the class with the larger mean as the changed class and the class with the smaller mean as the unchanged class, and obtaining the final change detection result.
2. The method as set forth in claim 1, wherein said step (3) is performed as follows:
3a) calculating the quotient R(i, j) of the pixel gray values of the two filtered images X1 and X2 by the following formula:
wherein X1(i, j) is the pixel of the image X1 at coordinates (i, j) and X2(i, j) is the pixel of the image X2 at coordinates (i, j);
3b) normalizing the quotient of the pixel gray values by the following formula to obtain the normalized quotient of each pixel gray value:
wherein R = {R(i, j)},
3c) from the normalized quotient of each pixel gray value, obtaining the log-ratio map Dl = {Dl(i, j)}.
3. The method as claimed in claim 1, wherein in step (6) the mean μx(i, j) and variance σx(i, j) of the pixels in the 3 × 3 neighbourhood Ωx of each pixel point x(i, j) of the difference map Ds' are calculated by the following formulas:
μx(i, j) = (1/9) · Σ(p,q)∈Ωx Ds'(p, q),
σx(i, j) = (1/9) · Σ(p,q)∈Ωx (Ds'(p, q) − μx(i, j))²,
where the first summation is the sum of the gray values of the pixels in the 3 × 3 neighbourhood Ωx and the second summation is the sum of the squares of the differences between the gray value of each pixel point x(i, j) in the neighbourhood Ωx and the neighbourhood mean μx(i, j).
4. The method according to claim 1, wherein the step (8) of clustering the fused difference map D into two different classes by the k-means clustering method comprises the following steps:
8a) randomly selecting 2 pixel points from the difference graph D as an initial clustering center;
8b) respectively calculating the distance from each pixel point x(i, j) of the difference map D to the two cluster centres, and reassigning each pixel point according to the minimum distance so that x(i, j) is placed in the class whose centre is nearest;
8c) recalculating the mean value of each class as a new clustering center;
8d) repeating steps 8b)–8c) until the clusters no longer change.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201410437415.5A CN104200471B (en) | 2014-08-30 | 2014-08-30 | SAR image change detection based on adaptive weight image co-registration |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201410437415.5A CN104200471B (en) | 2014-08-30 | 2014-08-30 | SAR image change detection based on adaptive weight image co-registration |
Publications (2)
Publication Number | Publication Date |
---|---|
CN104200471A CN104200471A (en) | 2014-12-10 |
CN104200471B true CN104200471B (en) | 2017-03-01 |
Family
ID=52085757
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201410437415.5A Expired - Fee Related CN104200471B (en) | 2014-08-30 | 2014-08-30 | SAR image change detection based on adaptive weight image co-registration |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN104200471B (en) |
Families Citing this family (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104978745B (en) * | 2015-06-25 | 2017-07-07 | 中北大学 | A kind of High Resolution Visible Light image object change detecting method |
CN106203475B (en) * | 2016-06-28 | 2019-07-19 | 周晓鹏 | SAR image change detection based on SRM super-pixel cluster |
CN106875380B (en) * | 2017-01-12 | 2019-10-08 | 西安电子科技大学 | A kind of heterogeneous image change detection method based on unsupervised deep neural network |
CN107392887B (en) * | 2017-06-16 | 2020-06-09 | 西北工业大学 | Heterogeneous remote sensing image change detection method based on homogeneous pixel point conversion |
CN107316296B (en) * | 2017-06-29 | 2020-11-10 | 新疆大学 | Remote sensing image change detection method and device based on logarithmic transformation |
CN109325921A (en) * | 2018-09-01 | 2019-02-12 | 哈尔滨工程大学 | It is a kind of based on the underwater mixed noise fast filtering technique of intermediate value-mean value |
CN110119782A (en) * | 2019-05-14 | 2019-08-13 | 西安电子科技大学 | SAR image change detection based on FPGA |
CN110188830B (en) * | 2019-06-01 | 2022-09-06 | 合肥工业大学 | SAR image change detection method based on multi-core graph cut |
CN112634217A (en) * | 2020-12-17 | 2021-04-09 | 中国人民解放军火箭军工程大学 | SAR image region change detection method |
CN112926484B (en) * | 2021-03-11 | 2022-07-01 | 新疆大学 | Low-illumination image change detection method and device based on automatic discrimination strategy |
CN113034471B (en) * | 2021-03-25 | 2022-08-02 | 重庆大学 | SAR image change detection method based on FINCH clustering |
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101694720A (en) * | 2009-10-13 | 2010-04-14 | 西安电子科技大学 | Multidate SAR image change detection method based on space associated conditional probability fusion |
CN102930519A (en) * | 2012-09-18 | 2013-02-13 | 西安电子科技大学 | Method for generating synthetic aperture radar (SAR) image change detection difference images based on non-local means |
CN102968790A (en) * | 2012-10-25 | 2013-03-13 | 西安电子科技大学 | Remote sensing image change detection method based on image fusion |
CN103456020A (en) * | 2013-09-08 | 2013-12-18 | 西安电子科技大学 | Remote sensing image change detection method based on treelet feature fusion |
CN103810704A (en) * | 2014-01-23 | 2014-05-21 | 西安电子科技大学 | SAR (synthetic aperture radar) image change detection method based on support vector machine and discriminative random field |
CN103824302A (en) * | 2014-03-12 | 2014-05-28 | 西安电子科技大学 | SAR (synthetic aperture radar) image change detecting method based on direction wave domain image fusion |
Non-Patent Citations (2)
Title |
---|
Fractal Genetic Model in Change Detection of SAR images; H. Aghababaee et al.; Journal of the Indian Society of Remote Sensing; 2013-12-31; Vol. 41, No. 4; pp. 739–747 *
Change detection and damage assessment from remote sensing images of earthquake-damaged buildings; Zhang Jingfa et al.; Journal of Natural Disasters; May 2002; Vol. 11, No. 2; pp. 59–64 *
Also Published As
Publication number | Publication date |
---|---|
CN104200471A (en) | 2014-12-10 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN104200471B (en) | SAR image change detection based on adaptive weight image co-registration | |
CN106296655B (en) | SAR image change detection based on adaptive weight and high frequency threshold value | |
CN103456018B (en) | Remote sensing image change detection method based on fusion and PCA kernel fuzzy clustering | |
Hou et al. | Unsupervised change detection in SAR image based on gauss-log ratio image fusion and compressed projection | |
Li et al. | A review of remote sensing image classification techniques: The role of spatio-contextual information | |
CN103871039B (en) | Generation method for difference chart in SAR (Synthetic Aperture Radar) image change detection | |
CN105389799B (en) | SAR image object detection method based on sketch map and low-rank decomposition | |
Liang et al. | Maximum likelihood classification of soil remote sensing image based on deep learning | |
CN104680536B (en) | The detection method changed to SAR image using improved non-local mean algorithm | |
Chen et al. | Change detection algorithm for multi-temporal remote sensing images based on adaptive parameter estimation | |
Hwang et al. | A practical algorithm for the retrieval of floe size distribution of Arctic sea ice from high-resolution satellite Synthetic Aperture Radar imagery | |
CN110310263B (en) | SAR image residential area detection method based on significance analysis and background prior | |
Giannarou et al. | Optimal edge detection using multiple operators for image understanding | |
CN108509835B (en) | PolSAR image ground object classification method based on DFIC super-pixels | |
Huang et al. | Change detection method based on fractal model and wavelet transform for multitemporal SAR images | |
CN107729903A (en) | SAR image object detection method based on area probability statistics and significance analysis | |
CN104392209A (en) | Evaluation model for image complexity of target and background | |
CN115147726B (en) | City form map generation method and device, electronic equipment and readable storage medium | |
Su et al. | Change detection in synthetic aperture radar images based on non-local means with ratio similarity measurement | |
CN114419465B (en) | Method, device and equipment for detecting change of remote sensing image and storage medium | |
CN103345739B (en) | A kind of high-resolution remote sensing image building area index calculation method based on texture | |
Silvetti et al. | Quadratic self-correlation: An improved method for computing local fractal dimension in remote sensing imagery | |
CN112734666A (en) | SAR image speckle non-local mean suppression method based on similarity value | |
Huang et al. | Environmental monitoring of natural disasters using synthetic aperture radar image multi-directional characteristics | |
Ni et al. | Multiple-primitives-based hierarchical classification of airborne laser scanning data in urban areas |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
CF01 | Termination of patent right due to non-payment of annual fee | | Granted publication date: 20170301 |