CN101694720A - Multidate SAR image change detection method based on space associated conditional probability fusion - Google Patents
- Authority
- CN
- China
- Prior art keywords
- image
- edge
- pixel
- value
- point
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Landscapes
- Image Processing (AREA)
- Image Analysis (AREA)
Abstract
The invention discloses a multitemporal SAR image change detection method based on spatially-associated conditional probability fusion, which mainly addresses the problem that traditional change detection methods produce many false change regions while trying to preserve the integrity of true change regions. The method is realized by the following concrete steps: (1) segmenting the two temporal SAR images after enhanced Lee filtering; (2) creating a conditional-probability difference map and computing spatially-associated conditional probabilities for fusion; (3) fusing the two temporal segmentation results with the spatially-associated conditional probabilities to obtain a fused difference map; and (4) thresholding the fused difference map to obtain the change detection result. The invention reduces the appearance of false change regions and improves change detection accuracy while better preserving the integrity of change regions, and is therefore suitable for assessing disasters such as fires and floods from multitemporal SAR images.
Description
Technical field
The invention belongs to the field of image processing and relates to change detection in multitemporal SAR images, specifically a multitemporal SAR image change detection method based on spatially-associated conditional probability fusion.
Background technology
With the development of satellite remote sensing technology, the volume of remote sensing data is growing rapidly. Because SAR data are unaffected by atmospheric conditions and cloud cover, they have become an important class of remote sensing data. Finding regions of significant change quickly, accurately, and automatically in SAR images acquired at different times is therefore of great importance.
Research on separating SAR images into the two classes "unchanged" and "changed" is still at an early stage, and roughly two routes exist. The first is the post-classification comparison approach: the two temporal images are first classified independently, and the two classified images are then compared pixel by pixel to obtain the change detection map. The other route is the difference-image classification approach: the two temporal images are first compared pixel by pixel, for example by pixel-wise differencing, ratioing, or change vector analysis (CVA), and the resulting difference image is then further processed (e.g., by transformation or probability modeling) to achieve a two-class partition and obtain the final change detection map. The post-classification method can reduce false change information caused by differences in acquisition platform and environment, and does not require complicated preprocessing such as radiometric correction; however, because it rests on the classification maps of the individual temporal images, it places high accuracy demands on the segmentation results: a small deviation in pixel value may be assigned to a different class, and after comparison this can produce large differences between unchanged and changed regions, directly affecting the final change detection result. Most current research follows the difference-classification route. The difference-image classification approach is simple and intuitive, and the change details it obtains are relatively pronounced. The results obtained from the constructed difference image broadly agree with the actual changed and unchanged tendencies, but the changes in the radiated energy of ground objects that they reflect are not always identical, so the degree and attributes of change they reflect are not entirely consistent either, and the results often contain considerable false change information. Moreover, because this approach computes pixel-wise differences or ratios of the two temporal images directly, it does not make full use of the internal correlation between the two temporal images and cannot preserve the integrity of changed blocks well.
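As a concrete illustration of the difference-image route discussed above, the pixel-wise log-ratio difference image is a common choice for SAR because it turns multiplicative speckle into an additive term. This is a minimal sketch, not the patent's own operator; the array names are illustrative:

```python
import numpy as np

def log_ratio_difference(img1, img2, eps=1e-6):
    """Pixel-wise log-ratio difference image for two co-registered SAR images.

    The absolute log-ratio is large where backscatter changed strongly
    and near zero where it did not; eps guards against log(0).
    """
    img1 = img1.astype(np.float64)
    img2 = img2.astype(np.float64)
    return np.abs(np.log((img2 + eps) / (img1 + eps)))

# Toy example: only the centre pixel changes between the two dates.
t1 = np.full((3, 3), 10.0)
t2 = t1.copy()
t2[1, 1] = 100.0
d = log_ratio_difference(t1, t2)
```

The difference image `d` would then be thresholded or further processed to separate the changed from the unchanged class.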
Summary of the invention
The objective of the invention is to address the deficiencies of the above two classes of change detection methods by proposing a SAR image change detection method based on spatially-associated conditional probability fusion, which better preserves the integrity of changed blocks, reduces the appearance of false changed blocks, and obtains better change detection results.
The technical scheme for achieving this objective is as follows: first compute, for each of the two temporal SAR images, the edge-matching optimal segmentation map and the spatially-associated conditional probability map; then compute, point by point, the conditional probabilities between the probability map and the corresponding 3 × 3 neighborhoods of the two temporal SAR images; fuse the two optimal segmentation maps point by point according to these conditional probabilities, obtaining a fused difference map; and finally segment the fused difference map to obtain the change detection result. The concrete steps are as follows:
(1) Apply enhanced Lee filtering twice to each of the two input temporal images, obtaining images I1 and I2;
(2) Compute the edges of the two temporal images I1 and I2 with the Canny edge operator, obtaining two Canny edge maps;
(3) For each pixel in the two Canny edge maps, compute the Euclidean distance to each surrounding edge point and take the minimum of these distances as the edge-distance value of that pixel, obtaining the edge-distance value of every point in the two Canny edge maps;
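Step (3) is exactly a Euclidean distance transform of the edge map. A minimal sketch, assuming `edges` is a boolean Canny edge map with `True` on edge pixels:

```python
import numpy as np
from scipy.ndimage import distance_transform_edt

def edge_distance_map(edges):
    """Per-pixel minimum Euclidean distance to the nearest edge pixel.

    distance_transform_edt measures the distance to the nearest zero
    element, so the edge map is inverted first; edge pixels themselves
    get distance 0.
    """
    return distance_transform_edt(~edges)

# Toy edge map: a single vertical edge in column 2.
edges = np.zeros((5, 5), dtype=bool)
edges[:, 2] = True
dist = edge_distance_map(edges)
```

For this toy map, the value at each pixel is simply its column distance to column 2.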
(4) Take each gray level of image I2 in turn as a threshold to binarize I2, obtaining one binary map per gray level; compute the matching degree m1(g2)/m2(g2) between the edge of each binary map and the Canny edge of I2, where m1(g2) is LE2(g2)/BE2(g2), BE2(g2) is the number of pixels on the edge of the binary map corresponding to gray level g2, LE2(g2) is the number of pixels where the binary-map edge overlaps the Canny edge of the second temporal image I2, and m2(g2) is the sum of the edge-distance values, in the second Canny edge map, of the points corresponding to the edge points of the binary map for gray level g2; choose the binary map with the maximum matching degree as the optimal segmentation map IE2 of image I2;
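Step (4) can be sketched as an exhaustive search over gray-level thresholds, scoring each candidate binarization by how well its region boundary matches the Canny edge map. The code below is a simplified interpretation of the m1/m2 matching degree, with a 4-neighbour boundary extraction and illustrative names; it is not the patent's exact implementation:

```python
import numpy as np
from scipy.ndimage import distance_transform_edt

def boundary(binary):
    """4-neighbour boundary of a binary map: foreground pixels with at
    least one background neighbour."""
    padded = np.pad(binary, 1, mode='edge')
    interior = (padded[:-2, 1:-1] & padded[2:, 1:-1]
                & padded[1:-1, :-2] & padded[1:-1, 2:])
    return binary & ~interior

def best_threshold(image, canny_edges):
    """Pick the gray level whose binarization boundary best matches the
    Canny edges: maximise m1/m2, with m1 = overlap / boundary size and
    m2 = sum of edge distances of the boundary pixels."""
    dist = distance_transform_edt(~canny_edges)
    best_g, best_score = None, -np.inf
    for g in np.unique(image):
        b = boundary(image > g)
        be = b.sum()                      # BE(g): boundary pixel count
        if be == 0:
            continue
        le = (b & canny_edges).sum()      # LE(g): overlap with Canny edges
        m1 = le / be
        m2 = dist[b].sum() + 1e-9         # distance penalty (avoid /0)
        score = m1 / m2
        if score > best_score:
            best_g, best_score = g, score
    return best_g

# Toy image with three gray bands; only the 50/100 boundary is marked
# as the reference edge, so thresholding at 50 should win.
img = np.zeros((6, 6), dtype=int)
img[:, 2:4] = 50
img[:, 4:] = 100
canny = np.zeros((6, 6), dtype=bool)
canny[:, 4] = True
g_star = best_threshold(img, canny)
```

The binary map produced by the winning threshold plays the role of the optimal segmentation map IE2.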
(5) Take each gray level of image I1 in turn as a threshold to binarize I1, obtaining one binary map per gray level; compute the matching degree m3(g1)·m1(g1)/m2(g1) between the edge of each binary map and the Canny edge of I1, where m1(g1) is LE1(g1)/BE1(g1), BE1(g1) is the number of pixels on the edge of the binary map corresponding to gray level g1, LE1(g1) is the number of pixels where the binary-map edge overlaps the edge in the Canny edge map of the first temporal image I1, m2(g1) is the sum of the edge-distance values, in the first Canny edge map, of the points corresponding to the edge points of the binary map for gray level g1, and m3(g1) is the number of pixels where the edge of the binary map of gray level g1 of the first temporal image I1 and the edge of the optimal segmentation map IE2 of the second temporal image I2 overlap in spatial position, divided by the number of edge pixels of IE2; select the binary map with the maximum matching degree as the optimal segmentation map IE1 of image I1;
(6) Take the gray values v1 and v2 of the pixels located at (i, j) in images I1 and I2 to form the gray-value pair (v1, v2); count the probability P(v1, v2) with which the pair (v1, v2) occurs in images I1 and I2 and the probability P(v1) with which the gray value v1 occurs;
(7) Construct an M × N image IP of the same size as I1 and I2; at each position where I1 and I2 exhibit the gray-value pair (v1, v2), assign the pixel of IP the probability P(v1, v2)/P(v1), obtaining the spatially-associated conditional probability map IP;
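Steps (6)–(7) amount to building a 2-D joint histogram of the gray-value pairs and reading back P(v1, v2)/P(v1) at every pixel. A minimal sketch, assuming integer-valued images and illustrative names:

```python
import numpy as np

def conditional_probability_map(i1, i2, levels=256):
    """Spatially-associated conditional probability map I_P.

    I_P(i, j) = P(v1, v2) / P(v1), where (v1, v2) is the gray-value pair
    at (i, j) in the two temporal images -- i.e. the conditional
    probability of observing v2 at time 2 given v1 at time 1.
    """
    pair_counts = np.zeros((levels, levels))
    np.add.at(pair_counts, (i1.ravel(), i2.ravel()), 1)
    p_joint = pair_counts / i1.size                    # P(v1, v2)
    p_v1 = p_joint.sum(axis=1, keepdims=True)          # P(v1)
    with np.errstate(invalid='ignore', divide='ignore'):
        cond = np.where(p_v1 > 0, p_joint / p_v1, 0.0)
    return cond[i1, i2]

# Toy 2x2 pair of images with two gray levels.
i1 = np.array([[0, 0], [1, 1]])
i2 = np.array([[0, 1], [1, 1]])
ip = conditional_probability_map(i1, i2, levels=2)
```

In the toy example, the pair (1, 1) always co-occurs with v1 = 1, so its conditional probability is 1, while v1 = 0 splits evenly between v2 = 0 and v2 = 1.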
(8) Compute, for each pixel of the spatially-associated conditional probability map IP, the conditional probability of its value with respect to each point in the 3 × 3 neighborhood of the pixel at the same position in images I1 and I2, obtained as the joint probability of the corresponding value pair divided by P(v(s)). Here vP(i, j) is the value at (i, j) in IP, v(s) is the value of each of the 18 points in the 3 × 3 neighborhoods at (i, j) in I1 and I2, Nh,l(vP(i, j), v(s)) is the frequency with which the pair (vP(i, j), v(s)) occurs among the 18 gray-value pairs formed by the pixel value at (h, l) in IP and the 18 pixel values in the 3 × 3 neighborhoods of the pixel at (h, l) in I1 and I2, the joint probability being accumulated from this frequency over all positions (h, l), and P(v(s)) is the probability with which the value v(s) occurs in I1 and I2;
(9) According to the conditional probabilities from step (8) between each point in the 3 × 3 neighborhood at (i, j) in I1 and I2 and the corresponding point at (i, j) in IP, fuse the points in the 3 × 3 neighborhoods at (i, j) of the optimal segmentation result maps IE1 and IE2 of the two temporal images from steps (4) and (5) to obtain the value of the fused difference map at (i, j), yielding a fused difference map IF;
(10) Partition the fused difference map IF obtained in step (9) into a changed class and an unchanged class with the Otsu threshold, obtaining the change detection result.
Because the invention segments the two temporal images with an edge-matching-based segmentation method and fuses the two temporal optimal segmentation maps with spatially-associated conditional probabilities to create a fused difference map, it has the following advantages over the prior art:
(1) The invention combines hard edge matching with soft matching, reducing the over-segmentation seen in other segmentation methods and achieving a better segmentation effect.
(2) The invention fully considers the regional consistency of the two original temporal images while preserving the probabilistic relationship of the spatially-associated conditional probability map to the original images, thereby ensuring regional consistency as far as possible while reducing the number of detected false change regions.
Description of drawings
Fig. 1 is the implementation flowchart of the invention;
Fig. 2 is a schematic diagram of the creation of the conditional-probability difference map of the invention;
Fig. 3 is a schematic diagram of the construction of the spatially-associated conditional probabilities of the invention;
Fig. 4 is a schematic diagram of the four-direction fusion of the optimal segmentation maps based on spatially-associated conditional probabilities;
Fig. 5 shows the Canny edge-matching curves of the simulated SAR experimental data;
Fig. 6 shows the simulated SAR experimental data and result maps;
Fig. 7 shows the real SAR experimental data and result maps.
Embodiment:
With reference to Fig. 1, the implementation steps of the invention are as follows:
Step 1: input the two temporal images, as in Fig. 6(a) and Fig. 6(b), and apply enhanced Lee filtering twice to each of them, obtaining images I1 and I2, where the window size of the enhanced Lee filter is 5 × 5 pixels.
Step 2: compute the edges of the two temporal images I1 and I2 with the Canny edge operator, obtaining two Canny edge maps.
Step 3: take each gray level of image I2 in turn as a threshold to binarize I2, obtaining one binary map per gray level; compute the matching degree m1(g2)/m2(g2) between the edge of each binary map and the Canny edge of I2, and choose the binary map with the maximum matching degree as the optimal segmentation map IE2 of image I2, where:
m1(g2) is defined as LE2(g2)/BE2(g2), in which BE2(g2) is the number of pixels on the edge of the binary map corresponding to gray level g2, and LE2(g2) is the number of pixels where the binary-map edge overlaps the Canny edge of the second temporal image I2;
for each pixel in the Canny edge map of I2, compute the Euclidean distance to each surrounding edge point and take the minimum distance as the value of that point, obtaining the edge-distance map of I2;
m2(g2) is defined as the sum of the edge-distance values, in the edge-distance map of I2, of the points corresponding to the edge points of the binary map for gray level g2.
Step 4: take each gray level g1 of image I1 in turn as a threshold to binarize I1, obtaining one binary map per gray level; compute the matching degree m3(g1)·m1(g1)/m2(g1) between the edge of each binary map and the Canny edge of I1, and select the binary map with the maximum matching degree as the optimal segmentation map IE1 of image I1, where:
m1(g1) is defined as LE1(g1)/BE1(g1), in which BE1(g1) is the number of pixels on the edge of the binary map corresponding to gray level g1, and LE1(g1) is the number of pixels where the binary-map edge overlaps the edge in the Canny edge map of image I1;
for each pixel in the Canny edge map of I1, compute the Euclidean distance to each surrounding edge point and take the minimum distance as the value of that point, obtaining the edge-distance map of I1;
m2(g1) is defined as the sum of the edge-distance values, in the edge-distance map of I1, of the points corresponding to the edge points of the binary map for gray level g1;
m3(g1) is defined as the number of pixels where the edge of the binary map of gray level g1 of the first temporal image I1 and the edge of the optimal segmentation map IE2 of the second temporal image I2 overlap in spatial position, divided by the number of edge pixels of IE2.
Step 6: construct the spatially-associated conditional probability map IP.
With reference to Fig. 2, this step is implemented as follows:
First, take the gray values v1 and v2 at the point (i, j) in the filtered images I1 and I2 of step 1 to form the gray-value pair (v1, v2), and count the probability P(v1, v2) with which the pair (v1, v2) occurs in I1 and I2 and the probability P(v1) with which the gray value v1 occurs;
Then, construct an M × N image IP of the same size as I1 and I2; at each position where I1 and I2 exhibit the gray-value pair (v1, v2), assign the pixel of IP the probability P(v1, v2)/P(v1), obtaining the spatially-associated conditional probability map IP.
Step 7: compute the conditional probability between each pixel of the spatially-associated conditional probability map IP and each point in the 3 × 3 neighborhood of the pixel at the same position in images I1 and I2, as shown in Fig. 3. The concrete steps are as follows:
7a) Record the pixel values of the 18 points in the 3 × 3 neighborhoods of the point (i, j) in images I1 and I2 as a set, of which the pixel values from image I1 form one subset and the pixel values from image I2 the other;
7b) Let vP be the pixel value at (i, j) in the spatially-associated conditional probability map IP, and let vS be the set of the 18 pixel values within the 3 × 3 windows centered at (i, j) in the corresponding images I1 and I2; count the frequency with which each pair formed by an element of vS and vP occurs in the pixel-pair set (vP, vS) over the whole image, count the total frequency with which the value vP occurs in IP, and from these obtain the probability with which each pair occurs;
7c) Count the overall probability with which each gray value of vS occurs in the two images I1 and I2; the conditional probability of the gray value vP at (i, j) in IP with respect to each gray value in the 3 × 3 neighborhoods at (i, j) in I1 and I2 is then obtained by dividing the pair probability of step 7b) by this overall probability.
Step 8: according to the conditional probabilities, obtained in step 7, between each point in the 3 × 3 neighborhood at (i, j) in I1 and I2 and the point at (i, j) in IP, fuse the points in the 3 × 3 neighborhoods at (i, j) of the optimal segmentation maps IE1 and IE2 of the two temporal images from steps 3 and 4 to obtain the value of the fused difference map at (i, j), yielding a fused difference map IF. The concrete steps are as follows:
8a) For each of the optimal segmentation maps IE1 and IE2 of the two temporal images, select the pixels in the 3 × 3 neighborhood of the point (i, j) that lie on the 0°, 45°, 90°, and 135° directions and exceed the respective optimal threshold, and compute the proportions r1(ori) and r2(ori) of such pixels in each direction and the means u1(ori) and u2(ori) of the corresponding conditional probabilities in each direction, where ori is 0°, 45°, 90°, or 135°;
8b) For each of IE1 and IE2, compute the product of the proportion in each of the four directions at the point (i, j) and the mean of the conditional probabilities in the corresponding direction;
8c) Sum and average the products over the four directions of the two temporal phases to obtain the fused pixel value at (i, j), yielding the fused difference map IF.
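Steps 8a)–8c) can be sketched as follows for a single position (i, j). The conditional-probability arrays, threshold values, and the exact grouping of 3 × 3 offsets into the 0°, 45°, 90°, and 135° directions are illustrative assumptions (three pixels per direction through the center), not taken verbatim from the patent:

```python
import numpy as np

# Offsets of the 3x3 neighbourhood grouped by direction through the centre.
DIRECTIONS = {
    0:   [(0, -1), (0, 0), (0, 1)],     # horizontal
    45:  [(1, -1), (0, 0), (-1, 1)],    # diagonal /
    90:  [(-1, 0), (0, 0), (1, 0)],     # vertical
    135: [(-1, -1), (0, 0), (1, 1)],    # diagonal \
}

def fuse_pixel(seg1, seg2, cond1, cond2, i, j, thr1, thr2):
    """Fused difference value at (i, j): for each temporal phase and each
    direction, take the proportion r of above-threshold pixels times the
    mean u of their conditional probabilities, then average everything."""
    products = []
    for seg, cond, thr in ((seg1, cond1, thr1), (seg2, cond2, thr2)):
        for offsets in DIRECTIONS.values():
            vals = [(seg[i + di, j + dj], cond[i + di, j + dj])
                    for di, dj in offsets]
            above = [c for s, c in vals if s > thr]
            r = len(above) / len(vals)            # proportion above threshold
            u = np.mean(above) if above else 0.0  # mean conditional prob.
            products.append(r * u)
    return float(np.mean(products))

# Toy case: both segmentation maps fully above threshold, constant
# conditional probability 0.5, so the fused value is 0.5.
seg = np.ones((3, 3))
cond = np.full((3, 3), 0.5)
val = fuse_pixel(seg, seg, cond, cond, 1, 1, 0.0, 0.0)
```

Applying `fuse_pixel` at every interior position would produce the fused difference map IF.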
Step 9: partition the fused difference map IF obtained in step 8 with the Otsu threshold: points whose value exceeds the Otsu threshold are assigned to the changed class and all other points to the unchanged class, obtaining the change detection result.
The effect of the invention is further illustrated by the following experiments:
The invention was tested on two groups of experimental data: simulated SAR and real SAR data.
The first group, the simulated SAR data, was produced by generating a two-class-scene reference image according to the distribution of a Gibbs random field, generating a speckled SAR image from it according to the SAR imaging mechanism, and finally embedding the simulated change regions into the reference image. Fig. 6 shows the simulated SAR data and the experimental results, where Fig. 6(a) is the original first temporal image of the simulated SAR data, Fig. 6(b) is the original second temporal image, Fig. 6(c) is the optimal segmentation map of the first temporal image, Fig. 6(d) is the optimal segmentation map of the second temporal image, Fig. 6(e) is the change detection reference map of the two temporal phases, and Fig. 6(f) is the change detection result map of the invention.
The second group, the real SAR data, was acquired near the city of Bern, Switzerland, by the SAR carried on ERS-2 in April and May 1999. The changed part of this data set was caused by flooding of the city of Thun, the city of Bern, and a section of the Aare river near Bern airport. The image size used in the experiment is 301 × 301 pixels with 256 gray levels; 1155 pixels are changed and 89446 pixels are unchanged. Fig. 7 shows the real SAR data and the experimental results, where Fig. 7(a) is the original first temporal image of the real SAR data, Fig. 7(b) is the original second temporal image, Fig. 7(c) is the optimal segmentation map of the first temporal image, Fig. 7(d) is the optimal segmentation map of the second temporal image, Fig. 7(e) is the change detection reference map of the two temporal phases, and Fig. 7(f) is the change detection result map of the invention.
By comparison with the log-ratio method, using the three classical change detection indices of false-alarm rate, miss rate, and correct-detection rate, the invention is shown to better preserve the integrity of changed blocks, reduce the appearance of false changed blocks, and obtain better change detection results.
Table 1 gives the experimental results of the invention and of the log-ratio method. As can be seen from Table 1, relative to the log-ratio method the invention clearly reduces both the false-alarm rate and the miss rate and clearly improves the correct-detection rate on both the simulated and the real SAR data; overall, the change detection accuracy improves substantially. For the simulated SAR data, the false-alarm rate of the invention drops from 25.24% to 6.93% relative to the log-ratio method, a reduction of more than 17 percentage points in the occurrence of false changed blocks; the miss rate drops from 11.304% to 6.93%, showing that the integrity of the change regions is well preserved; and the correct-detection rate rises from 88.69% to 95.063%, a substantial improvement in change detection accuracy. For the real SAR data, with the false-alarm rate essentially unchanged, the miss rate of the invention drops markedly and the correct-detection rate clearly improves, again a substantial improvement in change detection accuracy.
Table 1. SAR data experiment results
Claims (2)
1. A multitemporal SAR image change detection method based on spatially-associated conditional probability fusion, comprising the steps of:
1) applying enhanced Lee filtering twice to each of the two input temporal images, obtaining images I1 and I2;
2) computing the edges of the two temporal images I1 and I2 with the Canny edge operator, obtaining two Canny edge maps;
3) for each pixel in the two Canny edge maps, computing the Euclidean distance to each surrounding edge point and taking the minimum of these distances as the edge-distance value of that pixel, obtaining the edge-distance value of every point in the two Canny edge maps;
4) taking each gray level of image I2 in turn as a threshold to binarize I2, obtaining one binary map per gray level; computing the matching degree m1(g2)/m2(g2) between the edge of each binary map and the Canny edge of I2, where m1(g2) is LE2(g2)/BE2(g2), BE2(g2) is the number of pixels on the edge of the binary map corresponding to gray level g2, LE2(g2) is the number of pixels where the binary-map edge overlaps the Canny edge of the second temporal image I2, and m2(g2) is the sum of the edge-distance values, in the second Canny edge map, of the points corresponding to the edge points of the binary map for gray level g2; and choosing the binary map with the maximum matching degree as the optimal segmentation map IE2 of image I2;
5) taking each gray level of image I1 in turn as a threshold to binarize I1, obtaining one binary map per gray level; computing the matching degree m3(g1)·m1(g1)/m2(g1) between the edge of each binary map and the Canny edge of I1, where m1(g1) is LE1(g1)/BE1(g1), BE1(g1) is the number of pixels on the edge of the binary map corresponding to gray level g1, LE1(g1) is the number of pixels where the binary-map edge overlaps the edge in the Canny edge map of the first temporal image I1, m2(g1) is the sum of the edge-distance values, in the first Canny edge map, of the points corresponding to the edge points of the binary map for gray level g1, and m3(g1) is the number of pixels where the edge of the binary map of gray level g1 of the first temporal image I1 and the edge of the optimal segmentation map IE2 of the second temporal image I2 overlap in spatial position, divided by the number of edge pixels of IE2; and selecting the binary map with the maximum matching degree as the optimal segmentation map IE1 of image I1;
6) taking the gray values v1 and v2 of the pixels located at (i, j) in images I1 and I2 to form the gray-value pair (v1, v2), and counting the probability P(v1, v2) with which the pair (v1, v2) occurs in images I1 and I2 and the probability P(v1) with which the gray value v1 occurs;
7) constructing an M × N image IP of the same size as I1 and I2, and at each position where I1 and I2 exhibit the gray-value pair (v1, v2), assigning the pixel of IP the probability P(v1, v2)/P(v1), obtaining the spatially-associated conditional probability map IP;
8) computing, for each pixel of the spatially-associated conditional probability map IP, the conditional probability of its value with respect to each point in the 3 × 3 neighborhood of the pixel at the same position in images I1 and I2, obtained as the joint probability of the corresponding value pair divided by P(v(s)), wherein vP(i, j) is the value at (i, j) in IP, v(s) is the value of each of the 18 points in the 3 × 3 neighborhoods at (i, j) in I1 and I2, Nh,l(vP(i, j), v(s)) is the frequency with which the pair (vP(i, j), v(s)) occurs among the 18 gray-value pairs formed by the pixel value at (h, l) in IP and the 18 pixel values in the 3 × 3 neighborhoods of the pixel at (h, l) in I1 and I2, the joint probability being accumulated from this frequency over all positions (h, l), and P(v(s)) is the probability with which the value v(s) occurs in I1 and I2;
9) according to the conditional probabilities from step 8) between each point in the 3 × 3 neighborhood at (i, j) in I1 and I2 and the corresponding point at (i, j) in IP, fusing the points in the 3 × 3 neighborhoods at (i, j) of the optimal segmentation result maps IE1 and IE2 of the two temporal images from steps 4) and 5) to obtain the value of the fused difference map at (i, j), yielding a fused difference map IF;
10) partitioning the fused difference map IF obtained in step 9) into a changed class and an unchanged class with the Otsu threshold, obtaining the change detection result.
2. The multitemporal SAR image change detection method based on spatially-associated conditional probability fusion according to claim 1, wherein the fusion in step 9) of the points in the 3 × 3 neighborhoods at (i, j) of the optimal segmentation result maps IE1 and IE2 of the two temporal images from steps 4) and 5) is carried out as follows:
first, for each of the optimal segmentation maps IE1 and IE2 of the two temporal images, selecting the pixels in the 3 × 3 neighborhood at (i, j) that lie on the 0°, 45°, 90°, and 135° directions and exceed the respective optimal threshold, and computing the proportions r1(ori) and r2(ori) of such pixels in each direction and the means u1(ori) and u2(ori) of the corresponding conditional probabilities in each direction, where ori is 0°, 45°, 90°, or 135°;
then, for each of IE1 and IE2, computing the product of the proportion in each of the four directions at (i, j) and the mean of the conditional probabilities in the corresponding direction;
finally, summing and averaging the products over the four directions of the two temporal phases to obtain the fused pixel value at (i, j), yielding a fused difference map IF.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN200910024296XA CN101694720B (en) | 2009-10-13 | 2009-10-13 | Multidate SAR image change detection method based on space associated conditional probability fusion |
Publications (2)
Publication Number | Publication Date |
---|---|
CN101694720A true CN101694720A (en) | 2010-04-14 |
CN101694720B CN101694720B (en) | 2012-02-08 |
Family
ID=42093689
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN200910024296XA Expired - Fee Related CN101694720B (en) | 2009-10-13 | 2009-10-13 | Multidate SAR image change detection method based on space associated conditional probability fusion |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN101694720B (en) |
Cited By (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102262395A (en) * | 2010-05-25 | 2011-11-30 | 赖金轮 | Multiple damage detection and control system, and method thereof |
CN102013093A (en) * | 2010-12-02 | 2011-04-13 | 南京大学 | High resolution remote sensing image segmentation method based on Gram-Schmidt fusion and locally excitatory globally inhibitory oscillator networks (LEGION) |
CN102176014B (en) * | 2011-01-19 | 2013-02-13 | 西安理工大学 | Method for detecting urban region change based on multi-temporal SAR (synthetic aperture radar) images |
CN102176014A (en) * | 2011-01-19 | 2011-09-07 | 西安理工大学 | Method for detecting urban region change based on multi-temporal SAR (synthetic aperture radar) images |
CN102750701B (en) * | 2012-06-15 | 2015-03-04 | 西安电子科技大学 | Method for detecting spissatus and spissatus shadow based on Landsat thematic mapper (TM) images and Landsat enhanced thematic mapper (ETM) images |
CN102750701A (en) * | 2012-06-15 | 2012-10-24 | 西安电子科技大学 | Method for detecting spissatus and spissatus shadow based on Landsat thematic mapper (TM) images and Landsat enhanced thematic mapper (ETM) images |
CN104200471A (en) * | 2014-08-30 | 2014-12-10 | 西安电子科技大学 | SAR image change detection method based on adaptive weight image fusion |
CN104200471B (en) * | 2014-08-30 | 2017-03-01 | 西安电子科技大学 | SAR image change detection method based on adaptive weight image fusion |
CN106709923A (en) * | 2015-11-16 | 2017-05-24 | 中国科学院沈阳自动化研究所 | Change detection and test method thereof oriented to heterogeneous sequence image |
CN106709923B (en) * | 2015-11-16 | 2019-11-29 | 中国科学院沈阳自动化研究所 | Change detection and test method for heterogeneous sequence images |
CN110023989A (en) * | 2017-03-29 | 2019-07-16 | 华为技术有限公司 | Sketch image generation method and device |
CN107516082A (en) * | 2017-08-25 | 2017-12-26 | 西安电子科技大学 | SAR image change region detection method based on self-paced learning |
CN107516082B (en) * | 2017-08-25 | 2019-11-22 | 西安电子科技大学 | SAR image change region detection method based on self-paced learning |
CN107689051A (en) * | 2017-09-08 | 2018-02-13 | 浙江环球星云遥感科技有限公司 | Multitemporal SAR image change detection method based on change factor |
Also Published As
Publication number | Publication date |
---|---|
CN101694720B (en) | 2012-02-08 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN101694720B (en) | Multidate SAR image change detection method based on space associated conditional probability fusion | |
CN101650439B (en) | Method for detecting change of remote sensing image based on difference edge and joint probability consistency | |
CN102932605B (en) | Method for selecting camera combination in visual perception network | |
Liu et al. | Single image dehazing via large sky region segmentation and multiscale opening dark channel model | |
CN102254319B (en) | Method for carrying out change detection on multi-level segmented remote sensing image | |
CN110119728A (en) | Remote sensing images cloud detection method of optic based on Multiscale Fusion semantic segmentation network | |
CN102496034B (en) | High-spatial resolution remote-sensing image bag-of-word classification method based on linear words | |
CN103985108B (en) | Method for multi-focus image fusion through boundary detection and multi-scale morphology definition measurement | |
CN109816012A (en) | A kind of multiscale target detection method of integrating context information | |
CN104091171B (en) | Vehicle-mounted far infrared pedestrian detecting system and method based on local feature | |
CN102110227B (en) | Compound method for classifying multiresolution remote sensing images based on context | |
CN103714549B (en) | Based on the stereo-picture object dividing method of quick local matching | |
CN103226820A (en) | Improved two-dimensional maximum entropy division night vision image fusion target detection algorithm | |
CN104463199A (en) | Rock fragment size classification method based on multiple features and segmentation recorrection | |
CN103632363A (en) | Object-level high-resolution remote sensing image change detection method based on multi-scale fusion | |
CN102842045A (en) | Pedestrian detection method based on combined features | |
CN104408482A (en) | Detecting method for high-resolution SAR (Synthetic Aperture Radar) image object | |
CN103198482B (en) | Based on the method for detecting change of remote sensing image that disparity map fuzzy membership merges | |
CN102622769A (en) | Multi-target tracking method by taking depth as leading clue under dynamic scene | |
CN104809433A (en) | Zebra stripe detection method based on maximum stable region and random sampling | |
CN110008900B (en) | Method for extracting candidate target from visible light remote sensing image from region to target | |
CN104899892A (en) | Method for quickly extracting star points from star images | |
CN103294792A (en) | Polarimetric SAR (synthetic aperture radar) terrain classification method based on semantic information and polarimetric decomposition | |
CN105184808A (en) | Automatic segmentation method for foreground and background of optical field image | |
CN103226832A (en) | Multispectral remote sensing image variation detection method based on spectral reflectivity variation analysis |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
C14 | Grant of patent or utility model | ||
GR01 | Patent grant | ||
CF01 | Termination of patent right due to non-payment of annual fee | ||
Granted publication date: 2012-02-08. Termination date: 2017-10-13 |