CN102254311A - Method and system for fusing remote sensing images - Google Patents
- Publication number
- CN102254311A CN102254311A CN2011101561616A CN201110156161A CN102254311A CN 102254311 A CN102254311 A CN 102254311A CN 2011101561616 A CN2011101561616 A CN 2011101561616A CN 201110156161 A CN201110156161 A CN 201110156161A CN 102254311 A CN102254311 A CN 102254311A
- Authority
- CN
- China
- Prior art keywords
- image
- high frequency
- strength component
- frequency image
- synthetic
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Landscapes
- Radar Systems Or Details Thereof (AREA)
- Image Processing (AREA)
Abstract
A method for fusing remote sensing images comprises the following steps: performing a colour space transformation on a multispectral image to obtain the intensity component of the multispectral image; performing multi-level wavelet decomposition on the intensity component of the multispectral image and on a synthetic aperture radar image by the à trous algorithm, so as to obtain an intensity component high-frequency image, a synthetic aperture radar high-frequency image and an intensity component low-frequency image at each level; analysing the intensity component high-frequency images and the synthetic aperture radar high-frequency images level by level with a sliding window to obtain the variance within the sliding window, and then obtaining a fused high-frequency image according to the variance within the sliding window; reconstructing an image from the fused high-frequency images and the intensity component low-frequency image; and performing an inverse colour space transformation on the reconstructed image to obtain the final fused image. In the method and the system for fusing remote sensing images, the colour space transformation and the à trous algorithm are applied to the fusion of the multispectral image and the synthetic aperture radar image, so that the spectral information and the spatial information are well preserved and the accuracy of the fusion process is greatly improved.
Description
[technical field]
The present invention relates to image analysis technology, and in particular to a remote sensing image fusion method and system.
[background technology]
With the development of remote sensing technology, various earth observation satellites continuously provide satellite remote sensing images of different spatial, temporal and spectral resolutions. To obtain a more comprehensive, clear and accurate understanding and cognition of observed targets, remote sensing image fusion technology has been developing steadily: multispectral remote sensing images are fused with synthetic aperture radar (SAR) images to achieve more accurate and more reliable estimation of and judgement on the targets.
Traditionally, multispectral remote sensing images and synthetic aperture radar images can be fused by the IHS (Intensity, Hue, Saturation) transform technique, principal component analysis (PCA) or algorithms based on wavelet decomposition. However, the IHS transform technique causes distortion of the spectral information in the image and does not take the spatial information in the image into account.
[summary of the invention]
Based on this, it is necessary to provide a remote sensing image fusion method with improved accuracy.
In addition, it is also necessary to provide a remote sensing image fusion system with improved accuracy.
A remote sensing image fusion method comprises the following steps: performing a colour space transformation on a multispectral image to obtain the intensity component of the multispectral image; performing multi-level wavelet decomposition on the intensity component of the multispectral image and on a synthetic aperture radar image by the à trous algorithm, to obtain an intensity component high-frequency image, a synthetic aperture radar high-frequency image and an intensity component low-frequency image at each level; analysing the intensity component high-frequency image and the synthetic aperture radar high-frequency image level by level with a sliding window to obtain the variance within the sliding window, and obtaining a fused high-frequency image according to the variance within the sliding window; reconstructing an image from the fused high-frequency images and the intensity component low-frequency image; and performing an inverse colour space transformation on the reconstructed image to obtain the final fused image.
Preferably, the step of performing multi-level wavelet decomposition on the intensity component of the multispectral image and on the synthetic aperture radar image by the à trous algorithm, to obtain the intensity component high-frequency image, the synthetic aperture radar high-frequency image and the intensity component low-frequency image at each level, is: passing the intensity component of the multispectral image and the synthetic aperture radar image respectively through a low-pass filter to obtain the first-level intensity component high-frequency image, synthetic aperture radar high-frequency image and intensity component low-frequency image; and, level by level, adjusting the low-pass filter of the current level according to the decomposition level, and obtaining the current level's intensity component high-frequency image, synthetic aperture radar high-frequency image and intensity component low-frequency image from the previous level's intensity component high-frequency image, synthetic aperture radar high-frequency image and intensity component low-frequency image through the low-pass filter of the current level.
Preferably, before the step of passing the intensity component of the multispectral image and the synthetic aperture radar image respectively through the low-pass filter to obtain the first-level intensity component high-frequency image and synthetic aperture radar high-frequency image, the method further comprises: selecting the low-pass filter of the corresponding M-ary wavelet decomposition according to the resolution ratio between the multispectral image and the synthetic aperture radar image.
Preferably, the step of analysing the intensity component high-frequency image and the synthetic aperture radar high-frequency image level by level with the sliding window to obtain the variance within the sliding window, and obtaining the fused high-frequency image according to the variance within the sliding window, is: moving the sliding window over the intensity component high-frequency image and the synthetic aperture radar high-frequency image respectively, and calculating the variance within the sliding window; and comparing the variances within the sliding windows at each level, and taking the high-frequency image corresponding to the maximum variance within the sliding window as the fused high-frequency image.
A remote sensing image fusion system comprises at least: a spatial transformation module, configured to perform a colour space transformation on a multispectral image to obtain the intensity component of the multispectral image; a wavelet decomposition module, configured to perform multi-level wavelet decomposition on the intensity component of the multispectral image and on a synthetic aperture radar image by the à trous algorithm to obtain an intensity component high-frequency image, a synthetic aperture radar high-frequency image and an intensity component low-frequency image at each level; a fusion module, configured to analyse the intensity component high-frequency image and the synthetic aperture radar high-frequency image level by level with a sliding window to obtain the variance within the sliding window, and to obtain a fused high-frequency image according to the variance within the sliding window; and a reconstruction module, configured to reconstruct an image from the fused high-frequency images and the intensity component low-frequency image. The spatial transformation module is further configured to perform an inverse colour space transformation on the reconstructed image to obtain the final fused image.
Preferably, the wavelet decomposition module comprises: a low-pass filter, configured to perform wavelet decomposition on the intensity component of the multispectral image and on the synthetic aperture radar image respectively to obtain the first-level intensity component high-frequency image, synthetic aperture radar high-frequency image and intensity component low-frequency image, and, level by level, to perform wavelet decomposition on the previous level's intensity component high-frequency image, synthetic aperture radar high-frequency image and intensity component low-frequency image to obtain the current level's intensity component high-frequency image, synthetic aperture radar high-frequency image and intensity component low-frequency image; and an adjustment unit, configured to adjust, level by level, the low-pass filter of the current level according to the decomposition level.
Preferably, the wavelet decomposition module further comprises: a selection unit, configured to select the low-pass filter of the corresponding M-ary wavelet decomposition according to the resolution ratio between the multispectral image and the synthetic aperture radar image.
Preferably, the fusion module comprises: a variance computation unit, configured to move the sliding window over the intensity component high-frequency image and the synthetic aperture radar high-frequency image respectively and to calculate the variance within the sliding window; and a comparison unit, configured to compare the variances within the sliding windows at each level and to take the high-frequency image corresponding to the maximum variance within the sliding window as the fused high-frequency image.
In the above remote sensing image fusion method and system, the colour space transformation and the à trous algorithm are applied to the fusion of the multispectral image and the synthetic aperture radar image, so that the spectral information and the spatial information are well preserved and the accuracy of the fusion process is greatly improved.
In the above remote sensing image fusion method and system, the multispectral image has high spectral resolution and rich spectral information, while the synthetic aperture radar image has very high spatial resolution. Fusing these two kinds of images yields a more accurate, more reliable and more complete fused image, which can play a great role in land-use classification, the classification of green areas, bare soil, buildings and road networks in urban areas, road network extraction and similar applications, so as to meet the requirements of urban digitalisation.
[description of drawings]
Fig. 1 is a flowchart of a remote sensing image fusion method in an embodiment;
Fig. 2 is a flowchart of the wavelet decomposition in an embodiment;
Fig. 3 is a flowchart of obtaining the fused high-frequency image in an embodiment;
Fig. 4 is a schematic diagram of a remote sensing image fusion system in an embodiment;
Fig. 5 is a schematic diagram of the wavelet decomposition module in an embodiment;
Fig. 6 is a schematic diagram of the fusion module in an embodiment;
Fig. 7 is a multispectral image in an embodiment;
Fig. 8 is a synthetic aperture radar image in an embodiment;
Fig. 9 is the final fused image in an embodiment.
[embodiment]
Fig. 1 shows a remote sensing image fusion method in an embodiment, which comprises the following steps:
In step S10, a colour space transformation is performed on the multispectral image to obtain the intensity component of the multispectral image. In this embodiment, the multispectral image is transformed from the RGB (Red, Green, Blue) space to the IHS space. The spectral information of the multispectral image is mainly reflected in the hue and the saturation of the IHS space, and the fusion of remote sensing images needs to add more detail information while preserving the spectral information; the multispectral image is therefore transformed from the RGB space into the IHS space.
Specifically, the multispectral image can be input into an IHS transformation model, in which I is the intensity, H is the hue, S is the saturation, and v1 and v2 are intermediate process variables from which H and S are respectively calculated.
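As a concrete illustration, the widely used linear IHS model for pansharpening can be sketched as below; the exact transformation matrix of this patent is given only in its figures and is not reproduced here, so the matrix in this sketch is an assumption, not the patent's own.

```python
import numpy as np

# A minimal sketch of a forward IHS transform, assuming the common linear
# model (I, v1, v2) with H = atan2(v2, v1) and S = sqrt(v1^2 + v2^2).
RGB_TO_IHS = np.array([
    [1.0 / 3.0,          1.0 / 3.0,           1.0 / 3.0],
    [-np.sqrt(2) / 6.0,  -np.sqrt(2) / 6.0,   2.0 * np.sqrt(2) / 6.0],
    [1.0 / np.sqrt(2),   -1.0 / np.sqrt(2),   0.0],
])

def rgb_to_ihs(rgb):
    """rgb: float array of shape (rows, cols, 3). Returns (I, H, S) planes."""
    flat = rgb.reshape(-1, 3) @ RGB_TO_IHS.T          # rows of (I, v1, v2)
    i, v1, v2 = (flat[:, k].reshape(rgb.shape[:2]) for k in range(3))
    h = np.arctan2(v2, v1)                            # hue from v1, v2
    s = np.sqrt(v1 ** 2 + v2 ** 2)                    # saturation from v1, v2
    return i, h, s
```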
In step S30, multi-level wavelet decomposition is performed on the intensity component of the multispectral image and on the synthetic aperture radar image by the à trous algorithm, obtaining the intensity component high-frequency image, the synthetic aperture radar high-frequency image and the intensity component low-frequency image at each level. In this embodiment, the à trous ("with holes") algorithm is shift-invariant, eliminates defects such as spectral leakage and spectral aliasing in the image, and does not destroy the spatial continuity of edges. Wavelet decomposition with a preset number of levels is performed on the intensity component of the multispectral image and on the synthetic aperture radar image; the number of decomposition levels can be set according to the fusion effect and is typically 2.
Wavelet decomposition is performed on the intensity component of the multispectral image, i.e. the I component, while the H component (hue) and the S component (saturation) are retained. After the IHS transformation of the multispectral image, its spectral information is mainly concentrated in the intensity component, and decomposing the hue component and the saturation component gives poor results; therefore only the intensity component is decomposed and the hue and saturation components are retained.
In a specific embodiment, as shown in Fig. 2, step S30 proceeds as follows:
In step S301, the intensity component of the multispectral image and the synthetic aperture radar image are each passed through a low-pass filter to obtain the first-level intensity component high-frequency image, synthetic aperture radar high-frequency image and intensity component low-frequency image. In this embodiment, the intensity component of the multispectral image and the synthetic aperture radar image are taken as the image to be decomposed, and wavelet decomposition is performed according to the following formulas:
I_1 = I_0 * H_0
G_1 = I_0 - I_1
where I_0 is the image to be decomposed, I_1 is the low-frequency image after one level of wavelet decomposition, H_0 is the low-pass filter, which satisfies a constraint in which M denotes the M-ary wavelet transform and δ(k) is the Dirac delta function, and G_1 is the high-frequency image after one level of wavelet decomposition.
In step S303, the low-pass filter of the current level is adjusted level by level according to the decomposition level, and the current level's intensity component high-frequency image, synthetic aperture radar high-frequency image and intensity component low-frequency image are obtained from the previous level's intensity component high-frequency image, synthetic aperture radar high-frequency image and intensity component low-frequency image through the low-pass filter of the current level. In this embodiment, according to the number of decomposition levels L, the L-th level of wavelet decomposition can be carried out by the following formulas:
I_L = I_{L-1} * H_{L-1}
G_L = I_{L-1} - I_L
where I_{L-1} is the low-frequency image after L-1 levels of wavelet decomposition, H_{L-1} is the low-pass filter used at this decomposition level, I_L is the low-frequency image after L levels of wavelet decomposition, and G_L is the high-frequency image at the L-th level of wavelet decomposition.
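To make the recursion I_L = I_{L-1} * H_{L-1}, G_L = I_{L-1} - I_L concrete, the sketch below implements an à trous style decomposition; the B3 spline kernel and the dyadic dilation of the filter between levels are common choices from the à trous literature and are assumptions here, since the patent instead selects an M-ary low-pass filter from the resolution ratio.

```python
import numpy as np
from scipy.ndimage import convolve

# Assumed B3 spline smoothing kernel (not specified by the patent).
B3 = np.array([1.0, 4.0, 6.0, 4.0, 1.0]) / 16.0
B3_2D = np.outer(B3, B3)

def dilate_kernel(kernel, step):
    """Insert 'holes' (zeros) between kernel taps; this dilation is how the
    a trous scheme adjusts the filter from level to level."""
    size = (kernel.shape[0] - 1) * step + 1
    out = np.zeros((size, size))
    out[::step, ::step] = kernel
    return out

def atrous_decompose(image, levels=2):
    """Return the detail planes [G_1 .. G_L] and the low-frequency residual I_L."""
    details, low = [], np.asarray(image, dtype=float)
    for level in range(levels):
        h = dilate_kernel(B3_2D, 2 ** level)            # filter for this level
        smoothed = convolve(low, h, mode='reflect')      # I_L = I_{L-1} * H_{L-1}
        details.append(low - smoothed)                   # G_L = I_{L-1} - I_L
        low = smoothed
    return details, low
```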
In the above remote sensing image fusion method, before step S301 there is a further step of selecting the low-pass filter of the corresponding M-ary wavelet decomposition according to the resolution ratio between the multispectral image and the synthetic aperture radar image. In this embodiment, in order to fuse images of an arbitrary resolution ratio and to guarantee the accuracy of the fused result, a suitable low-pass filter must also be chosen before the wavelet decomposition. Specifically, the ratio M (M ≥ 2) of the spatial resolution of the multispectral image to the spatial resolution of the synthetic aperture radar image is computed, and the low-pass filter of the M-ary wavelet decomposition is selected. For example, if the spatial resolution of the multispectral image is 30 metres and the spatial resolution of the synthetic aperture radar image is 10 metres, the spatial resolution ratio is 3 and the low-pass filter of the ternary wavelet decomposition is chosen.
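A hypothetical helper for the resolution-ratio rule just described (only the arity selection is sketched; the actual M-ary filter coefficients are not given in this text):

```python
def select_decomposition_arity(ms_resolution_m, sar_resolution_m):
    """The ratio of multispectral to SAR ground resolution decides the M-ary
    wavelet decomposition; the patent requires M >= 2."""
    ratio = ms_resolution_m / sar_resolution_m
    return max(2, int(round(ratio)))

# e.g. 30 m multispectral vs 10 m SAR -> M = 3, i.e. a ternary decomposition
assert select_decomposition_arity(30, 10) == 3
```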
In step S50, the intensity component high-frequency image and the synthetic aperture radar high-frequency image are analysed level by level with a sliding window to obtain the variance within the sliding window, and the fused high-frequency image is obtained according to the variance within the sliding window. In this embodiment, the size of the sliding window can be set arbitrarily, for example 3×3, 5×5 or 7×7. In a preferred embodiment, since a larger window is more likely to damage image information, a 3×3 sliding window is chosen according to the final fusion effect.
In a specific embodiment, as shown in Fig. 3, step S50 proceeds as follows:
In step S501, the sliding window is moved over the intensity component high-frequency image and the synthetic aperture radar high-frequency image respectively, and the variance within the sliding window is calculated. In this embodiment, at each level of the wavelet decomposition, the sliding window is moved over the intensity component high-frequency image and the synthetic aperture radar high-frequency image and the variance within the window is computed, so that each level has a corresponding series of sliding-window variances.
In step S503, the variances within the sliding windows at each level are compared, and the high-frequency image corresponding to the maximum variance within the sliding window is taken as the fused high-frequency image.
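A sketch of this max-variance selection rule at one decomposition level is given below; computing the local variance with a uniform (box) filter and selecting coefficients per pixel are illustrative assumptions consistent with the 3×3 window preferred above.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def local_variance(image, window=3):
    """Variance inside a sliding window, computed as E[x^2] - (E[x])^2."""
    mean = uniform_filter(image, size=window, mode='reflect')
    mean_sq = uniform_filter(image ** 2, size=window, mode='reflect')
    return mean_sq - mean ** 2

def fuse_details(intensity_detail, sar_detail, window=3):
    """Keep, at each position, the high-frequency coefficient whose sliding
    window has the larger variance (the selection rule of step S503)."""
    var_i = local_variance(intensity_detail, window)
    var_s = local_variance(sar_detail, window)
    return np.where(var_i >= var_s, intensity_detail, sar_detail)
```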
In step S70, an image is reconstructed from the fused high-frequency images and the intensity component low-frequency image. In this embodiment, the fused high-frequency images are combined with the intensity component low-frequency image obtained by the last level of wavelet decomposition; the reconstruction algorithm is:
I = I_L + Σ_{i=1}^{L} G_i
where I is the reconstructed image, L is the number of decomposition levels, G_i is the fused high-frequency image at the i-th level, and I_L is the intensity component low-frequency image at the L-th level.
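Under the same assumptions as the decomposition sketch, the reconstruction step is a single sum:

```python
import numpy as np

def atrous_reconstruct(fused_details, intensity_low):
    """I = I_L + sum of the fused detail planes G_1 .. G_L."""
    return intensity_low + np.sum(np.stack(fused_details), axis=0)
```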
In step S90, an inverse colour space transformation is performed on the reconstructed image to obtain the final fused image. In this embodiment, the reconstructed image is transformed from the IHS space back to the RGB space by the corresponding inverse transformation model to obtain the final fused image.
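The matrix of that inverse model likewise appears only in the patent figures; as an assumption, the sketch below simply inverts the linear model used in the earlier forward-transform sketch.

```python
import numpy as np

# Inverse of the assumed linear IHS model from the forward sketch; it is not
# the patent's own matrix, which is not reproduced in this text.
RGB_TO_IHS = np.array([
    [1.0 / 3.0,          1.0 / 3.0,           1.0 / 3.0],
    [-np.sqrt(2) / 6.0,  -np.sqrt(2) / 6.0,   2.0 * np.sqrt(2) / 6.0],
    [1.0 / np.sqrt(2),   -1.0 / np.sqrt(2),   0.0],
])
IHS_TO_RGB = np.linalg.inv(RGB_TO_IHS)

def ihs_to_rgb(i, h, s):
    """Rebuild the process variables v1, v2 from hue and saturation,
    then invert the linear transform back to RGB."""
    v1 = s * np.cos(h)
    v2 = s * np.sin(h)
    ihs = np.stack([i, v1, v2], axis=-1)               # (rows, cols, 3)
    rgb = ihs.reshape(-1, 3) @ IHS_TO_RGB.T
    return rgb.reshape(ihs.shape)
```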
In addition, a remote sensing image fusion system is also provided. As shown in Fig. 4, the system comprises a spatial transformation module 10, a wavelet decomposition module 30, a fusion module 50 and a reconstruction module 70.
Specifically, the multispectral image can be input into an IHS transformation model, in which I is the intensity, H is the hue, S is the saturation, and v1 and v2 are intermediate process variables from which H and S are respectively calculated.
The fusion module 50 is configured to analyse the intensity component high-frequency image and the synthetic aperture radar high-frequency image level by level with a sliding window to obtain the variance within the sliding window, and to obtain the fused high-frequency image according to the variance within the sliding window. In this embodiment, the size of the sliding window can be set arbitrarily, for example 3×3, 5×5 or 7×7. In a preferred embodiment, since a larger window is more likely to damage image information, the fusion module 50 chooses a 3×3 sliding window according to the final fusion effect.
The reconstruction module 70 is configured to reconstruct an image from the fused high-frequency images and the intensity component low-frequency image. In this embodiment, the reconstruction module 70 combines the fused high-frequency images with the intensity component low-frequency image obtained by the last level of wavelet decomposition; the reconstruction algorithm is:
I = I_L + Σ_{i=1}^{L} G_i
where I is the reconstructed image, L is the number of decomposition levels, G_i is the fused high-frequency image at the i-th level, and I_L is the intensity component low-frequency image at the L-th level.
In the above remote sensing image fusion system, the spatial transformation module 10 is further configured to perform an inverse colour space transformation on the reconstructed image to obtain the final fused image. In this embodiment, the spatial transformation module 10 transforms the reconstructed image from the IHS space back to the RGB space by the corresponding inverse transformation model to obtain the final fused image.
In a specific embodiment, as shown in Fig. 5, the wavelet decomposition module 30 comprises a low-pass filter 301 and an adjustment unit 303. The low-pass filter 301 performs the first level of wavelet decomposition according to:
I_1 = I_0 * H_0
G_1 = I_0 - I_1
where I_0 is the image to be decomposed, I_1 is the low-frequency image after one level of wavelet decomposition, H_0 is the low-pass filter (M denotes the M-ary wavelet transform and δ(k) is the Dirac delta function), and G_1 is the high-frequency image after one level of wavelet decomposition. The L-th level of wavelet decomposition follows:
I_L = I_{L-1} * H_{L-1}
G_L = I_{L-1} - I_L
where I_{L-1} is the low-frequency image after L-1 levels of wavelet decomposition, H_{L-1} is the low-pass filter used at this decomposition level, I_L is the low-frequency image after L levels of wavelet decomposition, and G_L is the high-frequency image at the L-th level of wavelet decomposition.
In another embodiment, the wavelet decomposition module 30 further comprises a selection unit. The selection unit is configured to select the low-pass filter 301 of the corresponding M-ary wavelet decomposition according to the resolution ratio between the multispectral image and the synthetic aperture radar image. In this embodiment, in order to fuse images of an arbitrary resolution ratio and to guarantee the accuracy of the fused result, the selection unit also needs to choose a suitable low-pass filter before the wavelet decomposition. Specifically, the selection unit computes the ratio M (M ≥ 2) of the spatial resolution of the multispectral image to the spatial resolution of the synthetic aperture radar image and selects the low-pass filter of the M-ary wavelet decomposition. For example, if the spatial resolution of the multispectral image is 30 metres and the spatial resolution of the synthetic aperture radar image is 10 metres, the spatial resolution ratio is 3 and the low-pass filter of the ternary wavelet decomposition is chosen.
In another specific embodiment, as shown in Fig. 6, the fusion module 50 comprises a variance computation unit 501 and a comparison unit 503.
The variance computation unit 501 is configured to move the sliding window over the intensity component high-frequency image and the synthetic aperture radar high-frequency image respectively and to calculate the variance within the sliding window. In this embodiment, at each level of the wavelet decomposition, the variance computation unit 501 moves the sliding window over the intensity component high-frequency image and the synthetic aperture radar high-frequency image and computes the variance within the window, so that each level has a corresponding series of sliding-window variances.
The comparison unit 503 is configured to compare the variances within the sliding windows at each level and to take the high-frequency image corresponding to the maximum variance within the sliding window as the fused high-frequency image.
The remote sensing image fusion method and system are illustrated below with an embodiment. In this embodiment, as shown in Fig. 7, the spatial resolution of the multispectral image is 10 metres; as shown in Fig. 8, the spatial resolution of the synthetic aperture radar image is 2.5 metres; the ratio of the spatial resolution of the multispectral image to that of the synthetic aperture radar image is therefore 4, so the low-pass filter of the quaternary wavelet decomposition is chosen for wavelet decomposition and reconstruction. After the inverse colour space transformation, the final fused image shown in Fig. 9 is obtained. The final fused image not only preserves the spectral characteristics of the multispectral image but also adds texture detail.
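Putting the earlier sketches together, a hypothetical end-to-end run over a co-registered multispectral/SAR pair could look as follows; the function names refer to the sketches above and are assumptions, not the patent's own code.

```python
# Hypothetical glue code built on the sketch functions defined earlier
# (rgb_to_ihs, atrous_decompose, fuse_details, atrous_reconstruct, ihs_to_rgb).
def fuse_ms_sar(ms_rgb, sar, levels=2, window=3):
    i, h, s = rgb_to_ihs(ms_rgb)                        # step S10
    i_details, i_low = atrous_decompose(i, levels)      # step S30 (intensity)
    sar_details, _ = atrous_decompose(sar, levels)      # step S30 (SAR)
    fused = [fuse_details(gi, gs, window)               # step S50
             for gi, gs in zip(i_details, sar_details)]
    i_fused = atrous_reconstruct(fused, i_low)          # step S70
    return ihs_to_rgb(i_fused, h, s)                    # step S90
```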
In the above remote sensing image fusion method and system, the colour space transformation and the à trous algorithm are applied to the fusion of the multispectral image and the synthetic aperture radar image, so that the spectral information and the spatial information are well preserved and the accuracy of the fusion process is greatly improved.
In the above remote sensing image fusion method and system, the multispectral image has high spectral resolution and rich spectral information, while the synthetic aperture radar image has very high spatial resolution. Fusing these two kinds of images yields a more accurate, more reliable and more complete fused image, which can play a great role in land-use classification, the classification of green areas, bare soil, buildings and road networks in urban areas, road network extraction and similar applications, so as to meet the requirements of urban digitalisation.
The above embodiments express only several implementations of the present invention, and their description is relatively specific and detailed, but they should not be construed as limiting the scope of the claims. It should be pointed out that a person of ordinary skill in the art can make several variations and improvements without departing from the concept of the invention, all of which fall within the protection scope of the present invention. Therefore, the protection scope of the patent shall be subject to the appended claims.
Claims (8)
1. A remote sensing image fusion method, comprising the following steps:
performing a colour space transformation on a multispectral image to obtain the intensity component of the multispectral image;
performing multi-level wavelet decomposition on the intensity component of the multispectral image and on a synthetic aperture radar image by the à trous algorithm, to obtain an intensity component high-frequency image, a synthetic aperture radar high-frequency image and an intensity component low-frequency image at each level;
analysing the intensity component high-frequency image and the synthetic aperture radar high-frequency image level by level with a sliding window to obtain the variance within the sliding window, and obtaining a fused high-frequency image according to the variance within said sliding window;
reconstructing an image from said fused high-frequency image and the intensity component low-frequency image; and
performing an inverse colour space transformation on the reconstructed image to obtain the final fused image.
2. The remote sensing image fusion method according to claim 1, wherein the step of performing multi-level wavelet decomposition on the intensity component of the multispectral image and on the synthetic aperture radar image by the à trous algorithm, to obtain the intensity component high-frequency image, the synthetic aperture radar high-frequency image and the intensity component low-frequency image at each level, is:
passing the intensity component of the multispectral image and the synthetic aperture radar image respectively through a low-pass filter to obtain the first-level intensity component high-frequency image, synthetic aperture radar high-frequency image and intensity component low-frequency image; and
level by level, adjusting the low-pass filter of the current level according to the decomposition level, and obtaining the current level's intensity component high-frequency image, synthetic aperture radar high-frequency image and intensity component low-frequency image from the previous level's intensity component high-frequency image, synthetic aperture radar high-frequency image and intensity component low-frequency image through the low-pass filter of the current level.
3. The remote sensing image fusion method according to claim 2, wherein, before the step of passing the intensity component of the multispectral image and the synthetic aperture radar image respectively through the low-pass filter to obtain the first-level intensity component high-frequency image and synthetic aperture radar high-frequency image, the method further comprises:
selecting the low-pass filter of the corresponding M-ary wavelet decomposition according to the resolution ratio between said multispectral image and the synthetic aperture radar image.
4. The remote sensing image fusion method according to claim 1, wherein the step of analysing the intensity component high-frequency image and the synthetic aperture radar high-frequency image level by level with the sliding window to obtain the variance within the sliding window, and obtaining the fused high-frequency image according to the variance within said sliding window, is:
moving said sliding window over said intensity component high-frequency image and the synthetic aperture radar high-frequency image respectively, and calculating the variance within said sliding window; and
comparing the variances within said sliding windows at each level, and taking the high-frequency image corresponding to the maximum variance within said sliding window as the fused high-frequency image.
5. A remote sensing image fusion system, characterised by comprising at least:
a spatial transformation module, configured to perform a colour space transformation on a multispectral image to obtain the intensity component of the multispectral image;
a wavelet decomposition module, configured to perform multi-level wavelet decomposition on the intensity component of the multispectral image and on a synthetic aperture radar image by the à trous algorithm, to obtain an intensity component high-frequency image, a synthetic aperture radar high-frequency image and an intensity component low-frequency image at each level;
a fusion module, configured to analyse the intensity component high-frequency image and the synthetic aperture radar high-frequency image level by level with a sliding window to obtain the variance within the sliding window, and to obtain a fused high-frequency image according to the variance within said sliding window; and
a reconstruction module, configured to reconstruct an image from said fused high-frequency image and the intensity component low-frequency image;
wherein said spatial transformation module is further configured to perform an inverse colour space transformation on the reconstructed image to obtain the final fused image.
6. The remote sensing image fusion system according to claim 5, wherein said wavelet decomposition module comprises:
a low-pass filter, configured to perform wavelet decomposition on the intensity component of the multispectral image and on the synthetic aperture radar image respectively to obtain the first-level intensity component high-frequency image, synthetic aperture radar high-frequency image and intensity component low-frequency image, and, level by level, to perform wavelet decomposition on the previous level's intensity component high-frequency image, synthetic aperture radar high-frequency image and intensity component low-frequency image to obtain the current level's intensity component high-frequency image, synthetic aperture radar high-frequency image and intensity component low-frequency image; and
an adjustment unit, configured to adjust, level by level, said low-pass filter of the current level according to the decomposition level.
7. The remote sensing image fusion system according to claim 6, wherein said wavelet decomposition module further comprises:
a selection unit, configured to select the low-pass filter of the corresponding M-ary wavelet decomposition according to the resolution ratio between said multispectral image and the synthetic aperture radar image.
8. The remote sensing image fusion system according to claim 5, wherein said fusion module comprises:
a variance computation unit, configured to move said sliding window over said intensity component high-frequency image and the synthetic aperture radar high-frequency image respectively, and to calculate the variance within said sliding window; and
a comparison unit, configured to compare the variances within said sliding windows at each level, and to take the high-frequency image corresponding to the maximum variance within said sliding window as the fused high-frequency image.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN2011101561616A CN102254311A (en) | 2011-06-10 | 2011-06-10 | Method and system for fusing remote sensing images |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN2011101561616A CN102254311A (en) | 2011-06-10 | 2011-06-10 | Method and system for fusing remote sensing images |
Publications (1)
Publication Number | Publication Date |
---|---|
CN102254311A true CN102254311A (en) | 2011-11-23 |
Family
ID=44981553
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN2011101561616A Pending CN102254311A (en) | 2011-06-10 | 2011-06-10 | Method and system for fusing remote sensing images |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN102254311A (en) |
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN1472544A (en) * | 2003-06-05 | 2004-02-04 | 上海交通大学 | Remote sensing image picture element and characteristic combination optimizing mixing method |
CN1484039A (en) * | 2003-07-24 | 2004-03-24 | 上海交通大学 | Image merging method based on inseparable wavelet frame |
CN1581230A (en) * | 2004-05-20 | 2005-02-16 | 上海交通大学 | Remote-senstive image interfusion method based on image local spectrum characteristic |
CN1770201A (en) * | 2004-11-05 | 2006-05-10 | 北京师范大学 | Adjustable remote sensing image fusion method based on wavelet transform |
Non-Patent Citations (1)
Title |
---|
NIANLONG HAN ET AL.: "Multi-band À Trous Wavelet Transform for Multisensor Image Fusion", 2010 3RD INTERNATIONAL CONGRESS ON IMAGE AND SIGNAL PROCESSING (CISP), 18 October 2010 |
Cited By (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102609929A (en) * | 2012-01-12 | 2012-07-25 | 河南大学 | Self-adaptive independent-information remote sensing image fusion method |
CN102663394A (en) * | 2012-03-02 | 2012-09-12 | 北京航空航天大学 | Method of identifying large and medium-sized objects based on multi-source remote sensing image fusion |
CN102663394B (en) * | 2012-03-02 | 2013-09-25 | 北京航空航天大学 | Method of identifying large and medium-sized objects based on multi-source remote sensing image fusion |
CN104021536A (en) * | 2014-06-16 | 2014-09-03 | 西北工业大学 | Self-adaptation SAR image and multispectral image fusion method |
CN104021536B (en) * | 2014-06-16 | 2017-01-04 | 西北工业大学 | A kind of adaptive SAR image and Multispectral Image Fusion Methods |
CN104156911A (en) * | 2014-07-18 | 2014-11-19 | 苏州阔地网络科技有限公司 | Processing method and system for image fusion |
CN106842165A (en) * | 2017-03-16 | 2017-06-13 | 电子科技大学 | One kind is based on different distance angular resolution radar centralization asynchronous fusion method |
CN106842165B (en) * | 2017-03-16 | 2020-02-18 | 电子科技大学 | Radar centralized asynchronous fusion method based on different distance angular resolutions |
CN108764326A (en) * | 2018-05-23 | 2018-11-06 | 北京工业大学 | Urban impervious surface extracting method based on depth confidence network |
CN109886904A (en) * | 2019-01-25 | 2019-06-14 | 北京市遥感信息研究所 | A kind of SAR image and low resolution Multispectral Image Fusion Methods and system |
CN109886904B (en) * | 2019-01-25 | 2021-08-10 | 北京市遥感信息研究所 | SAR image and low-resolution multispectral image fusion method and system |
CN112634185A (en) * | 2020-12-17 | 2021-04-09 | 中国人民解放军火箭军工程大学 | SAR image and optical image fusion method based on HSL and image entropy |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
C12 | Rejection of a patent application after its publication | ||
RJ01 | Rejection of invention patent application after publication |
Application publication date: 20111123 |