CN103049898A - Method for fusing multispectral and full-color images with light cloud - Google Patents

Method for fusing multispectral and full-color images with light cloud

Info

Publication number
CN103049898A
CN103049898A · CN2013100308198A · CN201310030819A
Authority
CN
China
Prior art keywords
image
multispectral
full
background image
colour
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN2013100308198A
Other languages
Chinese (zh)
Other versions
CN103049898B (en)
Inventor
刘芳
石程
李玲玲
郝红侠
戚玉涛
焦李成
郑莹
尚荣华
马文萍
马晶晶
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Xidian University
Original Assignee
Xidian University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Xidian University filed Critical Xidian University
Priority to CN201310030819.8A priority Critical patent/CN103049898B/en
Publication of CN103049898A publication Critical patent/CN103049898A/en
Application granted granted Critical
Publication of CN103049898B publication Critical patent/CN103049898B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Image Processing (AREA)

Abstract

The invention provides a method for fusing multispectral and panchromatic (full-color) images that contain thin cloud. It mainly addresses the problem that, after conventional fusion of such images, the cloud-covered regions remain disturbed by the cloud layer. The implementation steps are: sample and filter the thin-cloud multispectral image and obtain background images of the multispectral and panchromatic images; remove the thin cloud from the multispectral and panchromatic images respectively; apply a PCA transform to the cloud-removed multispectral image, and apply a Shearlet decomposition to the resulting first principal component image and to the panchromatic image; take the low-frequency coefficient of the first principal component as the low-frequency coefficient of the fused component, and take the high-frequency coefficients of the panchromatic image as the high-frequency coefficients of the fused component; and apply the inverse PCA transform to the fused component together with the other PCA components to obtain the fused image. The method yields high definition in the thin-cloud regions of the fused image and good spectral preservation, and can be used for military target recognition, weather and environment monitoring, land use, urban planning, and natural-disaster prevention and mitigation.

Description

Multispectral and panchromatic image fusion method with thin cloud
Technical field
The invention belongs to the field of intelligent image processing and relates to thin-cloud removal and image fusion methods. It can be applied to military target recognition, weather monitoring, environmental monitoring, land use, urban planning, natural-disaster prevention and mitigation, and other fields.
Background technology
Since the United States launched the Earth Resources Satellite program in 1972, satellite technology has developed rapidly worldwide. To date, the major developed countries and a number of developing countries, including China and India, have launched hundreds of satellites, whose operating bands cover a wide frequency range from visible light to invisible near-infrared, short-wave infrared, mid-infrared, far-infrared and microwave. Every day these satellites transmit massive volumes of remote-sensing data covering the whole globe to ground stations and mobile receiving stations distributed around the world. Because of its spatial continuity and temporal sequentiality, satellite remote-sensing data has so far been the only resource that can provide dynamic observation data of global extent.
Most of the satellite remote-sensing data in widespread use today are optical images. Optical images generally carry abundant information, have high resolution, and are stable. However, because cloud and mist are commonly present in the atmosphere during imaging, remote-sensing satellites often record cloud and mist information together with the image data. Cloud and mist occlusion prevents the sensor from acquiring the ground-object information in the covered area, which seriously degrades the quality of remote-sensing optical imagery. When part of an image is covered by thick cloud, the sensor receives no ground-object information at all; under thinner cloud the sensor still receives part of the ground-object information, but this incomplete information seriously affects target recognition and classification in the image and the accuracy of ground-object information extraction.
To effectively improve the utilization of remote-sensing optical imagery, several methods are available to reduce or remove the influence of thin cloud and mist on images. The main methods currently in use are the multispectral method, the multi-image superposition method, image-enhancement-based methods, and image-restoration-based methods.
The multispectral method uses a special sensor, or exploits the stronger sensitivity of certain bands of a multispectral image to cloud and mist, to detect the cloud and mist information, which is then subtracted from the original image to obtain a cloud-free image. This method removes the cloud and mist effectively, but it requires adding a sensor or band that is sensitive to cloud and mist, which is costly, so its application is limited.
The multi-image superposition method superposes images of the same area taken in different seasons or at different times to obtain a composite frame. Because the ground-object information of the same area usually changes over time, the interpretability of the superposed image is seriously affected.
Image-restoration-based cloud removal analyzes the mechanism and process of thin-cloud image degradation, derives the corresponding inverse process, and thereby recovers the original image. Processing an image from the restoration point of view, however, requires knowledge of the degradation mechanism and process. Since the degree of degradation depends on the distance between the object and the camera, auxiliary information such as the atmospheric extinction coefficient and the object-to-camera distance must be incorporated during processing, and acquiring this information is expensive, so the method is difficult to apply widely.
Image-enhancement-based cloud removal usually takes one of two forms, homomorphic filtering and low-frequency filtering. Homomorphic filtering removes thin cloud by compressing the dynamic range of the image and boosting its high-frequency components; it applies a single filter without distinguishing image details and therefore adapts poorly. Low-frequency filtering obtains a background image by Gaussian low-pass filtering and subtracts it from the original image to remove the thin cloud. It removes thin cloud and mist to a certain extent, but it also weakens the image background; although compensation was later introduced to improve this, the results show that the method still loses information in the cloud-removed image.
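By way of illustration only, a minimal sketch of the low-frequency-filtering idea described above might look as follows in Python; the function name and the sigma value are assumptions made for this sketch, not values taken from the patent.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def remove_thin_cloud_lowpass(band, sigma=30.0):
    """Sketch of low-frequency-filtering thin-cloud removal: estimate the
    background (where thin cloud mostly lives) with a Gaussian low-pass
    filter, subtract it, and add the global mean back to keep brightness."""
    band = band.astype(np.float64)
    background = gaussian_filter(band, sigma=sigma)  # low-frequency background estimate
    return band - background + band.mean()           # detail plus the original mean level
```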
Because an image from a single sensor rarely satisfies practical requirements in terms of both spectral information and resolution, the redundancy of multi-sensor image data is exploited through fusion, combining the information of images from different sensors into one image. For example, a multispectral image has good spectral information but low spatial resolution, while a panchromatic image has high spatial resolution but limited spectral information; fusion produces a high-resolution multispectral image that describes ground objects more accurately. When the multispectral and panchromatic images contain cloud and mist, however, fusion cannot remove that cloud and mist. Existing practice for fusing a thin-cloud multispectral image with a thin-cloud panchromatic image is to remove the cloud and mist from each image separately and then fuse them. When single-sensor images are de-clouded separately, the image-enhancement approach is preferred from the standpoint of information-acquisition cost, but image enhancement inevitably loses image information. Exploiting the redundancy between the multispectral and panchromatic images to find a more effective fusion method for thin-cloud images is therefore an urgent problem.
Summary of the invention
The object of the invention is to overcome the above deficiencies of the prior art in fusing thin-cloud images by providing a method for fusing multispectral and panchromatic images containing thin cloud, so as to reduce the interference of the cloud layer with the information of thin-cloud regions after fusion, reduce the information loss of the fused image, improve the sharpness of the fused image, and describe ground-object information more accurately.
To achieve the above object, the main content of the invention is as follows:
1) Up-sample the multispectral image with thin cloud so that the size of the up-sampled multispectral image equals that of the panchromatic image with thin cloud;
2) Perform, in turn, down-sampling, Gaussian low-pass filtering and up-sampling on the up-sampled multispectral image and the panchromatic image respectively, obtaining the background image B1 of the multispectral image and the background image B2 of the panchromatic image, where the number of bands of the background image of the multispectral image equals the number of bands of the multispectral image;
3) Perform the thin-cloud removal operation on each band of the multispectral image and on the panchromatic image respectively, obtaining the cloud-removed multispectral image I1 and the cloud-removed panchromatic image I2;
4) Apply the PCA transform to the cloud-removed multispectral image I1 to obtain the component images P1, P2, ..., Pn, where Pi denotes the i-th principal component obtained from the PCA transform of the multispectral image, i = 1, 2, ..., n, and n is the total number of components;
5) Apply the Shearlet decomposition to the first principal component image P1 obtained from the PCA transform and to the cloud-removed panchromatic image I2 respectively, decomposing the first principal component image P1 into one low-frequency coefficient x1 and several direction sub-band coefficients y1, y2, ..., ym, and decomposing the cloud-removed panchromatic image I2 into one low-frequency coefficient x2 and several direction sub-band coefficients z1, z2, ..., zm;
6) For the background image of each band of the multispectral image obtained in step 2), compute the average gray level to obtain the synthetic background image B1' of the multispectral image;
7) Build the weight matrix w1 from the synthetic background image B1' of the multispectral image, and build the weight matrix w2 from the background image B2 of the panchromatic image;
8) Multiply each direction sub-band coefficient z1, z2, ..., zm of the panchromatic image I2 by the weight matrices w1 and w2, obtaining the direction sub-band coefficients l1, l2, ..., lm of the fused first component image, and take the low-frequency coefficient x1 of the first principal component image as the low-frequency coefficient k of the fused first component image;
9) Apply the inverse Shearlet transform to the low-frequency coefficient k and the direction sub-band coefficients l1, l2, ..., lm of the fused first component image, obtaining the fused first component image P1';
10) Combine the fused first component image P1' with the other component images P2, ..., Pn obtained in step 4) (all components except the first) to form a new data set, and apply the inverse PCA transform to this data set to obtain the fused image I.
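Read end to end, the ten steps amount to the pipeline outlined below. This is only an illustrative sketch, assuming Python with NumPy-style arrays; every helper name (upsample_to, compute_background, remove_thin_cloud, pca_forward, shearlet_decompose, weight_matrix, shearlet_reconstruct, pca_inverse) is hypothetical and stands for the operation described in the corresponding step, not for any library function.

```python
def fuse_with_thin_cloud(ms, pan):
    """Illustrative outline of steps 1)-10); all helper names are hypothetical."""
    ms_up = upsample_to(ms, pan.shape)                                   # step 1
    bg_ms = compute_background(ms_up)                                    # step 2 (per band)
    bg_pan = compute_background(pan)
    ms_clean = remove_thin_cloud(ms_up, bg_ms)                           # step 3
    pan_clean = remove_thin_cloud(pan, bg_pan)
    pcs, mean, eigvecs = pca_forward(ms_clean)                           # step 4
    low1, dirs1 = shearlet_decompose(pcs[0])                             # step 5
    low2, dirs2 = shearlet_decompose(pan_clean)
    bg_syn = bg_ms.mean(axis=0)                                          # step 6
    w1, w2 = weight_matrix(bg_syn), weight_matrix(bg_pan)                # step 7
    fused_dirs = [w1 * w2 * d for d in dirs2]                            # step 8
    pcs[0] = shearlet_reconstruct(low1, fused_dirs)                      # steps 9-10
    return pca_inverse(pcs, mean, eigvecs)                               # fused image I
```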
Compared with the prior art, the invention has the following advantages:
(a) Because the invention obtains the background image from the up-sampled image, approximating the background of the original image by resampling the up-sampled image, it exploits the fact that the information of the up-sampled image is very smooth, so the approximate background image obtained by sampling is close to the real background. This overcomes the high time complexity of traditional filtering algorithms and thus speeds up thin-cloud removal.
(b) Because the invention uses the background images to build weight matrices that enhance the detail information of the thin-cloud regions of the fused image, exploiting the fact that thin-cloud regions have higher gray values in the image, it overcomes the loss of thin-cloud-region information and the resulting blurring found in traditional fused images, and thus improves the sharpness of the fused image.
Description of drawings
Fig. 1 is a flow chart of the method of the invention for fusing multispectral and panchromatic images with thin cloud;
Fig. 2 shows the QuickBird satellite images with thin cloud used by the invention;
Fig. 3 shows the fused image obtained with the method of the invention.
Embodiment
Referring to Fig. 1, the steps of the method of the invention for fusing multispectral and panchromatic images with thin cloud are as follows:
Step 1: Up-sample the multispectral image with thin cloud so that the size of the up-sampled multispectral image equals that of the panchromatic image with thin cloud, where the original multispectral image, shown in Fig. 2(a), is denoted Y1 and the original panchromatic image, shown in Fig. 2(b), is denoted Y2.
Step 2: Resample and Gaussian low-pass filter the up-sampled multispectral image and the original panchromatic image respectively to obtain the background image B1 of the thin-cloud multispectral image and the background image B2 of the panchromatic image.
2.1) Down-sample the up-sampled multispectral image and the original panchromatic image respectively; the size of the down-sampled multispectral and panchromatic images is 1/3 of the size of the original panchromatic image;
2.2) Apply Gaussian low-pass filtering with a Gaussian filter to the down-sampled multispectral image and panchromatic image respectively, obtaining the background images of the down-sampled multispectral image and panchromatic image, where the number of bands of the background image of the down-sampled multispectral image equals the number of bands of the multispectral image;
2.3) Up-sample the down-sampled background images to obtain the background image B1 of the multispectral image and the background image B2 of the panchromatic image; the resulting images have the same size as the original panchromatic image.
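A minimal sketch of this background-image computation for a single band, assuming Python with scipy.ndimage; the Gaussian sigma and the interpolation order are assumptions, while the 1/3 down-sampling factor comes from step 2.1). For the multispectral image the routine would be applied band by band.

```python
import numpy as np
from scipy.ndimage import gaussian_filter, zoom

def compute_background(band, factor=3, sigma=2.0):
    """Step 2 sketch for one band: down-sample to 1/3 size, apply Gaussian
    low-pass filtering, then up-sample back to the original panchromatic size."""
    band = band.astype(np.float64)
    small = zoom(band, 1.0 / factor, order=1)                       # 2.1) down-sample
    smooth = gaussian_filter(small, sigma=sigma)                    # 2.2) Gaussian low-pass filter
    scale = [s / c for s, c in zip(band.shape, smooth.shape)]
    return zoom(smooth, scale, order=1)                             # 2.3) up-sample to original size
```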
Step 3: Perform the thin-cloud removal operation on each band of the original multispectral image Y1 and on the original panchromatic image respectively.
3.1) Compute the gray mean of each band image of the original multispectral image Y1 and the gray mean of the original panchromatic image Y2, i.e. sum the gray values of all pixels in the image and divide by the total number of pixels;
3.2) From each band of the original multispectral image Y1, subtract the corresponding band of the background image B1 and add the gray mean of the corresponding band, obtaining the cloud-removed multispectral image I1;
3.3) From the original panchromatic image Y2, subtract its background image B2 and add its gray mean, obtaining the cloud-removed panchromatic image I2.
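A corresponding one-band sketch of step 3, with illustrative names only; the background argument is the band's background image from step 2.

```python
import numpy as np

def remove_thin_cloud(band, background):
    """Step 3 sketch: subtract the band's background image and add the band's
    gray-level mean back, giving the thin-cloud-removed band."""
    band = band.astype(np.float64)
    return band - background + band.mean()   # 3.1)-3.3) subtract background, restore mean level
```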
Step 4: Apply the PCA transform to the cloud-removed multispectral image I1 to obtain the component images P1, P2, ..., Pn, where Pi denotes the i-th principal component obtained from the PCA transform of the multispectral image, i = 1, 2, ..., n, and n is the total number of components.
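A self-contained sketch of the PCA transform over the bands, and of its inverse (needed later in step 11), assuming the cloud-removed multispectral image is stored as a NumPy array of shape (n_bands, H, W). This is a plain eigen-decomposition of the band covariance matrix, written here for illustration; it is not the patent's own implementation.

```python
import numpy as np

def pca_forward(ms):
    """Step 4 sketch: PCA over the bands of `ms` with shape (n, H, W).
    Returns component images sorted by decreasing eigenvalue, plus the
    band means and eigenvector matrix needed to invert the transform."""
    n, h, w = ms.shape
    x = ms.reshape(n, -1).astype(np.float64)        # one row per band
    mean = x.mean(axis=1, keepdims=True)
    xc = x - mean
    cov = xc @ xc.T / xc.shape[1]                   # n x n band covariance matrix
    eigvals, eigvecs = np.linalg.eigh(cov)
    order = np.argsort(eigvals)[::-1]               # largest eigenvalue first
    eigvecs = eigvecs[:, order]
    pcs = (eigvecs.T @ xc).reshape(n, h, w)         # principal component images P1..Pn
    return pcs, mean, eigvecs

def pca_inverse(pcs, mean, eigvecs):
    """Inverse PCA: rebuild the band images from (possibly modified) components."""
    n = pcs.shape[0]
    x = eigvecs @ pcs.reshape(n, -1) + mean
    return x.reshape(pcs.shape)
```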
Step 5: Apply the Shearlet decomposition to the first principal component image P1 obtained from the PCA transform and to the cloud-removed panchromatic image I2 respectively, decomposing the first principal component image P1 into one low-frequency coefficient x1 and m direction sub-band coefficients y1, y2, ..., ym, and decomposing the cloud-removed panchromatic image I2 into one low-frequency coefficient x2 and m direction sub-band coefficients z1, z2, ..., zm.
The Shearlet transform is one of the new generation of multiscale geometric analysis tools; the concrete decomposition procedure is:
5.1) Perform scale decomposition of the image by the non-subsampled Laplacian pyramid transform, decomposing the first principal component image P1 and the panchromatic image I2 each into one low-frequency coefficient and several high-frequency coefficients, the number of high-frequency coefficients being equal to the number of decomposition levels. The number of scale-decomposition levels is not restricted; so that the low-frequency coefficient after decomposition has a higher correlation with the panchromatic image, this example sets the number of decomposition levels to 4;
5.2) Decompose each high-frequency coefficient into several directions by the shear filters. The choice of the number of directions is not restricted; considering computational time complexity and the needs of the algorithm, this example decomposes the 4 high-frequency coefficients, from coarse scale to fine scale, into 6, 6, 10 and 10 directions respectively, obtaining 22 direction sub-band coefficients in total.
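The Shearlet decomposition itself is not reproduced here. As a heavily simplified stand-in, the following sketch implements only the scale-decomposition part of step 5.1) as a non-subsampled, Laplacian-pyramid-like split built from differences of progressively stronger Gaussian filters; the directional split of each high-frequency coefficient by shear filters (step 5.2)) is omitted, and the base sigma is an assumption.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def nslp_decompose(image, levels=4, base_sigma=1.0):
    """Simplified stand-in for step 5.1): split an image into one low-frequency
    image and `levels` high-frequency (band-pass) images without subsampling.
    The original image equals the low-frequency image plus all high-frequency
    images, so the split is trivially invertible by summation."""
    image = image.astype(np.float64)
    highs, current, sigma = [], image, base_sigma
    for _ in range(levels):
        smooth = gaussian_filter(current, sigma=sigma)
        highs.append(current - smooth)        # band-pass detail at this scale
        current, sigma = smooth, sigma * 2    # coarser scale next
    return current, highs                     # low-frequency image, high-frequency images
```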
Step 6: From the background image B1 of the multispectral image obtained in step 2, compute the gray mean over all bands to obtain the synthetic background image B1' of the multispectral image.
The synthetic background image is obtained by the following formula:

B1'(p, q) = (1/N) · Σ_{k=1}^{N} B1^k(p, q)

where B1^k(p, q) denotes the gray value of the k-th band of the background image B1 at coordinate (p, q), B1'(p, q) denotes the gray value of the synthetic background image B1' at coordinate (p, q), and N is the number of bands of the multispectral image.
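In array terms this is simply a pixel-wise mean over the band axis; a short sketch, assuming B1 is stored as an (N, H, W) NumPy array:

```python
import numpy as np

def synthetic_background(bg_ms):
    """Step 6 sketch: B1'(p, q) = (1/N) * sum_k B1^k(p, q), i.e. the pixel-wise
    mean of the N band background images stored in bg_ms with shape (N, H, W)."""
    return bg_ms.mean(axis=0)
```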
Step 7: From the synthetic background image B1' of the multispectral image obtained above, build the weight matrix w1 of the synthetic background image B1'.
7.1) Set up a buffer matrix M1 for the synthetic background image B1', with initial value zero and the same size as the background image of the multispectral image, and assign the gray value at each position of the synthetic background image B1' of the multispectral image to the corresponding position of the buffer matrix M1;
7.2) Find the maximum and minimum of all values in the buffer matrix M1 of the synthetic background image B1', denoted Mmax1 and Mmin1 respectively;
7.3) From the maximum Mmax1 and the minimum Mmin1, compute the threshold T1 of the synthetic background image B1':

T1 = Mmin1 + 0.5 × (Mmax1 − Mmin1);
7.4) Compare the gray value of each pixel of the synthetic background image B1' of the multispectral image with the threshold T1; pixels whose gray value is greater than the threshold T1 form one class, whose region is denoted S11, and pixels whose gray value is less than the threshold T1 form the other class, whose region is denoted S12;
7.5) Compute the means of the gray values of all pixels in the regions S11 and S12 respectively, denoted A_S11 and A_S12, and compute the range-compression threshold R1 of the buffer matrix M1:

R1 = (A_S11 − A_S12) / (Mmax1 − Mmin1)    (B)
7.6) Normalize all values in the buffer matrix M1, obtaining a normalized matrix all of whose values lie in the interval [0, 1];
7.7) Multiply each value of the normalized matrix by the range-compression threshold R1 so that all values are compressed into the range [0, R1]; the compressed matrix is the weight matrix w1.
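Steps 7.1) to 7.7) can be sketched as follows; the same routine, applied to the panchromatic background image B2, yields w2 in step 8. Min-max normalization is assumed for step 7.6), and pixels exactly equal to the threshold are placed in the darker class; both points are assumptions, since the patent does not specify them.

```python
import numpy as np

def weight_matrix(background):
    """Sketch of steps 7.1)-7.7): derive a weight matrix from a background image."""
    m = background.astype(np.float64).copy()               # 7.1) buffer matrix M
    m_min, m_max = m.min(), m.max()                        # 7.2) minimum and maximum
    t = m_min + 0.5 * (m_max - m_min)                      # 7.3) threshold T
    bright, dark = m[m > t], m[m <= t]                     # 7.4) regions S11 and S12
    r = (bright.mean() - dark.mean()) / (m_max - m_min)    # 7.5) range-compression threshold R
    normalized = (m - m_min) / (m_max - m_min)             # 7.6) normalize into [0, 1]
    return normalized * r                                  # 7.7) compress into [0, R]
```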
Step 8: From the background image B2 of the panchromatic image, build the second weight matrix w2.
8.1) Set up a buffer matrix M2 for the background image B2 of the panchromatic image, with initial value zero and the same size as the background image of the panchromatic image, and assign the gray value at each position of the background image B2 to the corresponding position of its buffer matrix M2;
8.2) Find the maximum and minimum of all values in the buffer matrix M2 of the panchromatic background image B2, denoted Mmax2 and Mmin2 respectively;
8.3) From the maximum Mmax2 and the minimum Mmin2, compute the threshold T2 of the background image B2 of the panchromatic image:

T2 = Mmin2 + 0.5 × (Mmax2 − Mmin2);
8.4) Compare the gray value of each pixel of the background image B2 of the panchromatic image with the threshold T2; pixels whose gray value is greater than the threshold T2 form one class, whose region is denoted S21, and pixels whose gray value is less than the threshold T2 form the other class, whose region is denoted S22;
8.5) Compute the means of the gray values of all pixels in the regions S21 and S22 respectively, denoted A_S21 and A_S22, and compute the range-compression threshold R2 of the buffer matrix M2:

R2 = (A_S21 − A_S22) / (Mmax2 − Mmin2)    (B)
8.6) Normalize all values in the buffer matrix M2 of the background image B2 of the panchromatic image, obtaining a normalized matrix all of whose values lie in the interval [0, 1];
8.7) Multiply each value of the normalized matrix by the range-compression threshold R2 so that all values are compressed into the range [0, R2]; the compressed matrix is the weight matrix w2.
Step 9: Multiply each direction sub-band coefficient z1, z2, ..., zm of the panchromatic image I2 by the weight matrices w1 and w2, obtaining the direction sub-band coefficients l1, l2, ..., lm of the fused first component image, and assign the low-frequency coefficient x1 of the first principal component image to the low-frequency coefficient k of the fused first component image.
Step 10: Apply the inverse Shearlet transform to the low-frequency coefficient k and the direction sub-band coefficients l1, l2, ..., lm of the fused first component image, obtaining the fused first component image P1'.
Step 11: Combine the fused first component image P1' with the other component images P2, ..., Pn obtained in step 4 (all components except the first) to form a new data set, and apply the inverse PCA transform to this data set to obtain the fused image I, as shown in Fig. 3, where Fig. 3(a) is the fused image enhanced by the weight matrices and Fig. 3(b) is a partial enlargement of Fig. 3(a).
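Tying steps 9 to 11 back to the earlier sketches: the weighted panchromatic sub-bands supply the detail of the fused first component, the reconstruction here is the simple additive inverse of the stand-in decomposition sketched at step 5 (a true inverse Shearlet transform would be used in practice), and the inverse PCA is the pca_inverse routine from the step 4 sketch. All names remain illustrative, not part of the patent.

```python
def fuse_and_reconstruct(low1, dirs2, w1, w2, pcs, mean, eigvecs):
    """Sketch of steps 9-11 using the stand-in helpers from the earlier sketches."""
    fused_dirs = [w1 * w2 * d for d in dirs2]    # step 9: weighted panchromatic direction sub-bands
    fused_pc1 = low1 + sum(fused_dirs)           # step 10: additive stand-in for the inverse Shearlet
    pcs[0] = fused_pc1                           # step 11: replace the first component ...
    return pca_inverse(pcs, mean, eigvecs)       # ... and invert the PCA to obtain the fused image I
```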
The above description is only an example of the invention. It will be obvious to those skilled in the art that, after understanding the content and principle of the invention, modifications and changes in form and detail may be made without departing from the principle and structure of the invention; such modifications and changes based on the inventive concept still fall within the scope of protection of the claims of the invention.

Claims (6)

1. A method for fusing multispectral and panchromatic images with thin cloud, comprising the steps of:
1) up-sampling the multispectral image with thin cloud so that the size of the up-sampled multispectral image equals that of the panchromatic image with thin cloud;
2) performing, in turn, down-sampling, Gaussian low-pass filtering and up-sampling on the up-sampled multispectral image and the panchromatic image respectively, obtaining the background image B1 of the multispectral image and the background image B2 of the panchromatic image, the number of bands of the background image of the multispectral image being equal to the number of bands of the multispectral image;
3) performing the thin-cloud removal operation on each band of the multispectral image and on the panchromatic image respectively, obtaining the cloud-removed multispectral image I1 and the cloud-removed panchromatic image I2;
4) applying the PCA transform to the cloud-removed multispectral image I1, obtaining the component images P1, P2, ..., Pn, where Pi denotes the i-th principal component obtained from the PCA transform of the multispectral image, i = 1, 2, ..., n, and n is the total number of components;
5) applying the Shearlet decomposition to the first principal component image P1 obtained from the PCA transform and to the cloud-removed panchromatic image I2 respectively, decomposing the first principal component image P1 into one low-frequency coefficient x1 and several direction sub-band coefficients y1, y2, ..., ym, and decomposing the cloud-removed panchromatic image I2 into one low-frequency coefficient x2 and several direction sub-band coefficients z1, z2, ..., zm;
6) for the background image of each band of the multispectral image obtained in step 2), computing the average gray level to obtain the synthetic background image B1' of the multispectral image;
7) building the weight matrix w1 from the synthetic background image B1' of the multispectral image, and building the weight matrix w2 from the background image B2 of the panchromatic image;
8) multiplying each direction sub-band coefficient z1, z2, ..., zm of the panchromatic image I2 by the weight matrices w1 and w2, obtaining the direction sub-band coefficients l1, l2, ..., lm of the fused first component image, and taking the low-frequency coefficient x1 of the first principal component image as the low-frequency coefficient k of the fused first component image;
9) applying the inverse Shearlet transform to the low-frequency coefficient k and the direction sub-band coefficients l1, l2, ..., lm of the fused first component image, obtaining the fused first component image P1';
10) combining the fused first component image P1' with the other component images P2, ..., Pn obtained in step 4), excluding the first component, to form a new data set, and applying the inverse PCA transform to this data set to obtain the fused image I.
2. The method for fusing multispectral and panchromatic images with thin cloud according to claim 1, wherein the thin-cloud removal operation of step 3) performed on each band of the multispectral image and on the panchromatic image is carried out as follows:
2.1) computing the gray mean of each band image of the multispectral image and the gray mean of the panchromatic image, i.e. summing the gray values of all pixels in the image and dividing by the total number of pixels;
2.2) from each band of the multispectral image, subtracting the corresponding band of the background image B1 and adding the gray mean of the corresponding band, obtaining the cloud-removed multispectral image I1;
2.3) from the panchromatic image, subtracting its background image B2 and adding its gray mean, obtaining the cloud-removed panchromatic image I2.
3. The method for fusing multispectral and panchromatic images with thin cloud according to claim 1, wherein the Shearlet decomposition of step 5) performed on the first principal component image P1 and on the panchromatic image respectively is carried out as follows:
3.1) performing scale decomposition of the image by the non-subsampled Laplacian pyramid transform, decomposing the image into one low-frequency coefficient and several high-frequency coefficients, the number of high-frequency coefficients being equal to the number of decomposition levels, the invention setting the number of decomposition levels to 4;
3.2) decomposing each high-frequency coefficient into several directions by the shear filters, the invention decomposing the high-frequency coefficients, from coarse scale to fine scale, into 6, 6, 10 and 10 directions respectively, obtaining a plurality of direction sub-band coefficients.
4. The method for fusing multispectral and panchromatic images with thin cloud according to claim 1, wherein the synthetic background image B1' of the multispectral image in step 6), obtained from the background image of each band of the multispectral image, is obtained by the following formula:

B1'(p, q) = (1/N) · Σ_{k=1}^{N} B1^k(p, q)    (A)

where B1^k(p, q) denotes the gray value of the k-th band of the background image B1 at coordinate (p, q), B1'(p, q) denotes the gray value of the synthetic background image B1' at coordinate (p, q), and N is the number of bands of the multispectral image.
5. The method for fusing multispectral and panchromatic images with thin cloud according to claim 1, wherein the weight matrix w1 of the synthetic background image B1' in step 7), built from the synthetic background image B1' of the multispectral image, is built as follows:
5.1) setting up a buffer matrix M1 for the synthetic background image B1', with initial value zero and the same size as the background image of the multispectral image, and assigning the gray value at each position of the synthetic background image B1' of the multispectral image to the corresponding position of its buffer matrix M1;
5.2) finding the maximum and minimum of all values in the buffer matrix M1 of the synthetic background image B1', denoted Mmax1 and Mmin1 respectively;
5.3) from the maximum Mmax1 and the minimum Mmin1, computing the threshold T1 of the synthetic background image B1':

T1 = Mmin1 + 0.5 × (Mmax1 − Mmin1),

5.4) comparing the gray value of each pixel of the synthetic background image B1' of the multispectral image with the threshold T1, dividing the pixels whose gray value is greater than the threshold T1 into one class whose region is denoted S11, and dividing the pixels whose gray value is less than the threshold T1 into the other class whose region is denoted S12;
5.5) computing the means of the gray values of all pixels in the regions S11 and S12 respectively, denoted A_S11 and A_S12, and computing the range-compression threshold R1 of the buffer matrix M1 of the synthetic background image B1':

R1 = (A_S11 − A_S12) / (Mmax1 − Mmin1)    (B)

5.6) normalizing all values in said buffer matrix M1, obtaining a normalized matrix all of whose values lie in the interval [0, 1];
5.7) multiplying each value of the normalized matrix by its range-compression threshold R1 so that all values of the normalized matrix are compressed into the range [0, R1], the compressed matrix being the weight matrix w1.
6. The method for fusing multispectral and panchromatic images with thin cloud according to claim 1, wherein the weight matrix w2 of the background image B2 of the panchromatic image in step 7), built from the background image B2 of the panchromatic image, is built as follows:
6.1) setting up a buffer matrix M2 for the background image B2 of the panchromatic image, with initial value zero and the same size as the background image of the panchromatic image, and assigning the gray value at each position of the background image B2 to the corresponding position of its buffer matrix M2;
6.2) finding the maximum and minimum of all values in the buffer matrix M2 of the panchromatic background image B2, denoted Mmax2 and Mmin2 respectively;
6.3) from the maximum Mmax2 and the minimum Mmin2, computing the threshold T2 of the background image B2 of the panchromatic image:

T2 = Mmin2 + 0.5 × (Mmax2 − Mmin2);

6.4) comparing the gray value of each pixel of the background image B2 of the panchromatic image with the threshold T2, dividing the pixels whose gray value is greater than the threshold T2 into one class whose region is denoted S21, and dividing the pixels whose gray value is less than the threshold T2 into the other class whose region is denoted S22;
6.5) computing the means of the gray values of all pixels in the regions S21 and S22 respectively, denoted A_S21 and A_S22, and computing the range-compression threshold R2 of the buffer matrix M2 of the background image B2 of the panchromatic image:

R2 = (A_S21 − A_S22) / (Mmax2 − Mmin2)    (B)

6.6) normalizing all values in said buffer matrix M2, obtaining a normalized matrix all of whose values lie in the interval [0, 1];
6.7) multiplying each value of the normalized matrix by its range-compression threshold R2 so that all values of the normalized matrix are compressed into the range [0, R2], the compressed matrix being the weight matrix w2.
CN201310030819.8A 2013-01-27 2013-01-27 Method for fusing multispectral and full-color images with light cloud Active CN103049898B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201310030819.8A CN103049898B (en) 2013-01-27 2013-01-27 Method for fusing multispectral and full-color images with light cloud

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201310030819.8A CN103049898B (en) 2013-01-27 2013-01-27 Method for fusing multispectral and full-color images with light cloud

Publications (2)

Publication Number Publication Date
CN103049898A true CN103049898A (en) 2013-04-17
CN103049898B CN103049898B (en) 2015-04-22

Family

ID=48062528

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201310030819.8A Active CN103049898B (en) 2013-01-27 2013-01-27 Method for fusing multispectral and full-color images with light cloud

Country Status (1)

Country Link
CN (1) CN103049898B (en)

Cited By (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103295214A (en) * 2013-06-28 2013-09-11 深圳大学 Cloudless moderate resolution imaging spectroradiometer (MODIS) remote sensing image generation method and system based on color characteristics
CN103293168A (en) * 2013-05-28 2013-09-11 陕西科技大学 Fruit surface defect detection method based on visual saliency
CN103456018A (en) * 2013-09-08 2013-12-18 西安电子科技大学 Remote sensing image change detection method based on fusion and PCA kernel fuzzy clustering
CN103793883A (en) * 2013-12-11 2014-05-14 北京工业大学 Principal component analysis-based imaging spectral image super resolution restoration method
CN104517264A (en) * 2013-09-30 2015-04-15 华为终端有限公司 Image processing method and device
CN104616253A (en) * 2015-01-09 2015-05-13 电子科技大学 Light cloud removing method of optical remote sensing image utilizing independent component analysis technology
CN106127712A (en) * 2016-07-01 2016-11-16 上海联影医疗科技有限公司 Image enchancing method and device
CN107622479A (en) * 2017-09-04 2018-01-23 南京理工大学 A kind of profile marble band adaptive detailing method for implanting of the panchromatic sharpening of multispectral image
CN107958450A (en) * 2017-12-15 2018-04-24 武汉大学 Panchromatic multispectral image fusion method and system based on adaptive Gaussian mixture model
CN109118462A (en) * 2018-07-16 2019-01-01 中国科学院东北地理与农业生态研究所 A kind of remote sensing image fusing method
US10290108B2 (en) 2015-12-31 2019-05-14 Shanghai United Imaging Healthcare Co., Ltd. Methods and systems for image processing
CN111507454A (en) * 2019-01-30 2020-08-07 兰州交通大学 Improved cross cortical neural network model for remote sensing image fusion
CN112565756A (en) * 2020-11-26 2021-03-26 西安电子科技大学 Cloud-containing remote sensing image compression method based on quantization strategy
CN113436123A (en) * 2021-06-22 2021-09-24 宁波大学 High-resolution SAR and low-resolution multispectral image fusion method based on cloud removal-resolution improvement cooperation
CN113570536A (en) * 2021-07-31 2021-10-29 中国人民解放军61646部队 Panchromatic and multispectral image real-time fusion method based on CPU and GPU cooperative processing

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2008070544A2 (en) * 2006-12-01 2008-06-12 Harris Corporation Structured smoothing for superresolution of multispectral imagery based on registered panchromatic image
CN101359399A (en) * 2008-09-19 2009-02-04 常州工学院 Cloud-removing method for optical image
CN102509262A (en) * 2011-10-17 2012-06-20 中煤地航测遥感局有限公司 Method for removing thin cloud of remote sensing image
US20120269430A1 (en) * 2011-04-22 2012-10-25 Michael Paul Deskevich System and method for combining color information with spatial information in multispectral images

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2008070544A2 (en) * 2006-12-01 2008-06-12 Harris Corporation Structured smoothing for superresolution of multispectral imagery based on registered panchromatic image
CN101359399A (en) * 2008-09-19 2009-02-04 常州工学院 Cloud-removing method for optical image
US20120269430A1 (en) * 2011-04-22 2012-10-25 Michael Paul Deskevich System and method for combining color information with spatial information in multispectral images
CN102509262A (en) * 2011-10-17 2012-06-20 中煤地航测遥感局有限公司 Method for removing thin cloud of remote sensing image

Cited By (28)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103293168A (en) * 2013-05-28 2013-09-11 陕西科技大学 Fruit surface defect detection method based on visual saliency
CN103295214B (en) * 2013-06-28 2016-01-20 深圳大学 Cloudless MODIS remote sensing images based on color character generate method and system
CN103295214A (en) * 2013-06-28 2013-09-11 深圳大学 Cloudless moderate resolution imaging spectroradiometer (MODIS) remote sensing image generation method and system based on color characteristics
CN103456018A (en) * 2013-09-08 2013-12-18 西安电子科技大学 Remote sensing image change detection method based on fusion and PCA kernel fuzzy clustering
CN103456018B (en) * 2013-09-08 2017-01-18 西安电子科技大学 Remote sensing image change detection method based on fusion and PCA kernel fuzzy clustering
CN104517264B (en) * 2013-09-30 2018-03-16 华为终端(东莞)有限公司 Image processing method and device
CN104517264A (en) * 2013-09-30 2015-04-15 华为终端有限公司 Image processing method and device
CN103793883A (en) * 2013-12-11 2014-05-14 北京工业大学 Principal component analysis-based imaging spectral image super resolution restoration method
CN103793883B (en) * 2013-12-11 2016-11-09 北京工业大学 A kind of imaging spectrum Super-Resolution method based on principal component analysis
CN104616253A (en) * 2015-01-09 2015-05-13 电子科技大学 Light cloud removing method of optical remote sensing image utilizing independent component analysis technology
CN104616253B (en) * 2015-01-09 2017-05-10 电子科技大学 Light cloud removing method of optical remote sensing image utilizing independent component analysis technology
US11049254B2 (en) 2015-12-31 2021-06-29 Shanghai United Imaging Healthcare Co., Ltd. Methods and systems for image processing
US11880978B2 (en) 2015-12-31 2024-01-23 Shanghai United Imaging Healthcare Co., Ltd. Methods and systems for image processing
US10290108B2 (en) 2015-12-31 2019-05-14 Shanghai United Imaging Healthcare Co., Ltd. Methods and systems for image processing
CN106127712A (en) * 2016-07-01 2016-11-16 上海联影医疗科技有限公司 Image enchancing method and device
CN106127712B (en) * 2016-07-01 2020-03-31 上海联影医疗科技有限公司 Image enhancement method and device
CN107622479B (en) * 2017-09-04 2020-11-27 南京理工大学 Contour wave sub-band self-adaptive detail injection method for multi-spectral-band image panchromatic sharpening
CN107622479A (en) * 2017-09-04 2018-01-23 南京理工大学 A kind of profile marble band adaptive detailing method for implanting of the panchromatic sharpening of multispectral image
CN107958450B (en) * 2017-12-15 2021-05-04 武汉大学 Panchromatic multispectral image fusion method and system based on self-adaptive Gaussian filtering
CN107958450A (en) * 2017-12-15 2018-04-24 武汉大学 Panchromatic multispectral image fusion method and system based on adaptive Gaussian mixture model
CN109118462A (en) * 2018-07-16 2019-01-01 中国科学院东北地理与农业生态研究所 A kind of remote sensing image fusing method
CN111507454A (en) * 2019-01-30 2020-08-07 兰州交通大学 Improved cross cortical neural network model for remote sensing image fusion
CN111507454B (en) * 2019-01-30 2022-09-06 兰州交通大学 Improved cross cortical neural network model for remote sensing image fusion
CN112565756A (en) * 2020-11-26 2021-03-26 西安电子科技大学 Cloud-containing remote sensing image compression method based on quantization strategy
CN112565756B (en) * 2020-11-26 2021-09-03 西安电子科技大学 Cloud-containing remote sensing image compression method based on quantization strategy
CN113436123A (en) * 2021-06-22 2021-09-24 宁波大学 High-resolution SAR and low-resolution multispectral image fusion method based on cloud removal-resolution improvement cooperation
CN113570536A (en) * 2021-07-31 2021-10-29 中国人民解放军61646部队 Panchromatic and multispectral image real-time fusion method based on CPU and GPU cooperative processing
CN113570536B (en) * 2021-07-31 2022-02-01 中国人民解放军61646部队 Panchromatic and multispectral image real-time fusion method based on CPU and GPU cooperative processing

Also Published As

Publication number Publication date
CN103049898B (en) 2015-04-22

Similar Documents

Publication Publication Date Title
CN103049898B (en) Method for fusing multispectral and full-color images with light cloud
Kulkarni et al. Pixel level fusion techniques for SAR and optical images: A review
Wang et al. Fusion of Landsat 8 OLI and Sentinel-2 MSI data
KR101580585B1 (en) Method for data fusion of panchromatic and thermal-infrared images and Apparatus Thereof
Shahtahmassebi et al. Review of shadow detection and de-shadowing methods in remote sensing
Grohnfeldt et al. Jointly sparse fusion of hyperspectral and multispectral imagery
CN102509262B (en) Method for removing thin cloud of remote sensing image
CN102005037B (en) Multimodality image fusion method combining multi-scale bilateral filtering and direction filtering
CN103116881A (en) Remote sensing image fusion method based on PCA (principal component analysis) and Shearlet conversion
WO2019153746A1 (en) Geological linear body extraction method based on tensor voting coupled with hough transform
CN107610164A (en) A kind of No. four Image registration methods of high score based on multiple features mixing
CN104217426A (en) Object-oriented water-body extracting method based on ENVISAT ASAR and Landsat TM remote sensing data
CN105139396B (en) Full-automatic remote sensing image cloud and fog detection method
CN104537678A (en) Method for removing cloud and mist from single remote sensing image
CN114549385B (en) Optical and SAR image fusion cloud removing method based on depth dense residual error network
CN113793289A (en) Multi-spectral image and panchromatic image fuzzy fusion method based on CNN and NSCT
Cheng et al. Missing information reconstruction for single remote sensing images using structure-preserving global optimization
Yehia et al. Fusion of high-resolution SAR and optical imageries based on a wavelet transform and IHS integrated algorithm
Huang et al. Multi-feature combined for building shadow detection in GF-2 Images
Ahuja et al. A survey of satellite image enhancement techniques
CN106910178B (en) Multi-angle SAR image fusion method based on tone statistical characteristic classification
Xiao et al. A Novel Image Fusion Method for Water Body Extraction Based on Optimal Band Combination.
CN113436123B (en) High-resolution SAR and low-resolution multispectral image fusion method based on cloud removal-resolution improvement cooperation
CN113888421A (en) Fusion method of multispectral satellite remote sensing image
Huang et al. An improved variational method for hyperspectral image pansharpening with the constraint of spectral difference minimization

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant