CN112307901A - Landslide detection-oriented SAR and optical image fusion method and system - Google Patents

Landslide detection-oriented SAR and optical image fusion method and system

Info

Publication number
CN112307901A
CN112307901A (application CN202011045558.3A)
Authority
CN
China
Prior art keywords
image
sar
landslide
frequency
fusion
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011045558.3A
Other languages
Chinese (zh)
Inventor
Si Weiguo
Meng Qingxiang
Yuan Zhaoxiang
Xiang Feng
Duan Xuelin
Wang Zixiang
Hou Weihong
Liu Jian
Fang Jiong
Xu Xiaohua
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Wuhan University WHU
Electric Power Research Institute of State Grid Zhejiang Electric Power Co Ltd
Hangzhou Power Supply Co of State Grid Zhejiang Electric Power Co Ltd
State Grid Economic and Technological Research Institute
Original Assignee
Wuhan University WHU
Electric Power Research Institute of State Grid Zhejiang Electric Power Co Ltd
Hangzhou Power Supply Co of State Grid Zhejiang Electric Power Co Ltd
State Grid Economic and Technological Research Institute
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Wuhan University WHU, Electric Power Research Institute of State Grid Zhejiang Electric Power Co Ltd, Hangzhou Power Supply Co of State Grid Zhejiang Electric Power Co Ltd, State Grid Economic and Technological Research Institute filed Critical Wuhan University WHU
Priority to CN202011045558.3A priority Critical patent/CN112307901A/en
Publication of CN112307901A publication Critical patent/CN112307901A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00 Scenes; Scene-specific elements
    • G06V 20/10 Terrestrial scenes
    • G06V 20/13 Satellite images
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 Pattern recognition
    • G06F 18/20 Analysing
    • G06F 18/25 Fusion techniques
    • G06F 18/251 Fusion techniques of input or preprocessed data
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 Pattern recognition
    • G06F 18/20 Analysing
    • G06F 18/25 Fusion techniques
    • G06F 18/253 Fusion techniques of extracted features
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 Arrangements for image or video recognition or understanding
    • G06V 10/20 Image preprocessing
    • G06V 10/30 Noise filtering

Abstract

The invention discloses a landslide detection-oriented SAR and optical image fusion method. The method comprises the following steps: step 1, preprocessing the SAR image and the optical image; step 2, performing an IHS transform on the optical image to obtain the three components I, H and S; step 3, performing a stationary wavelet transform on the SAR image and the I component of the optical image, and fusing the high-frequency components by the maximum-energy rule; step 4, performing saliency detection on the low-frequency and high-frequency components of the SAR image and on the gray information of the image, establishing an SAR salient target detection area guide function, and partitioning the SAR image; step 5, establishing a salient region fusion rule and realizing image fusion according to the region-wise fusion strategy; and step 6, identifying and extracting landslide disaster information based on the fused image. The method adapts well to SAR and optical image fusion for landslide detection, and takes effective measures for structure preservation, noise removal and spectrum retention, achieving excellent results.

Description

Landslide detection-oriented SAR and optical image fusion method and system
Technical Field
The invention relates to the field of landslide hazard identification and monitoring and the field of multi-sensor remote sensing image data fusion, in particular to a landslide detection-oriented SAR and optical image fusion method and system.
Background
With the further development of remote sensing science and technology, the use of remote sensing in landslide disaster analysis and research has steadily deepened, and the massive acquisition of remote sensing data from sensors of various resolutions provides a sufficient data basis for landslide disaster identification. Applying remote sensing to the identification and study of landslide disasters not only helps prevent and mitigate landslides, but also provides adequate disaster information for relief and post-disaster reconstruction, minimizing the loss of life and property caused by disasters.
At present, most remote sensing research on landslide identification and detection, both in China and abroad, uses a single data source, either optical remote sensing images or SAR images; where fused images are analyzed, the fusion is limited to fusion between heterogeneous optical images or between multi-band, multi-polarization SAR images, and little research addresses landslide detection with fused SAR and optical data. Optical remote sensing images of various spatial resolutions provide not only spectral information but also texture, geometric shape and other information about ground objects, which supports accurate landslide identification. Thanks to its all-day, all-weather imaging capability, synthetic aperture radar (SAR) is more effective than optical sensors under complex environments and meteorological conditions of poor visibility, and is especially suited to acquiring ground object information in cloudy and foggy areas, rapidly assessing sudden disasters (such as floods, earthquakes and landslides), and monitoring crustal deformation. However, the SAR image is a coherent, slant-range projection, and differs greatly from visible-light images in imaging mechanism, radiometric characteristics and geometric characteristics. Applying SAR and optical image fusion to ground object interpretation and landslide detection is therefore very difficult, its mapping applications are demanding, and a large research space remains. The main open problems are:
First, spectral distortion can occur during fusion of the optical and SAR images.
Second, it is difficult to obtain high-quality landslide feature information from the SAR image.
Third, the computational load of optical and SAR image fusion is large, and improving the speed of the fusion algorithm remains a major issue in current research.
Disclosure of Invention
The technical problem to be solved by the present invention is to provide a method and a system for fusing SAR and optical images for landslide detection, aiming at the defects existing in the prior art.
Therefore, the invention adopts the following technical scheme: a landslide detection-oriented SAR and optical image fusion method comprises the following steps:
step 1, preprocessing an SAR image and an optical image;
step 2, performing an IHS transform on the optical image to obtain the three components I, H and S;
step 3, performing a stationary wavelet transform on the SAR image and the I component of the optical image, and fusing the high-frequency components by the maximum-energy rule;
step 4, performing landslide feature detection on the low-frequency and high-frequency components of the SAR image and on the gray-level information of the image, establishing an SAR landslide target detection area function, and partitioning the SAR image;
step 5, establishing a landslide feature region fusion rule, and realizing image fusion according to the region-wise fusion strategy;
and step 6, identifying and extracting landslide disaster information based on the fused image.
Further, the specific steps of the maximum-energy fusion of the high-frequency component energy in step 3 are as follows:
Step 31, calculating the high-frequency component energy:
The input SAR image and the I component of the optical image are decomposed by the stationary wavelet transform to obtain 4 groups of low-frequency and high-frequency information: $A_{s,j}(x,y)$, $A_{v,j}(x,y)$, $D^{\epsilon}_{s,j}(x,y)$ and $D^{\epsilon}_{v,j}(x,y)$, wherein $A_{s,j}(x,y)$ and $A_{v,j}(x,y)$ respectively denote the low-frequency information obtained from the $j$-th decomposition of the SAR image and of the optical image, and $D^{\epsilon}_{s,j}(x,y)$ and $D^{\epsilon}_{v,j}(x,y)$ respectively denote the high-frequency information obtained from the $j$-th decomposition in the $\epsilon$ direction; the directivity $\epsilon$ is denoted by a number: $\epsilon=1$ denotes decomposition in the horizontal direction, $\epsilon=2$ in the vertical direction, $\epsilon=3$ in the diagonal direction; $j$ is the number of decompositions;
Step 32, the maximum-energy fusion method:
Step 321, from the high-frequency information $D^{\epsilon}_{s,j}(x,y)$ and $D^{\epsilon}_{v,j}(x,y)$ of the two groups of images, the high-frequency neighborhood energies $E^{\epsilon}_{s,j}(x,y)$ and $E^{\epsilon}_{v,j}(x,y)$ are respectively calculated over a local window $W$:

$$E^{\epsilon}_{s,j}(x,y)=\sum_{(m,n)\in W}\left[D^{\epsilon}_{s,j}(x+m,\,y+n)\right]^{2}$$

$$E^{\epsilon}_{v,j}(x,y)=\sum_{(m,n)\in W}\left[D^{\epsilon}_{v,j}(x+m,\,y+n)\right]^{2}$$

wherein $E^{\epsilon}_{s,j}(x,y)$ denotes the high-frequency neighborhood energy of the SAR image in the $\epsilon$ direction at the $j$-th decomposition, and $E^{\epsilon}_{v,j}(x,y)$ that of the optical image;
Step 322, by analyzing the neighborhood energy, the wavelet coefficients with salient energy change are selected to form the new high-frequency wavelet coefficients that participate in the back-end reconstruction; the take-the-larger-energy strategy is:

$$D^{\epsilon}_{F,j}(x,y)=\begin{cases}D^{\epsilon}_{s,j}(x,y), & E^{\epsilon}_{s,j}(x,y)\ge E^{\epsilon}_{v,j}(x,y)\\ D^{\epsilon}_{v,j}(x,y), & \text{otherwise}\end{cases}$$

wherein $D^{\epsilon}_{F,j}(x,y)$ is the high-frequency information of the fusion result.
Further, the SAR landslide target detection area function established in step 4 is:

$$S'(x,y)=\left[S_{E}(x,y)+S_{A}(x,y)+S_{N}(x,y)\right]^{\beta}$$

$$S(x,y)=\frac{S'(x,y)-\min S'}{\max S'-\min S'}$$

wherein $\beta$ is an adjusting parameter used to highlight the landslide feature target area; $S'(x,y)$ is the intermediate quantity of the SAR image landslide feature function; $S(x,y)$ is the final SAR image landslide feature function, normalized to $[0,1]$; $S_{A}(x,y)$ is the low-frequency salient feature; $S_{E}(x,y)$ is the high-frequency salient feature; and $S_{N}(x,y)$ is the landslide feature function of the dark features in the SAR image.
Further, the process of establishing the SAR landslide target detection area function in step 4 is as follows:
Step 41, target analysis of the SAR image:
The magnitudes of the wavelet high-frequency coefficients of the SAR image represent the parts of the image with large fluctuation, and these parts are the landslide feature salient regions of the SAR image;
Step 411, the input SAR image $f_{s}(x,y)$ is subjected to a 3-level stationary wavelet decomposition (SWT), producing 4 groups of low-frequency information $A_{j}(x,y)$ and high-frequency information $D^{\epsilon}_{j}(x,y)$, wherein $A_{j}(x,y)$ denotes the low-frequency information obtained from the $j$-th decomposition of the SAR image, and $D^{\epsilon}_{j}(x,y)$ denotes the high-frequency information obtained from the $j$-th decomposition in the $\epsilon$ direction; here the directivity $\epsilon$ is denoted by a letter: $\epsilon=h$ denotes decomposition in the horizontal direction, $\epsilon=v$ in the vertical direction, $\epsilon=d$ in the diagonal direction; $j$ is the number of decompositions;
Step 412, from the high-frequency information $D^{\epsilon}_{j}(x,y)$, the high-frequency detail intensity information $E_{s}(x,y)$ of the SAR image is calculated:

$$E_{s}(x,y)=\sum_{j}\sum_{\epsilon\in\{h,v,d\}}\left|D^{\epsilon}_{j}(x,y)\right|$$

wherein $|\cdot|$ denotes the absolute value;
By normalizing the high-frequency energy and the low-frequency background part, the standard high-frequency intensity combination information $\bar{E}_{s}(x,y)$ and the low-frequency information $\bar{A}_{j}(x,y)$ are obtained; performing landslide feature extraction on them yields the high-frequency and low-frequency salient features $S_{E}(x,y)$ and $S_{A}(x,y)$;
Step 413, let $f_{s}(x,y)$ be the $m\times n$ low-frequency data obtained from the 3-level SWT of the SAR image after 0-1 normalization, and let $f_{t}(x,y)$ be the all-ones filling function of size $m\times n$; the landslide feature function $S_{N}(x,y)$ of the dark features in the SAR image is then:

$$S_{N}(x,y)=\left[f_{t}(x,y)-f_{s}(x,y)\right]^{\alpha}$$

wherein $\alpha$ is a filtering parameter used to weaken the influence of non-target areas on the feature function;
Step 42, establishing the landslide target area of the SAR image:
The final SAR image landslide feature function $S(x,y)$ is obtained by the weighted combination of the landslide feature functions obtained above:

$$S'(x,y)=\left[S_{E}(x,y)+S_{A}(x,y)+S_{N}(x,y)\right]^{\beta}$$

$$S(x,y)=\frac{S'(x,y)-\min S'}{\max S'-\min S'}$$

wherein $\beta$ is the adjusting parameter used to highlight the landslide feature region, and the resulting $S(x,y)$ is normalized to $[0,1]$.
Further, the specific content of the landslide feature region fusion rule established in step 5 is as follows:
The high-frequency feature function $S(x,y)$ is substituted into the fusion rule, which is finally expressed, for each channel $X\in\{R,G,B\}$, as:

$$f_{X/F}(x,y)=f_{X/C}(x,y)+S(x,y)\left[f_{Ir}(x,y)-\lambda f_{I}(x,y)\right]$$

wherein $f_{R/C}$, $f_{G/C}$ and $f_{B/C}$ respectively denote the R, G and B channels of the optical image; $f_{R/F}$, $f_{G/F}$ and $f_{B/F}$ respectively denote the R, G and B channels of the fused image; $f_{Ir}$ is the gray-level fusion result of the SAR image and the I component of the color visible-light image under the stationary wavelet transform framework; $f_{I}$ is the brightness component of the visible-light image; and $\lambda$ is the ratio of the mean of the gray-level fusion result $f_{Ir}$ to the mean of the visible-light image brightness $f_{I}$, used to eliminate the influence of redundant base colors on the brightness of the fusion result:

$$\lambda=\frac{\mathrm{mean}(f_{Ir})}{\mathrm{mean}(f_{I})}$$

where $\mathrm{mean}(\cdot)$ denotes the arithmetic mean over the image.
The other technical scheme adopted by the invention is as follows: a landslide detection-oriented SAR and optical image fusion system comprises:
the image preprocessing unit is used for preprocessing the SAR image and the optical image;
the IHS transform unit is used for performing an IHS transform on the optical image to obtain the three components I, H and S;
the maximum-energy fusion unit is used for performing a stationary wavelet transform on the SAR image and the I component of the optical image and fusing the high-frequency components by the maximum-energy rule;
the SAR landslide target detection area function establishing unit is used for performing landslide feature detection on the low-frequency and high-frequency components of the SAR image and on the gray-level information of the image, establishing the SAR landslide target detection area function, and partitioning the SAR image;
the image fusion unit is used for establishing the landslide feature region fusion rule and realizing image fusion according to the region-wise fusion strategy;
and the landslide disaster information extraction unit is used for identifying and extracting landslide disaster information based on the fused image.
The invention has the beneficial effects that:
1. A landslide feature function is constructed, so that landslide targets in the image can be judged and analyzed accurately.
The fusion of the SAR image and the visible-light image is studied, a landslide-target-based wavelet maximum-energy fusion algorithm is proposed for image fusion, and the fusion strategy is improved at the pixel level and the feature level. Because the landslide feature function is insensitive to target noise, the same strategy is chosen for the pixel image and the feature image: the decomposed low-frequency wavelet coefficients are fused by weighted averaging, while the high-frequency coefficients are fused by the neighborhood-energy maximum rule, which makes the fused image and its salient features more definite. On this basis, the obtained fused image is separated into background and detail, the detail is screened with the salient information obtained in fusion, and the noise in the fused image is weakened without affecting the background contour information.
2. A landslide feature detection area function is constructed, so that the landslide feature areas of the SAR image can be delineated correctly.
From the angle of wavelet multi-resolution analysis, comprehensive landslide target analysis is performed on the SAR image to obtain the landslide feature detection area function, so that the spatial area of the image can be divided into target areas that contain landslide features and areas that do not.
3. The landslide feature region fusion algorithm effectively distinguishes spectrum, detail and noise according to the differences between feature regions.
For the feature regions, the necessity that salient features must enter the fused image is considered, together with the problem that dark features, easily ignored in fusion, are easily lost, and the relation between the dark features and the noisy background at the pixel level is analyzed; different weighting coefficients are assigned to different feature regions according to their attributes, the contradiction between detail and spectrum is handled by compromise while further addressing the influence of noise on the spectrum, and the validity and originality of the spectrum are retained.
In conclusion, the method is reliable and practical; it adapts well to SAR and optical image fusion for landslide detection, takes effective measures for structure preservation, noise removal and spectrum retention, achieves excellent results, and has good practicability and feasibility.
Drawings
The invention will be further described with reference to the accompanying drawings and examples, in which:
FIG. 1 is a flow chart of a method of the present invention;
FIG. 2 is a basic flow diagram of the image registration of the present invention;
FIG. 3 is a flow chart of landslide feature detection denoising fusion of the present invention;
FIG. 4 is a flow chart of the generation of the landslide target detection area function of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
Example 1
As shown in FIG. 1, the landslide detection-oriented SAR and optical image fusion method of this embodiment of the present invention comprises the following steps:
step 1, preprocessing the SAR image and the optical image;
step 2, performing an IHS transform on the optical image to obtain the three components I, H and S;
step 3, performing a stationary wavelet transform on the SAR image and the I component of the optical image, and fusing the high-frequency components by the maximum-energy rule;
step 4, performing landslide feature detection on the low-frequency and high-frequency components of the SAR image and on the gray-level information of the image, establishing the SAR landslide target detection area function, and partitioning the SAR image;
step 5, establishing the landslide feature region fusion rule, and realizing image fusion according to the region-wise fusion strategy;
and step 6, identifying and extracting landslide disaster information based on the fused image.
Further, the specific steps of preprocessing the SAR image in step 1 are as follows:
Step 11, selecting the diffusion function:
According to the selection criteria for the diffusion function in the anisotropic diffusion model, and combining the analysis of the properties, graphs and diffusion strength of candidate functions, the function $g(x)=\frac{1}{1+(x/k)^{2}}$ and the classical diffusion function $g(x)=\exp\left[-(x/k)^{2}\right]$ are selected as the diffusion functions of the anisotropic diffusion model.
Step 12, adopting the following diffusion model:
In the diffusion function of the model, the scale parameter $k$ is replaced by the local gradient mean $GM$, where $x$ is the gradient magnitude in the 4 directions $I_{N}$, $I_{S}$, $I_{E}$ and $I_{W}$, and $GM$ is the mean of the local image gradient change values:

$$GM=\frac{1}{4}\left(\left|I_{N}\right|+\left|I_{S}\right|+\left|I_{E}\right|+\left|I_{W}\right|\right)$$
Step 13, using the relative signal-to-noise ratio as the iteration termination condition:
The relative signal-to-noise ratio $RSNR_{k}$ is computed over the whole image area $\Omega$ from $I_{k}$, the result of filtering the image $k$ times, and $I_{k+1}$, the result of filtering it $k+1$ times; the iteration terminates when

$$\frac{\left|RSNR_{k+1}-RSNR_{k}\right|}{RSNR_{k}}\le\varepsilon$$

wherein $\varepsilon$ is a threshold selected in advance, generally $\varepsilon\ge 0.01$.
When the iteration terminates, an image with relatively little noise content is obtained, maximizing the noise-removal capability.
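To make the preprocessing concrete, the following Python sketch implements an anisotropic diffusion filter of the Perona-Malik type with the classical diffusion function exp[-(x/k)^2] and a relative-SNR stopping test; the fixed scale parameter k, the periodic border handling and the exact RSNR definition are simplifying assumptions for illustration, not the patent's exact formulas.

```python
# Sketch of anisotropic diffusion speckle filtering with an RSNR-style
# stopping rule. The diffusion function follows the classical
# Perona-Malik form; the RSNR definition below is an assumption.
import numpy as np

def anisotropic_diffusion(img, k=0.1, lam=0.2, eps=0.01, max_iter=50):
    I = img.astype(np.float64)
    rsnr_prev = None
    for _ in range(max_iter):
        # Gradients in the 4 directions N, S, E, W (periodic borders via roll).
        gN = np.roll(I, 1, axis=0) - I
        gS = np.roll(I, -1, axis=0) - I
        gE = np.roll(I, -1, axis=1) - I
        gW = np.roll(I, 1, axis=1) - I
        # Classical diffusion function exp[-(x/k)^2] applied per direction.
        I_new = I + lam * (np.exp(-(gN / k) ** 2) * gN +
                           np.exp(-(gS / k) ** 2) * gS +
                           np.exp(-(gE / k) ** 2) * gE +
                           np.exp(-(gW / k) ** 2) * gW)
        # Assumed relative SNR between successive filtering results.
        rsnr = (I_new ** 2).sum() / (((I_new - I) ** 2).sum() + 1e-12)
        I = I_new
        if rsnr_prev is not None and abs(rsnr - rsnr_prev) / rsnr_prev <= eps:
            break
        rsnr_prev = rsnr
    return I
```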
Further, the specific steps of preprocessing the optical image in step 1 are as follows:
Step 14, correction processing:
Optical images purchased from a satellite ground station have generally undergone coarse processing, so the user usually only needs to perform fine processing, i.e., geometric correction of the images. Three schemes exist for geometric correction: systematic correction, correction using control points, and hybrid correction, briefly described as follows:
Step 141, systematic correction substitutes measured values such as the calibration data of the remote sensing sensor, the position of the sensor and the satellite attitude into a theoretical correction formula to correct the geometric distortion.
Step 142, control point correction uses corresponding points between the deformed remote sensing image and a standard map, i.e., control point data pairs, to approximately describe the geometric deformation process of the remote sensing image with a mathematical model; the geometric distortion model is solved from the ground control points, and the image is then geometrically corrected.
Step 143, hybrid correction is a staged scheme: it first performs systematic geometric correction with the theoretical distortion formula, and then further corrects the geometric distortion using ground control points.
The present invention uses the currently common hybrid correction scheme.
Step 15, image enhancement:
Common image enhancement methods fall into three types: linear transformation, histogram equalization, and nonlinear (piecewise linear) transformation. They operate as follows:
Step 151, linear transformation expands the brightness value range of the image to the entire display range, so that light-tone areas of the resulting image appear lighter and dark-tone areas appear darker; similar image data values are thereby displayed as distinguishable tones.
Step 152, histogram equalization expands the contrast of the middle-brightness areas of the image while compressing the contrast of the high-brightness and low-brightness areas at the two ends of the original image's histogram.
Step 153, nonlinear transformation divides the brightness range of a band into several sections and applies linear transformations of different degrees per section, with different weights determined by reference to the histogram. This enhances the contrast of target features within specific brightness ranges, such as the contrast between rivers, roads, farmland and vegetation, to facilitate interpretation.
Further, the specific steps of geometrically registering the SAR and the optical image in step 1 are as follows:
Step 16, definition: image registration is the process of obtaining the coordinate transformation parameters between images according to a similarity measurement criterion, so that two or more images of the same region acquired by different sensors, from different viewing angles, or at different times can be transformed into the same coordinate system. Mathematically, image registration can be expressed as:

$$I_{1}(x_{1},y_{1})=g\left(I_{2}\left(T(x_{2},y_{2})\right)\right)$$

wherein:
$(x_{1},y_{1})$ are the coordinates of a pixel point in the reference image;
$(x_{2},y_{2})$ are the coordinates of a pixel point in the unregistered image;
$I_{1}(x_{1},y_{1})$ is the pixel gray (brightness) value in the reference image;
$I_{2}(x_{2},y_{2})$ is the pixel gray (brightness) value in the unregistered image;
$g$ is a one-dimensional gray-scale transformation between the images;
$T$ is a two-dimensional spatial coordinate transformation between the images.
The process of determining $T$ is the registration process.
Step 17, the general registration procedure is as follows: after the multi-sensor data have undergone strict geometric correction and the systematic errors have been removed, the images are projected onto the same ground coordinate system; a small number of control points are then selected on each sensor image, and accurate registration is achieved through several processing steps, such as automatic selection of feature points or computation of the similarity between feature points, coarse estimation of the registration point positions, accurate determination of the registration points, and estimation of the registration transformation parameters. Image registration can be seen as a combination of the following elements:
Feature space: the set of features extracted from the reference image and the input image for matching;
Search space: the set of possible transformations for establishing the correspondence between the input features and the reference features;
Search strategy: used to select, during processing, the transformation model to be computed so that the matching is stepped up to the accuracy requirement;
Similarity metric: used to evaluate the match between the input data, transformed by a candidate transformation from the search space, and the reference data.
The feature-based image matching method adopted by the invention takes salient features extracted from the image gray levels as matching primitives; the features used for matching are usually points, lines, regions and the like. The algorithm mainly comprises two steps: feature extraction and feature matching. Before feature matching, features with obvious gray-level changes, such as points, lines and regions, are first extracted from the images to be matched to form feature sets. A feature matching algorithm then selects as many feature pairs with a matching relation as possible from the feature sets of the images. Non-feature pixel points are processed by interpolation and similar methods to compute the corresponding matching relations, thereby achieving pixel-by-pixel registration between the images.
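A compact sketch of such a feature-based registration follows, using OpenCV ORB key points, brute-force matching and a RANSAC homography as stand-ins for the generic feature extraction and feature matching steps; the detector choice and thresholds are assumptions, and real SAR-to-optical pairs typically need more robust features.

```python
# Sketch of feature-based registration: extract salient point features,
# match them, robustly estimate the spatial transform T, and resample.
import cv2
import numpy as np

def register(unregistered, reference):
    orb = cv2.ORB_create(4000)
    k1, d1 = orb.detectAndCompute(unregistered, None)
    k2, d2 = orb.detectAndCompute(reference, None)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(d1, d2), key=lambda m: m.distance)
    src = np.float32([k1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([k2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    # T: two-dimensional coordinate transform between the images; RANSAC
    # rejects mismatched feature pairs.
    T, _ = cv2.findHomography(src, dst, cv2.RANSAC, 3.0)
    h, w = reference.shape[:2]
    return cv2.warpPerspective(unregistered, T, (w, h))
```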
Further, the specific steps of performing the IHS transform on the optical image in step 2 are:
Step 21, the IHS space contains three components: intensity (I), hue (H) and saturation (S). The IHS space can be derived from the RGB color space by a transformation, which is not a simple linear relation; the correspondence between the IHS model and the RGB model can be obtained within a mathematical calculation range.
The calculation from the RGB space to the IHS space is:

$$I=\frac{R+G+B}{3}$$

$$S=1-\frac{3\min(R,G,B)}{R+G+B}$$

$$\theta=\arccos\left\{\frac{\frac{1}{2}\left[(R-G)+(R-B)\right]}{\sqrt{(R-G)^{2}+(R-B)(G-B)}}\right\}$$

$$H=\begin{cases}\theta, & B\le G\\ 360^{\circ}-\theta, & B>G\end{cases}$$

The color ranges represented by different values of the H component differ: 0° is red, 120° is green and 240° is blue, so hue values from 0° to 240° cover all colors of the visible spectrum. The conversion from the IHS cylindrical space model back to the RGB space model must be considered in three cases:
Step 211, when $0^{\circ}\le H<120^{\circ}$, the hue region falls mainly on the R hue, and the IHS model corresponds to the RGB model as:

$$B=I(1-S)$$

$$R=I\left[1+\frac{S\cos H}{\cos(60^{\circ}-H)}\right]$$

$$G=3I-(R+B)$$

Step 212, when $120^{\circ}\le H<240^{\circ}$, the hue region falls mainly on the G hue; with $H'=H-120^{\circ}$:

$$R=I(1-S)$$

$$G=I\left[1+\frac{S\cos H'}{\cos(60^{\circ}-H')}\right]$$

$$B=3I-(R+G)$$

Step 213, when $240^{\circ}\le H<360^{\circ}$, the hue region falls mainly on the B hue; with $H'=H-240^{\circ}$:

$$G=I(1-S)$$

$$B=I\left[1+\frac{S\cos H'}{\cos(60^{\circ}-H')}\right]$$

$$R=3I-(G+B)$$
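A numpy sketch of the forward RGB-to-IHS step under the arccos-based model written out above, assuming R, G and B normalized to [0, 1]; the inverse follows the three hue-sector cases of steps 211-213.

```python
# Sketch of the forward RGB -> IHS transform (arccos-based model).
import numpy as np

def rgb_to_ihs(rgb):
    """rgb: float array of shape (..., 3) with values in [0, 1]."""
    R, G, B = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    I = (R + G + B) / 3.0
    S = 1.0 - 3.0 * np.minimum(np.minimum(R, G), B) / (R + G + B + 1e-12)
    num = 0.5 * ((R - G) + (R - B))
    den = np.sqrt((R - G) ** 2 + (R - B) * (G - B)) + 1e-12
    theta = np.degrees(np.arccos(np.clip(num / den, -1.0, 1.0)))
    # 0 deg = red, 120 deg = green, 240 deg = blue.
    H = np.where(B <= G, theta, 360.0 - theta)
    return I, H, S
```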
further, the specific steps of the stationary wavelet transform in step 3 are as follows:
step 31, insert 2 in filter coefficients of the j-th layer decompositionj-1Blank data (i.e., padding 0) to increase the length of the filter, and extension of the filter to achieve multi-scale analysis of the image, namely:
Figure BDA00027078409000001310
Figure BDA00027078409000001311
wherein { hkDenotes the low pass filter coefficients; { gkThe band pass filter coefficients are denoted.
Step 32, after the stationary wavelet transform, the size of the obtained new data is not changed, and the SWT (stationary wavelet decomposition) has certain redundancy. Using the SWT algorithm as image analysis, performing convolution filtering on two dimensional directions of an image, and obtaining a new image which can be expressed as:
Figure BDA0002707840900000141
Figure BDA0002707840900000142
Figure BDA0002707840900000143
Figure BDA0002707840900000144
wherein A isx,jRepresents the low frequency component of the j-th decomposition,
Figure BDA0002707840900000145
high frequency components in the horizontal (denoted by lower case h), vertical (denoted by lower case v), and diagonal (denoted by lower case d) directions of the jth decomposition are respectively represented.
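In practice the redundant decomposition above is available off the shelf; a minimal PyWavelets usage sketch follows (swt2 requires the image side lengths to be divisible by 2^level).

```python
# Sketch: 3-level stationary (undecimated) wavelet decomposition with
# PyWavelets; every subband keeps the original image size (redundancy).
import numpy as np
import pywt

img = np.random.rand(256, 256)  # stand-in image; sides divisible by 2**3
coeffs = pywt.swt2(img, wavelet="db2", level=3)
# Each level yields (A_j, (D_h, D_v, D_d)): the low-frequency component
# plus the horizontal, vertical and diagonal high-frequency components.
for A, (Dh, Dv, Dd) in coeffs:
    assert A.shape == img.shape  # no downsampling between levels
recon = pywt.iswt2(coeffs, wavelet="db2")
assert np.allclose(recon, img)   # perfect reconstruction
```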
Further, the specific method of the maximum-energy fusion of the high-frequency component energy in step 3 is as follows:
Step 33, high-frequency component energy calculation:
The input SAR image and the I component of the optical image are processed by the stationary wavelet decomposition (SWT) to obtain 4 groups of low-frequency and high-frequency information: $A_{s,j}(x,y)$, $A_{v,j}(x,y)$, $D^{\epsilon}_{s,j}(x,y)$ and $D^{\epsilon}_{v,j}(x,y)$, wherein $A_{s,j}(x,y)$ and $A_{v,j}(x,y)$ respectively denote the low-frequency information obtained from the $j$-th decomposition of the SAR image and of the optical image, and $D^{\epsilon}_{s,j}(x,y)$ and $D^{\epsilon}_{v,j}(x,y)$ respectively denote the high-frequency information obtained from the $j$-th decomposition in the $\epsilon$ direction. The directivity $\epsilon$ is denoted by a number: $\epsilon=1$ denotes decomposition in the horizontal direction, $\epsilon=2$ in the vertical direction, $\epsilon=3$ in the diagonal direction; $j$ denotes the number of decompositions.
Step 34, the maximum-energy fusion method:
Step 341, from the high-frequency information sub-images $D^{\epsilon}_{s,j}(x,y)$ and $D^{\epsilon}_{v,j}(x,y)$ of the two groups of images, the high-frequency neighborhood energies $E^{\epsilon}_{s,j}(x,y)$ and $E^{\epsilon}_{v,j}(x,y)$ are computed. The energy convolution kernel usually takes an odd number as its side length; the invention selects a $3\times 3$ window:

$$E^{\epsilon}_{s,j}(x,y)=\sum_{m=-1}^{1}\sum_{n=-1}^{1}\left[D^{\epsilon}_{s,j}(x+m,\,y+n)\right]^{2}$$

$$E^{\epsilon}_{v,j}(x,y)=\sum_{m=-1}^{1}\sum_{n=-1}^{1}\left[D^{\epsilon}_{v,j}(x+m,\,y+n)\right]^{2}$$

wherein $E^{\epsilon}_{s,j}(x,y)$ denotes the high-frequency neighborhood energy of the SAR image in the $\epsilon$ direction at the $j$-th decomposition, and $E^{\epsilon}_{v,j}(x,y)$ that of the optical image.
Step 342, by analyzing the neighborhood energy, the wavelet coefficients with salient energy change are selected to form the new high-frequency wavelet coefficients that participate in the back-end reconstruction; the take-the-larger-energy strategy is:

$$D^{\epsilon}_{F,j}(x,y)=\begin{cases}D^{\epsilon}_{s,j}(x,y), & E^{\epsilon}_{s,j}(x,y)\ge E^{\epsilon}_{v,j}(x,y)\\ D^{\epsilon}_{v,j}(x,y), & \text{otherwise}\end{cases}$$

wherein $D^{\epsilon}_{F,j}(x,y)$ is the high-frequency information of the fusion result.
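Combining steps 33 and 34, the sketch below implements the maximum-energy fusion under stated assumptions: a db2 wavelet, an equal-weight average of the low-frequency coefficients (the embodiments only say "weighted average"), and the 3x3 window sum computed as uniform_filter times 9.

```python
# Sketch of SWT maximum-energy fusion of the SAR image and the I component.
import numpy as np
import pywt
from scipy.ndimage import uniform_filter

def swt_max_energy_fusion(sar, i_comp, wavelet="db2", level=3):
    cs = pywt.swt2(sar, wavelet, level)     # SAR: (A_s, (D^1, D^2, D^3)) per level
    cv = pywt.swt2(i_comp, wavelet, level)  # optical I component, same layout
    fused = []
    for (As, Ds), (Av, Dv) in zip(cs, cv):
        Af = 0.5 * (As + Av)  # low frequency: equal-weight average (assumed)
        Df = []
        for Dse, Dve in zip(Ds, Dv):
            # 3x3 neighborhood energy: sum of squared coefficients.
            Es = uniform_filter(Dse * Dse, size=3) * 9.0
            Ev = uniform_filter(Dve * Dve, size=3) * 9.0
            # Take-the-larger-energy selection of high-frequency coefficients.
            Df.append(np.where(Es >= Ev, Dse, Dve))
        fused.append((Af, tuple(Df)))
    return pywt.iswt2(fused, wavelet)
```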
Further, the specific method of establishing the landslide feature detection area function in step 4 is as follows:
Step 41, target analysis of the SAR image:
According to the wavelet characteristics, the wavelet transform can process and analyze images comprehensively and from multiple angles through multi-resolution analysis. The magnitudes of the wavelet high-frequency coefficients of the SAR image represent the parts of the image with large fluctuation, and these parts are usually the landslide feature salient regions of the SAR image.
Step 411, the input SAR image $f_{s}(x,y)$ is subjected to a 3-level stationary wavelet decomposition (SWT), producing 4 groups of low-frequency information $A_{j}(x,y)$ and high-frequency information $D^{\epsilon}_{j}(x,y)$, wherein $A_{j}(x,y)$ denotes the low-frequency information of the $j$-th decomposition of the SAR image and $D^{\epsilon}_{j}(x,y)$ the high-frequency information of the $j$-th decomposition in the $\epsilon$ direction. Here the directivity $\epsilon$ is denoted by a letter: $\epsilon=h$ denotes decomposition in the horizontal direction, $\epsilon=v$ in the vertical direction, $\epsilon=d$ in the diagonal direction; $j$ denotes the number of decompositions.
Step 412, from the high-frequency information $D^{\epsilon}_{j}(x,y)$, the high-frequency detail intensity information $E_{s}(x,y)$ of the SAR image is calculated:

$$E_{s}(x,y)=\sum_{j}\sum_{\epsilon\in\{h,v,d\}}\left|D^{\epsilon}_{j}(x,y)\right|$$

wherein $|\cdot|$ denotes the absolute value.
By normalizing the high-frequency energy and the low-frequency background part, the standard high-frequency intensity combination information $\bar{E}_{s}(x,y)$ and the low-frequency information $\bar{A}_{j}(x,y)$ can be obtained; performing landslide feature extraction on them yields the high-frequency and low-frequency salient features $S_{E}(x,y)$ and $S_{A}(x,y)$.
Step 413, the simplest way to extract the dark landslide features is to take the complement after normalizing the low-frequency wavelet coefficients of the SAR image. Given the multiplicative nature of speckle noise in the SAR image, the smaller a pixel value is, the less it is polluted by noise, and in the low-frequency part after the wavelet transform dark targets appear as slowly varying areas. The influence of bright background colors on the extraction of dark targets can be attenuated by a simple exponential transformation. Meanwhile, so as not to affect the spectral information after fusion, the weight of dark targets in the fusion can also be weakened by the exponential transformation, making it lower than that of the landslide feature salient areas and higher than that of the non-feature areas. Let $f_{s}(x,y)$ be the $m\times n$ low-frequency data obtained from the 3-level SWT of the SAR image after 0-1 normalization, and let $f_{t}(x,y)$ be the all-ones filling function (matrix) of size $m\times n$; the landslide feature function $S_{N}(x,y)$ of the dark features in the SAR image is then:

$$S_{N}(x,y)=\left[f_{t}(x,y)-f_{s}(x,y)\right]^{\alpha}$$

wherein $\alpha$ is a filtering parameter used to weaken the influence of non-target areas on the feature function; in the invention $\alpha=5$.
Step 42, establishing the landslide target area of the SAR image:
The final SAR image landslide feature function $S(x,y)$ is obtained by the weighted combination of the landslide feature functions obtained above:

$$S'(x,y)=\left[S_{E}(x,y)+S_{A}(x,y)+S_{N}(x,y)\right]^{\beta}$$

$$S(x,y)=\frac{S'(x,y)-\min S'}{\max S'-\min S'}$$

wherein $\beta$ is the adjusting parameter used to highlight the landslide feature region; in the invention $\beta=0.5$, and the resulting $S(x,y)$ is normalized to $[0,1]$.
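The following sketch assembles S_E, S_A, S_N and the final S(x, y) with alpha = 5 and beta = 0.5 as stated; the min-max form of the 0-1 normalization and the use of the coarsest approximation as the low-frequency background are assumptions.

```python
# Sketch of the SAR landslide target detection area function S(x, y)
# with alpha = 5 and beta = 0.5.
import numpy as np
import pywt

def norm01(a):
    a = a.astype(np.float64)
    return (a - a.min()) / (a.max() - a.min() + 1e-12)

def landslide_feature_function(sar, wavelet="db2", level=3, alpha=5.0, beta=0.5):
    coeffs = pywt.swt2(sar, wavelet, level)
    # High-frequency detail intensity: sum of |D| over levels and directions.
    E = sum(np.abs(D) for _, Ds in coeffs for D in Ds)
    A = coeffs[0][0]                     # coarsest low-frequency background
    S_E = norm01(E)                      # high-frequency salient feature
    S_A = norm01(A)                      # low-frequency salient feature
    f_s = norm01(A)                      # 0-1 normalized low-frequency data
    f_t = np.ones_like(f_s)              # all-ones filling function
    S_N = (f_t - f_s) ** alpha           # dark-feature landslide function
    S_prime = (S_E + S_A + S_N) ** beta  # weighted combination
    return norm01(S_prime)               # final S(x, y) in [0, 1]
```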
Further, the specific steps of establishing the landslide feature region fusion rule in step 5 are as follows:
Step 51, the no-feature region fusion rule:
The no-feature regions marked off by the feature salient regions indicate that the SAR image contains no obvious feature information there. These regions often contain the multiplicative speckle noise peculiar to SAR images, from which the noise can be inferred, and they may also carry background colors representing information such as terrain height. Since detail information and spectral information are inherently contradictory, the main purpose of fusion in these regions is to retain the spectral information of the optical image; meanwhile, to avoid losing the feature dark areas together with the vanishing background color, these regions take the spectral information of the optical image as the base and weight the gray-fusion detail by the salient information, so that the influence of noise on the spectral information is reduced.
For the no-feature targets of the SAR and optical images, the fusion rule therefore selects, for each channel, the spectral information of the optical image plus the weight-controlled SAR feature information, wherein $f_{R/C}$, $f_{G/C}$ and $f_{B/C}$ denote the red, green and blue channel components of the input optical image, $\bar{A}_{s}$ denotes the small background color present in the SAR image, and $f_{R/F}$, $f_{G/F}$ and $f_{B/F}$ respectively denote the three channel components of the fused color image result.
Step 52, the fusion rule of the feature dark areas and the feature salient areas:
For the target feature areas of the image (the feature dark areas and the feature salient areas), the adopted fusion rule adds the feature information on top of the information of each channel of the color image. The feature information screened by the feature salient regions strengthens the target feature information of the preliminary fusion result, makes the target points more definite, and renders the texture details richer and clearer. For each channel $X\in\{R,G,B\}$ the fusion is:

$$f_{X/F}(x,y)=f_{X/C}(x,y)+\left[f_{Ir}(x,y)-\lambda f_{I}(x,y)\right]$$

wherein $f_{I}$ denotes the brightness component of the optical image, which can be obtained by color model conversion; $f_{Ir}$ is the gray-level fusion result, obtained by the maximum-energy rule on the stationary wavelet neighborhood energy; $\lambda$ is the ratio of the mean of the gray-level fusion result $f_{Ir}$ to the mean of the visible-light image brightness $f_{I}$, used to eliminate the influence of redundant base colors on the brightness of the fusion result:

$$\lambda=\frac{\mathrm{mean}(f_{Ir})}{\mathrm{mean}(f_{I})}$$

where $\mathrm{mean}(\cdot)$ denotes the arithmetic mean over the image.
Step 53, unifying the fusion rules:
Substituting the high-frequency feature function $S(x,y)$ into the fusion rule, the final fusion rule can be expressed, for each channel $X\in\{R,G,B\}$, as:

$$f_{X/F}(x,y)=f_{X/C}(x,y)+S(x,y)\left[f_{Ir}(x,y)-\lambda f_{I}(x,y)\right]$$

wherein $f_{R/C}$, $f_{G/C}$ and $f_{B/C}$ respectively denote the R, G and B channels of the optical image, $f_{R/F}$, $f_{G/F}$ and $f_{B/F}$ respectively denote the R, G and B channels of the fused image, $f_{Ir}$ is the gray-level fusion result of the SAR image and the I component of the color visible-light image under the stationary wavelet transform (SWT) framework, and $f_{I}$ is the brightness component of the visible-light image.
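A sketch of this unified rule as read above: the feature function S(x, y) gates, per channel, the brightness-balanced detail f_Ir - lambda*f_I added to the optical spectrum, with lambda = mean(f_Ir)/mean(f_I); this channel-wise form is an interpretation, not a verbatim transcription of the patented equation.

```python
# Sketch of the region-wise fusion rule: optical spectrum as the base,
# plus S(x, y)-weighted, brightness-balanced gray-fusion detail.
import numpy as np

def regional_fusion(optical_rgb, f_ir, f_i, S):
    """optical_rgb: (H, W, 3) in [0, 1]; f_ir: SWT gray fusion result;
    f_i: optical brightness (I component); S: feature function in [0, 1]."""
    lam = f_ir.mean() / (f_i.mean() + 1e-12)  # lambda = mean(f_Ir)/mean(f_I)
    detail = f_ir - lam * f_i                 # zero-mean detail component
    fused = optical_rgb + S[..., None] * detail[..., None]
    return np.clip(fused, 0.0, 1.0)
```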
Example 2
The present embodiment provides a landslide detection-oriented SAR and optical image fusion system, which includes:
the image preprocessing unit is used for preprocessing the SAR image and the optical image;
the IHS transform unit is used for performing an IHS transform on the optical image to obtain the three components I, H and S;
the maximum-energy fusion unit is used for performing a stationary wavelet transform on the SAR image and the I component of the optical image and fusing the high-frequency components by the maximum-energy rule;
the SAR landslide target detection area function establishing unit is used for performing landslide feature detection on the low-frequency and high-frequency components of the SAR image and on the gray-level information of the image, establishing the SAR landslide target detection area function, and partitioning the SAR image;
the image fusion unit is used for establishing the landslide feature region fusion rule and realizing image fusion according to the region-wise fusion strategy;
and the landslide disaster information extraction unit is used for identifying and extracting landslide disaster information based on the fused image.
In the maximum-energy fusion unit, the specific steps of the maximum-energy fusion are as follows:
Step 31, calculating the high-frequency component energy:
The input SAR image and the I component of the optical image are decomposed by the stationary wavelet transform to obtain 4 groups of low-frequency and high-frequency information: $A_{s,j}(x,y)$, $A_{v,j}(x,y)$, $D^{\epsilon}_{s,j}(x,y)$ and $D^{\epsilon}_{v,j}(x,y)$, wherein $A_{s,j}(x,y)$ and $A_{v,j}(x,y)$ respectively denote the low-frequency information obtained from the $j$-th decomposition of the SAR image and of the optical image, and $D^{\epsilon}_{s,j}(x,y)$ and $D^{\epsilon}_{v,j}(x,y)$ respectively denote the high-frequency information obtained from the $j$-th decomposition in the $\epsilon$ direction; the directivity $\epsilon$ is denoted by a number: $\epsilon=1$ denotes decomposition in the horizontal direction, $\epsilon=2$ in the vertical direction, $\epsilon=3$ in the diagonal direction; $j$ denotes the number of decompositions;
Step 32, the maximum-energy fusion method:
Step 321, from the high-frequency information $D^{\epsilon}_{s,j}(x,y)$ and $D^{\epsilon}_{v,j}(x,y)$ of the two groups of images, the high-frequency neighborhood energies $E^{\epsilon}_{s,j}(x,y)$ and $E^{\epsilon}_{v,j}(x,y)$ are respectively calculated over a local window $W$:

$$E^{\epsilon}_{s,j}(x,y)=\sum_{(m,n)\in W}\left[D^{\epsilon}_{s,j}(x+m,\,y+n)\right]^{2}$$

$$E^{\epsilon}_{v,j}(x,y)=\sum_{(m,n)\in W}\left[D^{\epsilon}_{v,j}(x+m,\,y+n)\right]^{2}$$

wherein $E^{\epsilon}_{s,j}(x,y)$ denotes the high-frequency neighborhood energy of the SAR image in the $\epsilon$ direction at the $j$-th decomposition, and $E^{\epsilon}_{v,j}(x,y)$ that of the optical image;
Step 322, by analyzing the neighborhood energy, the wavelet coefficients with salient energy change are selected to form the new high-frequency wavelet coefficients that participate in the back-end reconstruction; the take-the-larger-energy strategy is:

$$D^{\epsilon}_{F,j}(x,y)=\begin{cases}D^{\epsilon}_{s,j}(x,y), & E^{\epsilon}_{s,j}(x,y)\ge E^{\epsilon}_{v,j}(x,y)\\ D^{\epsilon}_{v,j}(x,y), & \text{otherwise}\end{cases}$$

wherein $D^{\epsilon}_{F,j}(x,y)$ is the high-frequency information of the fusion result.
The SAR landslide target detection area function is established as follows:
Step 41, target analysis of the SAR image:
The magnitudes of the wavelet high-frequency coefficients of the SAR image represent the parts of the image with large fluctuation, and these parts are the landslide feature salient regions of the SAR image;
Step 411, the input SAR image $f_{s}(x,y)$ is subjected to a 3-level stationary wavelet decomposition (SWT), producing 4 groups of low-frequency information $A_{j}(x,y)$ and high-frequency information $D^{\epsilon}_{j}(x,y)$, wherein $A_{j}(x,y)$ denotes the low-frequency information of the $j$-th decomposition of the SAR image and $D^{\epsilon}_{j}(x,y)$ the high-frequency information of the $j$-th decomposition in the $\epsilon$ direction; here the directivity $\epsilon$ is denoted by a letter: $\epsilon=h$ denotes decomposition in the horizontal direction, $\epsilon=v$ in the vertical direction, $\epsilon=d$ in the diagonal direction; $j$ is the number of decompositions;
Step 412, from the high-frequency information $D^{\epsilon}_{j}(x,y)$, the high-frequency detail intensity information $E_{s}(x,y)$ of the SAR image is calculated:

$$E_{s}(x,y)=\sum_{j}\sum_{\epsilon\in\{h,v,d\}}\left|D^{\epsilon}_{j}(x,y)\right|$$

wherein $|\cdot|$ denotes the absolute value;
By normalizing the high-frequency energy and the low-frequency background part, the standard high-frequency intensity combination information $\bar{E}_{s}(x,y)$ and the low-frequency information $\bar{A}_{j}(x,y)$ are obtained; performing landslide feature extraction on them yields the high-frequency and low-frequency salient features $S_{E}(x,y)$ and $S_{A}(x,y)$;
Step 413, let $f_{s}(x,y)$ be the $m\times n$ low-frequency data obtained from the 3-level SWT of the SAR image after 0-1 normalization, and let $f_{t}(x,y)$ be the all-ones filling function of size $m\times n$; the landslide feature function $S_{N}(x,y)$ of the dark features in the SAR image is then:

$$S_{N}(x,y)=\left[f_{t}(x,y)-f_{s}(x,y)\right]^{\alpha}$$

wherein $\alpha$ is a filtering parameter used to weaken the influence of non-target areas on the feature function;
Step 42, establishing the landslide target area of the SAR image:
The final SAR image landslide feature function $S(x,y)$ is obtained by the weighted combination of the landslide feature functions obtained above:

$$S'(x,y)=\left[S_{E}(x,y)+S_{A}(x,y)+S_{N}(x,y)\right]^{\beta}$$

$$S(x,y)=\frac{S'(x,y)-\min S'}{\max S'-\min S'}$$

wherein $\beta$ is the adjusting parameter used to highlight the landslide feature region, and the resulting $S(x,y)$ is normalized to $[0,1]$; $S'(x,y)$ is the intermediate quantity of the SAR image landslide feature function; $S(x,y)$ is the final SAR image landslide feature function; $S_{A}(x,y)$ is the low-frequency salient feature; $S_{E}(x,y)$ is the high-frequency salient feature; and $S_{N}(x,y)$ is the landslide feature function of the dark features in the SAR image.
In the image fusion unit, the specific content of the landslide feature region fusion rule is as follows:
The high-frequency feature function $S(x,y)$ is substituted into the fusion rule, which is finally expressed, for each channel $X\in\{R,G,B\}$, as:

$$f_{X/F}(x,y)=f_{X/C}(x,y)+S(x,y)\left[f_{Ir}(x,y)-\lambda f_{I}(x,y)\right]$$

wherein $f_{R/C}$, $f_{G/C}$ and $f_{B/C}$ respectively denote the R, G and B channels of the optical image; $f_{R/F}$, $f_{G/F}$ and $f_{B/F}$ respectively denote the R, G and B channels of the fused image; $f_{Ir}$ is the gray-level fusion result of the SAR image and the I component of the color visible-light image under the stationary wavelet transform framework; $f_{I}$ is the brightness component of the visible-light image; and $\lambda$ is the ratio of the mean of the gray-level fusion result $f_{Ir}$ to the mean of the visible-light image brightness $f_{I}$, used to eliminate the influence of redundant base colors on the brightness of the fusion result:

$$\lambda=\frac{\mathrm{mean}(f_{Ir})}{\mathrm{mean}(f_{I})}$$

where $\mathrm{mean}(\cdot)$ denotes the arithmetic mean over the image.
It will be understood that modifications and variations can be made by persons skilled in the art in light of the above teachings, and all such modifications and variations are intended to be included within the scope of the invention as defined in the appended claims.

Claims (10)

1. A landslide detection-oriented SAR and optical image fusion method, characterized by comprising the following steps:
step 1, preprocessing an SAR image and an optical image;
step 2, performing an IHS transform on the optical image to obtain the three components I, H and S;
step 3, performing a stationary wavelet transform on the SAR image and the I component of the optical image, and fusing the high-frequency components by the maximum-energy rule;
step 4, performing landslide feature detection on the low-frequency and high-frequency components of the SAR image and on the gray-level information of the image, establishing an SAR landslide target detection area function, and partitioning the SAR image;
step 5, establishing a landslide feature region fusion rule, and realizing image fusion according to the region-wise fusion strategy;
and step 6, identifying and extracting landslide disaster information based on the fused image.
2. The landslide detection-oriented SAR and optical image fusion method according to claim 1, characterized in that the specific steps of the maximum-energy fusion of the high-frequency component energy in step 3 are as follows:
step 31, calculating the high-frequency component energy:
the input SAR image and the I component of the optical image are decomposed by the stationary wavelet transform to obtain 4 groups of low-frequency and high-frequency information: $A_{s,j}(x,y)$, $A_{v,j}(x,y)$, $D^{\epsilon}_{s,j}(x,y)$ and $D^{\epsilon}_{v,j}(x,y)$, wherein $A_{s,j}(x,y)$ and $A_{v,j}(x,y)$ respectively denote the low-frequency information obtained from the $j$-th decomposition of the SAR image and of the optical image, and $D^{\epsilon}_{s,j}(x,y)$ and $D^{\epsilon}_{v,j}(x,y)$ respectively denote the high-frequency information obtained from the $j$-th decomposition in the $\epsilon$ direction; the directivity $\epsilon$ is denoted by a number: $\epsilon=1$ denotes decomposition in the horizontal direction, $\epsilon=2$ in the vertical direction, $\epsilon=3$ in the diagonal direction; $j$ denotes the number of decompositions;
step 32, the maximum-energy fusion method:
step 321, from the high-frequency information $D^{\epsilon}_{s,j}(x,y)$ and $D^{\epsilon}_{v,j}(x,y)$ of the two groups of images, the high-frequency neighborhood energies $E^{\epsilon}_{s,j}(x,y)$ and $E^{\epsilon}_{v,j}(x,y)$ are respectively calculated over a local window $W$:

$$E^{\epsilon}_{s,j}(x,y)=\sum_{(m,n)\in W}\left[D^{\epsilon}_{s,j}(x+m,\,y+n)\right]^{2}$$

$$E^{\epsilon}_{v,j}(x,y)=\sum_{(m,n)\in W}\left[D^{\epsilon}_{v,j}(x+m,\,y+n)\right]^{2}$$

wherein $E^{\epsilon}_{s,j}(x,y)$ denotes the high-frequency neighborhood energy of the SAR image in the $\epsilon$ direction at the $j$-th decomposition, and $E^{\epsilon}_{v,j}(x,y)$ that of the optical image;
step 322, by analyzing the neighborhood energy, the wavelet coefficients with salient energy change are selected to form the new high-frequency wavelet coefficients that participate in the back-end reconstruction; the take-the-larger-energy strategy is:

$$D^{\epsilon}_{F,j}(x,y)=\begin{cases}D^{\epsilon}_{s,j}(x,y), & E^{\epsilon}_{s,j}(x,y)\ge E^{\epsilon}_{v,j}(x,y)\\ D^{\epsilon}_{v,j}(x,y), & \text{otherwise}\end{cases}$$

wherein $D^{\epsilon}_{F,j}(x,y)$ is the high-frequency information of the fusion result.
3. The landslide detection-oriented SAR and optical image fusion method according to claim 1, characterized in that the SAR landslide target detection area function established in step 4 is:

$$S'(x,y)=\left[S_{E}(x,y)+S_{A}(x,y)+S_{N}(x,y)\right]^{\beta}$$

$$S(x,y)=\frac{S'(x,y)-\min S'}{\max S'-\min S'}$$

wherein $\beta$ is an adjusting parameter used to highlight the landslide feature target area; $S'(x,y)$ is the intermediate quantity of the SAR image landslide feature function; $S(x,y)$ is the final SAR image landslide feature function; $S_{A}(x,y)$ is the low-frequency salient feature; $S_{E}(x,y)$ is the high-frequency salient feature; and $S_{N}(x,y)$ is the landslide feature function of the dark features in the SAR image.
4. The landslide detection-oriented SAR and optical image fusion method according to claim 3, characterized in that the process of establishing the SAR landslide target detection area function in step 4 is as follows:
step 41, target analysis of the SAR image:
the magnitudes of the wavelet high-frequency coefficients of the SAR image represent the parts of the image with large fluctuation, and these parts are the landslide feature salient regions of the SAR image;
step 411, the input SAR image $f_{s}(x,y)$ is subjected to a 3-level stationary wavelet decomposition (SWT), producing 4 groups of low-frequency information $A_{j}(x,y)$ and high-frequency information $D^{\epsilon}_{j}(x,y)$, wherein $A_{j}(x,y)$ denotes the low-frequency information of the $j$-th decomposition of the SAR image and $D^{\epsilon}_{j}(x,y)$ the high-frequency information of the $j$-th decomposition in the $\epsilon$ direction; here the directivity $\epsilon$ is denoted by a letter: $\epsilon=h$ denotes decomposition in the horizontal direction, $\epsilon=v$ in the vertical direction, $\epsilon=d$ in the diagonal direction; $j$ is the number of decompositions;
step 412, from the high-frequency information $D^{\epsilon}_{j}(x,y)$, the high-frequency detail intensity information $E_{s}(x,y)$ of the SAR image is calculated:

$$E_{s}(x,y)=\sum_{j}\sum_{\epsilon\in\{h,v,d\}}\left|D^{\epsilon}_{j}(x,y)\right|$$

wherein $|\cdot|$ denotes the absolute value;
by normalizing the high-frequency energy and the low-frequency background part, the standard high-frequency intensity combination information $\bar{E}_{s}(x,y)$ and the low-frequency information $\bar{A}_{j}(x,y)$ are obtained; performing landslide feature extraction on them yields the high-frequency and low-frequency salient features $S_{E}(x,y)$ and $S_{A}(x,y)$;
step 413, let $f_{s}(x,y)$ be the $m\times n$ low-frequency data obtained from the 3-level SWT of the SAR image after 0-1 normalization, and let $f_{t}(x,y)$ be the all-ones filling function of size $m\times n$; the landslide feature function $S_{N}(x,y)$ of the dark features in the SAR image is then:

$$S_{N}(x,y)=\left[f_{t}(x,y)-f_{s}(x,y)\right]^{\alpha}$$

wherein $\alpha$ is a filtering parameter used to weaken the influence of non-target areas on the feature function;
step 42, establishing the landslide target area of the SAR image:
the final SAR image landslide feature function $S(x,y)$ is obtained by the weighted combination of the landslide feature functions obtained above:

$$S'(x,y)=\left[S_{E}(x,y)+S_{A}(x,y)+S_{N}(x,y)\right]^{\beta}$$

$$S(x,y)=\frac{S'(x,y)-\min S'}{\max S'-\min S'}$$

wherein $\beta$ is the adjusting parameter used to highlight the landslide feature region, and the resulting $S(x,y)$ is normalized to $[0,1]$.
5. The landslide detection-oriented SAR and optical image fusion method according to claim 1, wherein the landslide feature region fusion rule established in step 5 is specified as follows:
the high-frequency feature function S(x, y) is substituted into the fusion rule, which is finally expressed as:
[the per-channel fusion expression appears as an equation image in the original]
wherein f_R/C, f_G/C, f_B/C denote the R, G, B channels of the optical image; f_R/F, f_G/F, f_B/F denote the R, G, B channels of the fused image; f_Ir is the gray-level fusion result of the I components of the SAR image and the color visible-light image under the stationary wavelet transform framework; f_I is the brightness component of the visible-light image; and λ is the ratio of the mean of the gray-level fusion result f_Ir to the mean of the visible-light brightness f_I, used to eliminate the influence of redundant base colors on the brightness of the fusion result, calculated as:

λ = mean(f_Ir) / mean(f_I)

wherein mean(·) denotes the arithmetic mean over the image.
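The ratio λ is directly computable from the description above; the per-channel fusion expression itself survives only as an equation image, so the channel update in this sketch is a hedged guess at a saliency-gated brightness injection, not the claim's actual formula:

```python
import numpy as np

def fuse_rgb(f_R, f_G, f_B, f_I, f_Ir, S, eps=1e-12):
    """Hedged sketch of the landslide feature region fusion rule.

    f_R, f_G, f_B: R, G, B channels of the optical image.
    f_I:  brightness (I) component of the visible-light image.
    f_Ir: gray-level SWT fusion result of the SAR and optical I components.
    S:    landslide feature function S(x, y), values in [0, 1].
    """
    lam = np.mean(f_Ir) / np.mean(f_I)  # mean-value ratio described above
    fused = []
    for f_C in (f_R, f_G, f_B):
        # Inside salient regions (S -> 1) rescale the channel by the
        # brightness ratio f_Ir / (lam * f_I); elsewhere keep the optical
        # channel. This gating form is an assumption.
        fused.append((1.0 - S) * f_C + S * f_C * f_Ir / (lam * f_I + eps))
    return fused  # f_R/F, f_G/F, f_B/F
```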
6. A landslide detection-oriented SAR and optical image fusion system, characterized in that it comprises:
an image preprocessing unit, used for preprocessing the SAR image and the optical image;
an HIS conversion unit, used for performing HIS conversion on the optical image to obtain the three components I, H and S;
an energy maximum-selection fusion unit, used for performing the stationary wavelet transform and the maximum-selection fusion of high-frequency component energy on the SAR image and the I component of the optical image;
an SAR landslide target detection area function establishing unit, used for performing landslide feature detection on the low-frequency and high-frequency components of the SAR image and on the gray-level information of the image respectively, establishing the SAR landslide target detection area function, and partitioning the SAR image;
an image fusion unit, used for establishing the landslide feature region fusion rule and realizing image fusion according to a region-wise fusion strategy; and
a landslide disaster information extraction unit, used for identifying and extracting landslide disaster information based on the fused image.
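Read as software, the six units map naturally onto one pipeline object. The skeleton below is only an architectural sketch with assumed method names; the bodies are left as stubs:

```python
class LandslideFusionSystem:
    """Skeleton mirroring the six claimed units (names are assumptions)."""

    def preprocess(self, sar, optical):           # image preprocessing unit
        raise NotImplementedError

    def hsi_transform(self, optical):             # HIS conversion unit -> (I, H, S)
        raise NotImplementedError

    def energy_max_fuse(self, sar, I):            # energy maximum-selection fusion unit
        raise NotImplementedError

    def landslide_area_function(self, sar):       # SAR landslide target area function unit
        raise NotImplementedError

    def region_fuse(self, optical, f_Ir, S_map):  # image fusion unit
        raise NotImplementedError

    def extract_info(self, fused):                # landslide information extraction unit
        raise NotImplementedError

    def run(self, sar, optical):
        sar, optical = self.preprocess(sar, optical)
        I, H, S = self.hsi_transform(optical)
        f_Ir = self.energy_max_fuse(sar, I)
        S_map = self.landslide_area_function(sar)
        fused = self.region_fuse(optical, f_Ir, S_map)
        return self.extract_info(fused)
```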
7. The landslide detection-oriented SAR and optical image fusion system according to claim 1, wherein, in the energy maximum-selection fusion unit, the specific steps of the maximum-selection fusion are as follows:
step 31, calculating high-frequency component energy:
the input SAR image and the I component of the optical image are decomposed by the stationary wavelet transform to obtain 4 groups of low-frequency and high-frequency information of the images: A_s,j(x, y), A_v,j(x, y), W_s,j^ε(x, y) and W_v,j^ε(x, y);
wherein A_s,j(x, y) and A_v,j(x, y) denote the low-frequency information obtained from the j-th decomposition of the SAR image and the optical image respectively, and W_s,j^ε(x, y) and W_v,j^ε(x, y) denote the high-frequency information obtained from the j-th decomposition of the SAR image and the optical image in the ε direction respectively; the direction ε is denoted by numbers, with ε = 1 for decomposition in the horizontal direction, ε = 2 for the vertical direction and ε = 3 for the diagonal direction, and j denotes the decomposition level;
step 32, the maximum-selection fusion method:
step 321, based on the high-frequency information W_s,j^ε(x, y) and W_v,j^ε(x, y) of the two groups of images, the high-frequency neighborhood energies E_s,j^ε(x, y) and E_v,j^ε(x, y) are calculated separately:

E_s,j^ε(x, y) = Σ_{(m,n)∈N(x,y)} [W_s,j^ε(m, n)]²

E_v,j^ε(x, y) = Σ_{(m,n)∈N(x,y)} [W_v,j^ε(m, n)]²

wherein N(x, y) is a local neighborhood window centered on (x, y) (the exact window appears only as an equation image in the original); E_s,j^ε(x, y) denotes the high-frequency neighborhood energy of the SAR image in the ε direction at the j-th decomposition, and E_v,j^ε(x, y) denotes the high-frequency neighborhood energy of the optical image in the ε direction at the j-th decomposition;
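A sketch of the neighborhood-energy computation under the assumption of a square box window (the window size 3 is illustrative):

```python
import numpy as np
from scipy.ndimage import uniform_filter

def neighborhood_energy(W_band, size=3):
    """High-frequency neighborhood energy E_j^eps(x, y) of one detail band.

    Sums the squared detail coefficients over a size x size window around
    each pixel (uniform_filter returns the window mean, so we multiply
    back by the window area to obtain the window sum).
    """
    return uniform_filter(W_band * W_band, size=size) * (size * size)
```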
step 322, by analyzing the neighborhood energy, the wavelet coefficients with salient energy change are selected to form the new high-frequency wavelet coefficients participating in the back-end reconstruction; the maximum-energy selection strategy is:

W_F,j^ε(x, y) = W_s,j^ε(x, y) if E_s,j^ε(x, y) ≥ E_v,j^ε(x, y), and W_v,j^ε(x, y) otherwise;

wherein W_F,j^ε(x, y) is the high-frequency information of the fusion result.
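A sketch of the maximum-selection step itself, applied per level and direction (the tie-breaking in favor of the SAR image is an assumed convention):

```python
import numpy as np

def select_by_energy(W_s, W_v, E_s, E_v):
    """Keep, per pixel, the detail coefficient whose source image has the
    larger high-frequency neighborhood energy."""
    return np.where(E_s >= E_v, W_s, W_v)
```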
8. The landslide detection-oriented SAR and optical image fusion system according to claim 1, wherein, in the SAR landslide target detection area function establishing unit, the SAR landslide target detection area function is:

S′(x, y) = [S_E(x, y) + S_A(x, y) + S_N(x, y)]^β

wherein β is a tuning parameter used to highlight the landslide feature target area;
S′(x, y) is the intermediate quantity of the SAR image landslide feature function;
S(x, y), obtained by normalizing S′(x, y) (the normalization formula appears as an equation image in the original), is the final SAR image landslide feature function;
S_A(x, y) is the low-frequency salient feature;
S_E(x, y) is the high-frequency salient feature; and
S_N(x, y) is the landslide feature function for the dark features in the SAR image.
9. The landslide detection-oriented SAR and optical image fusion system according to claim 4, wherein the SAR landslide target detection area function is established as follows:
step 41, target analysis of the SAR image:
the wavelet high-frequency coefficients of the SAR image represent the strongly fluctuating parts of the image, and these parts are the landslide-feature salient regions of the SAR image;
step 411, the input SAR image f_s(x, y) is decomposed by a 3-level stationary wavelet transform (SWT), generating 4 groups of low-frequency information A_j(x, y) and high-frequency information W_j^ε(x, y);
wherein A_j(x, y) denotes the low-frequency information obtained from the j-th decomposition of the SAR image, and W_j^ε(x, y) denotes the high-frequency information obtained from the j-th decomposition of the SAR image in the ε direction; the direction ε is denoted by letters, with ε = h for decomposition in the horizontal direction, ε = v for the vertical direction and ε = d for the diagonal direction, and j is the decomposition level;
step 412, based on the high-frequency information W_j^ε(x, y) of the image, the high-frequency detail intensity information E_s(x, y) of the SAR image is calculated:

E_s(x, y) = Σ_j Σ_{ε∈{h,v,d}} | W_j^ε(x, y) |

wherein | · | denotes taking the absolute value;
the standard high-frequency intensity information Ē_s(x, y) is obtained by normalizing the high-frequency energy against the low-frequency background; landslide feature extraction is then carried out on Ē_s(x, y) and on the low-frequency information A_j(x, y) to obtain the high-frequency and low-frequency salient features S_E(x, y) and S_A(x, y);
[the defining formulas of S_E(x, y) and S_A(x, y) appear only as equation images in the original]
step 413, let f_s(x, y) be the m × n low-frequency data obtained from the 3-level SWT of the SAR image after 0-1 normalization, and let f_t(x, y) be an m × n all-ones fill function; the landslide feature function S_N(x, y) for the dark features in the SAR image is then:

S_N(x, y) = [f_t(x, y) − f_s(x, y)]^α

wherein α is a filtering parameter used to weaken the influence of non-target areas on the feature function;
step 42, establishing the landslide target area of the SAR image:
the landslide feature functions obtained above are combined by weighting to give the final SAR image landslide feature function S(x, y):

S′(x, y) = [S_E(x, y) + S_A(x, y) + S_N(x, y)]^β

wherein β is the tuning parameter used to highlight the landslide feature region, and S(x, y) is S′(x, y) normalized to the range [0, 1] (the normalization formula appears as an equation image in the original).
10. The landslide detection-oriented SAR and optical image fusion system according to claim 1, wherein, in the image fusion unit, the landslide feature region fusion rule comprises:
the high-frequency feature function S(x, y) is substituted into the fusion rule, which is finally expressed as:
[the per-channel fusion expression appears as an equation image in the original]
wherein f_R/C, f_G/C, f_B/C denote the R, G, B channels of the optical image; f_R/F, f_G/F, f_B/F denote the R, G, B channels of the fused image; f_Ir is the gray-level fusion result of the I components of the SAR image and the color visible-light image under the stationary wavelet transform framework; f_I is the brightness component of the visible-light image; and λ is the ratio of the mean of the gray-level fusion result f_Ir to the mean of the visible-light brightness f_I, used to eliminate the influence of redundant base colors on the brightness of the fusion result, calculated as:

λ = mean(f_Ir) / mean(f_I)

wherein mean(·) denotes the arithmetic mean over the image.
CN202011045558.3A 2020-09-28 2020-09-28 Landslide detection-oriented SAR and optical image fusion method and system Pending CN112307901A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011045558.3A CN112307901A (en) 2020-09-28 2020-09-28 Landslide detection-oriented SAR and optical image fusion method and system

Publications (1)

Publication Number Publication Date
CN112307901A true CN112307901A (en) 2021-02-02

Family ID: 74489326

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011045558.3A Pending CN112307901A (en) 2020-09-28 2020-09-28 Landslide detection-oriented SAR and optical image fusion method and system

Country Status (1)

Country Link
CN (1) CN112307901A (en)


Patent Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090161944A1 (en) * 2007-12-21 2009-06-25 Industrial Technology Research Institute Target detecting, editing and rebuilding method and system by 3d image
CN101510309A (en) * 2009-03-30 2009-08-19 西安电子科技大学 Segmentation method for improving water parting SAR image based on compound wavelet veins region merge
JP5636085B1 (en) * 2013-12-27 2014-12-03 アジア航測株式会社 Single-polarization SAR color image creation device
CN105160648A (en) * 2014-11-26 2015-12-16 中国人民解放军第二炮兵工程大学 Radar target and shadow segmentation method based on wavelet and constant false alarm rate
CN105809194A (en) * 2016-03-08 2016-07-27 华中师范大学 Method for translating SAR image into optical image
CN106600572A (en) * 2016-12-12 2017-04-26 长春理工大学 Adaptive low-illumination visible image and infrared image fusion method
CN106960430A (en) * 2017-03-17 2017-07-18 西安电子科技大学 Based on subregional SAR image and color visible image fusion method
CN108765359A (en) * 2018-05-31 2018-11-06 安徽大学 A kind of fusion method of target in hyperspectral remotely sensed image and full-colour image based on JSKF models and NSCT technologies
CN109409292A (en) * 2018-10-26 2019-03-01 西安电子科技大学 The heterologous image matching method extracted based on fining characteristic optimization
CN109613513A (en) * 2018-12-20 2019-04-12 长安大学 A kind of potential landslide automatic identifying method of optical remote sensing for taking InSAR deformation into account
KR102086323B1 (en) * 2019-09-30 2020-05-26 대한민국 Method for providing automatic monitoring service with continuity of sentinel satellite imagery based on permanent scatterer interferometric synthetic aperture radar
CN111178388A (en) * 2019-12-05 2020-05-19 上海交通大学 Partial discharge phase distribution detection method based on NSCT photoelectric fusion atlas

Non-Patent Citations (6)

* Cited by examiner, † Cited by third party
Title
Y. ZHANG et al., "Information Fusion of Optical Image and SAR Image Based on DEM", 2019 IEEE International Conference on Signal, Information and Data Processing (ICSIDP), 21 August 2020 (2020-08-21), pages 1-5 *
ZOU, WEIBAO et al., "Determination of Optimum Tie Point Interval for SAR Image Coregistration by Decomposing Autocorrelation Coefficient", IEEE Transactions on Geoscience and Remote Sensing, vol. 57, no. 7, 20 February 2019 (2019-02-20), pages 5067-5084, XP011731969, DOI: 10.1109/TGRS.2019.2896383 *
BU Lijing, "Research on SAR and Optical Image Fusion Methods Using Texture Features", Engineering of Surveying and Mapping (《测绘工程》), vol. 24, no. 5, 25 May 2015 (2015-05-25), pages 5-10 *
ZHANG Puzhao, "Change Detection Methods for Remote Sensing Images Based on Spatio-temporal Modeling and Their Applications", China Doctoral Dissertations Full-text Database, Engineering Science and Technology II (《中国博士学位论文全文数据库 工程科技Ⅱ辑》), no. 7, 15 July 2020 (2020-07-15), pages 028-8 *
HAO Yabing, "Research on SAR Image Denoising, Segmentation and Target Detection Methods", China Master's Theses Full-text Database, Information Science and Technology (《中国优秀硕士学位论文全文数据库 信息科技辑》), no. 4, 15 April 2013 (2013-04-15), pages 136-795 *
CHEN Wen, "Research on Typical Target Detection and Recognition Based on Optical and SAR Remote Sensing Image Fusion", China Master's Theses Full-text Database, Engineering Science and Technology II (《中国优秀硕士学位论文全文数据库 工程科技Ⅱ辑》), no. 2, 15 February 2020 (2020-02-15), pages 028-218 *

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113076991A (en) * 2021-03-30 2021-07-06 中国人民解放军93114部队 Multi-target information comprehensive processing method and device based on nonlinear integral algorithm
CN113076991B (en) * 2021-03-30 2024-03-08 中国人民解放军93114部队 Nonlinear integration algorithm-based multi-target information comprehensive processing method and device
CN113538306A (en) * 2021-06-15 2021-10-22 西安电子科技大学 Multi-image fusion method for SAR image and low-resolution optical image
CN113538306B (en) * 2021-06-15 2024-02-13 西安电子科技大学 SAR image and low-resolution optical image multi-image fusion method
CN115236655A (en) * 2022-09-01 2022-10-25 成都理工大学 Landslide identification method, system, equipment and medium based on fully-polarized SAR
CN115236655B (en) * 2022-09-01 2022-12-20 成都理工大学 Landslide identification method, system, equipment and medium based on fully-polarized SAR
US11747498B1 (en) 2022-09-01 2023-09-05 Chengdu University Of Technology Method, system, device and medium for landslide identification based on full polarimetric SAR
CN115525727A (en) * 2022-10-14 2022-12-27 昆明理工大学 Agile power transmission line point cloud modeling and analyzing system
CN116452936A (en) * 2023-04-22 2023-07-18 安徽大学 Rotation target detection method integrating optics and SAR image multi-mode information
CN116452936B (en) * 2023-04-22 2023-09-29 安徽大学 Rotation target detection method integrating optics and SAR image multi-mode information

Similar Documents

Publication Publication Date Title
Agrawal et al. A novel joint histogram equalization based image contrast enhancement
CN112307901A (en) Landslide detection-oriented SAR and optical image fusion method and system
Wang et al. Biologically inspired image enhancement based on Retinex
Tseng et al. Automatic cloud removal from multi-temporal SPOT images
Luo et al. A novel algorithm of remote sensing image fusion based on shift-invariant Shearlet transform and regional selection
González-Audícana et al. A low computational-cost method to fuse IKONOS images using the spectral response function of its sensors
CN111079556A (en) Multi-temporal unmanned aerial vehicle video image change area detection and classification method
Li et al. A perceptually inspired variational method for the uneven intensity correction of remote sensing images
Rao et al. Spatiotemporal data fusion using temporal high-pass modulation and edge primitives
Guo et al. Haze and thin cloud removal using elliptical boundary prior for remote sensing image
Iwasokun et al. Image enhancement methods: a review
Zhang et al. Preprocessing and fusion analysis of GF-2 satellite Remote-sensed spatial data
Gao et al. Single fog image restoration with multi-focus image fusion
Bi et al. Haze removal for a single remote sensing image using low-rank and sparse prior
Shawal et al. Fundamentals of digital image processing and basic concept of classification
Harrison et al. Earth Observation: Data, Processing and Applications. Volume 2C: Processing—Image Transformations
Reddy Digital image processing: Principles and applications
Ying et al. Region-aware RGB and near-infrared image fusion
CN113487493B (en) GANilla-based SAR image automatic colorization method
CN115311556A (en) Remote sensing image processing method and system for natural resource management
Mani A survey of multi sensor satellite image fusion techniques
Meenakshisundaram Quality assessment of IKONOS and Quickbird fused images for urban mapping
Xu et al. A Multi-rule-based Relative Radiometric Normalization for Multi-Sensor Satellite Images
Nair et al. Benchmarking single image dehazing methods
Hu et al. Rapid dehazing algorithm based on large-scale median filtering for high-resolution visible near-infrared remote sensing images

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination