CN115797796A - Target level change detection method based on dense matching of optical image and SAR image - Google Patents


Info

Publication number
CN115797796A
Authority
CN
China
Prior art keywords
pixel
target
optical image
image
sar image
Prior art date
Legal status
Granted
Application number
CN202310076260.6A
Other languages
Chinese (zh)
Other versions
CN115797796B (en)
Inventor
胡玉新
向俞明
王峰
Current Assignee
Aerospace Information Research Institute of CAS
Original Assignee
Aerospace Information Research Institute of CAS
Priority date
Filing date
Publication date
Application filed by Aerospace Information Research Institute of CAS
Priority to CN202310076260.6A
Publication of CN115797796A
Application granted
Publication of CN115797796B
Legal status: Active
Anticipated expiration


Landscapes

  • Radar Systems Or Details Thereof (AREA)

Abstract

The invention provides a target level change detection method based on dense matching of an optical image and an SAR image, which comprises the following steps: converting the optical image and the SAR image into a common feature space, and constructing a first pixel-by-pixel high-dimensional feature corresponding to the optical image and a second pixel-by-pixel high-dimensional feature of the SAR image; performing common feature dimension dense matching on the optical image and the SAR image based on the first pixel-by-pixel high-dimensional feature and the second pixel-by-pixel high-dimensional feature, aligning the optical image and the SAR image pixel by pixel, and acquiring a point-by-point difference intensity map between the optical image and the SAR image; extracting a first target skeleton characteristic of the optical image and a second target skeleton characteristic of the SAR image; determining a target skeleton difference intensity map between the optical image and the SAR image according to the first target skeleton characteristic and the second target skeleton characteristic; and extracting target-level change information between the optical image and the SAR image according to the point-by-point difference intensity map and the target skeleton difference intensity map. The method does not depend on geometric correction and registration processing, and has high processing efficiency.

Description

Target level change detection method based on dense matching of optical image and SAR image
Technical Field
The invention relates to the technical field of heterogeneous remote sensing image processing, in particular to a target level change detection method based on dense matching of an optical image and an SAR image.
Background
Optical images and Synthetic Aperture Radar (SAR) images are the two most common types of sensor imagery. Optical imaging is passive remote sensing: targets reflect sunlight to the sensor to form images, which faithfully capture the color and brightness of ground objects, but it cannot work at night or under cloud, fog and snow. SAR is an active microwave sensor; its images reflect the structural characteristics and electromagnetic scattering characteristics of ground targets and offer all-weather, day-and-night observation, but are seriously affected by speckle noise. Jointly exploiting the advantages of optical and SAR images for change detection therefore has important application demand and value. For example, a high-quality pre-disaster optical image can be compared with a SAR image acquired urgently after a disaster to determine the specific location and severity of the disaster, enabling accurate and rapid guidance of rescue work.
Existing change detection methods require high-precision geometric correction and registration of the remote sensing images; their processing efficiency is low and the workflow is complex, so they cannot meet change detection needs under emergency conditions. In particular, for a high-revisit multi-source satellite constellation system, performing high-precision geometric correction scene by scene cannot satisfy the requirement for high-time-frequency information acquisition.
In addition, because the imaging mechanisms of optical and SAR images are completely different, they reflect different characteristics of ground targets; moreover, limited by the accuracy of the satellite platform and payload, remote sensing images still contain non-negligible positioning errors. An automatic and robust method is therefore needed to handle both the matching of optical and SAR images and the change detection problem.
Disclosure of Invention
In view of the above technical problems, the present invention provides a target level change detection method based on dense matching of an optical image and an SAR image, which is used to at least partially solve the above technical problems.
Based on the above, the invention provides a target level change detection method based on dense matching of an optical image and an SAR image, which comprises the following steps: converting the optical image and the SAR image into a common feature space, and constructing a first pixel-by-pixel high-dimensional feature corresponding to the optical image and a second pixel-by-pixel high-dimensional feature of the SAR image; performing common feature dimension dense matching on the optical image and the SAR image based on the first pixel-by-pixel high-dimensional feature and the second pixel-by-pixel high-dimensional feature, aligning the optical image and the SAR image pixel by pixel, and acquiring a point-by-point difference intensity map between the optical image and the SAR image; extracting a first target skeleton characteristic of the optical image and a second target skeleton characteristic of the SAR image; determining a target skeleton difference intensity map between the optical image and the SAR image according to the first target skeleton characteristic and the second target skeleton characteristic; and extracting target-level change information between the optical image and the SAR image according to the point-by-point difference intensity map and the target skeleton difference intensity map.
According to the embodiment of the invention, the step of converting the optical image and the SAR image into the common feature space and constructing the first pixel-by-pixel high-dimensional feature corresponding to the optical image and the second pixel-by-pixel high-dimensional feature of the SAR image comprises the following steps: constructing a first dense high-dimensional feature of the optical image based on the multi-scale characteristics of the monogenic phase model; constructing a second dense high-dimensional feature of the SAR image according to the dense ratio signal features; calculating the consistency of the scale parameters of the first dense high-dimensional feature and the second dense high-dimensional feature to ensure that the SAR image and the optical image have pixel-by-pixel feature commonality; and performing a three-dimensional dilated (atrous) convolution operation on the first dense high-dimensional feature and the second dense high-dimensional feature respectively to obtain the first pixel-by-pixel high-dimensional feature and the second pixel-by-pixel high-dimensional feature.
According to the embodiment of the invention, the common feature dimension dense matching of the optical image and the SAR image based on the first pixel-by-pixel high-dimensional feature and the second pixel-by-pixel high-dimensional feature comprises the following steps: constructing a characteristic energy term according to the first pixel-by-pixel high-dimensional characteristic and the second pixel-by-pixel high-dimensional characteristic, and introducing a spatial smoothing term and a noise filtering term to construct a global objective function with densely matched characteristic dimensions; performing multi-level propagation optimization on the global objective function, and aligning the optical image and the SAR image pixel by pixel; the multilevel propagation optimization comprises the steps of establishing a multilevel characteristic space for the common characteristics of the optical image and the SAR image, and carrying out dense matching result transmission from a coarse level to a fine level layer by layer in the multilevel space; after the dense matching result is transmitted, acquiring a final dense matching result at the level of the original resolution; and acquiring the optimized target energy as a point-by-point difference intensity map.
According to an embodiment of the invention, the global objective function is:

E(u,v) = \sum_{(x,y)} \rho_d\big(DenseF_{opt}(x,y) - DenseF_{sar}(x+u,\,y+v)\big) + \rho_s(\nabla u, \nabla v) + \gamma \sum_{(x',y') \in N_{x,y}} w_{(x,y)}\big(|u_{x,y} - u_{x',y'}| + |v_{x,y} - v_{x',y'}|\big)

wherein (x, y) are the coordinates of a pixel in the optical or SAR image; E(u, v) is the target energy model for common-feature dense matching; u and v are the horizontal and vertical coordinate offsets of the target energy; DenseF_opt(x, y) is the high-dimensional feature of pixel (x, y) in the optical image and DenseF_sar(x, y) that of pixel (x, y) in the SAR image; (DenseF_opt(x, y) − DenseF_sar(x+u, y+v)) is the feature energy term; ρ_d and ρ_s are the penalty functions of the feature energy term and of the spatial smoothing term in the target energy model; ∇u and ∇v form the spatial smoothing term; w_(x,y)(|u_{x,y} − u_{x',y'}| + |v_{x,y} − v_{x',y'}|) is the noise filtering term; N_{x,y} is the neighborhood over which the noise term is computed; (x', y') are pixel coordinates in N_{x,y}; u_{x,y} and v_{x,y} (resp. u_{x',y'} and v_{x',y'}) are the horizontal and vertical coordinate offsets at pixel (x, y) (resp. (x', y')); w_(x,y) is the weight of pixel (x, y); and γ is a regularization parameter.
According to an embodiment of the present invention, extracting the first target skeleton feature of the optical image includes: performing adaptive-scale significance estimation on the optical image, using the cyclic energy estimation of the variational filter as the criterion of adaptive scale analysis, to obtain a first salient target region; and extracting the target centerline feature of the first salient target region to acquire the skeleton feature of the first salient target region as the first target skeleton feature.
According to the embodiment of the invention, the significance estimation of the adaptive scale is calculated as:

Sig = \sum_{i=1}^{IN} |T_{i-1} - T_i|, \quad T_0 = I

wherein I is the original optical image, T_i is the variational filtering result of the i-th cycle, IN is the number of cycles, and TV is the variational filter (T_i = TV(T_{i-1})).
According to the embodiment of the invention, the extracting of the second target skeleton characteristic of the SAR image comprises the following steps: acquiring a second significant target area in the SAR image; extracting the annular ratio operator response of the second significant target region; smoothing the response of the annular ratio operator; calculating the geometric distance transformation of the target according to the annular ratio operator response after the smoothing treatment based on the geometric constraint of the target structure; and carrying out non-maximum suppression on the geometric distance transformation of the target to obtain a second target skeleton characteristic of the target.
According to an embodiment of the invention, the circular ratio operator response CR is calculated from f_n, h_n and v_n, which are the components of the second dense high-dimensional feature at the current scale corresponding to the dense-ratio filters in three different directions.
According to an embodiment of the present invention, determining a target skeleton difference intensity map between the optical image and the SAR image according to the first target skeleton feature and the second target skeleton feature comprises: calculating the similarity of the target skeleton features between the optical image and the SAR image according to the first target skeleton feature and the second target skeleton feature; and determining the target skeleton difference intensity map by taking the similarity as the change intensity of the first salient target region of the optical image and the second salient target region of the SAR image. The similarity is calculated according to:

NCC(O_i, S_i) = \frac{\sum_{(x,y)} \big(O_i(x,y) - u_O\big)\big(S_i(x,y) - u_S\big)}{\sqrt{\sum_{(x,y)} \big(O_i(x,y) - u_O\big)^2 \cdot \sum_{(x,y)} \big(S_i(x,y) - u_S\big)^2}}

wherein NCC(O_i(x,y), S_i(x,y)) is the similarity between the first target skeleton feature and the second target skeleton feature at pixel (x, y), O_i(x,y) is the first target skeleton feature corresponding to pixel (x, y), S_i(x,y) is the second target skeleton feature corresponding to pixel (x, y), and u_O and u_S are the feature means of the first salient target region and the second salient target region, respectively.
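A minimal numpy sketch of this normalized cross-correlation similarity (the function and variable names are illustrative, not taken from the patent):

```python
import numpy as np

def ncc(o, s, eps=1e-12):
    """Normalized cross-correlation between two skeleton-feature patches.

    o, s: arrays of the same shape (first / second target skeleton features
    over a salient-region pair); the subtracted means play the role of
    u_O and u_S in the text.
    """
    o = o.astype(float) - o.mean()
    s = s.astype(float) - s.mean()
    denom = np.sqrt((o ** 2).sum() * (s ** 2).sum()) + eps
    return float((o * s).sum() / denom)
```

Identical patches give a similarity close to 1 and inverted patches close to −1; the value then serves as the change intensity of the salient region pair.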
According to the embodiment of the invention, the step of extracting the target level change information between the optical image and the SAR image according to the point-by-point difference intensity map and the target skeleton difference intensity map comprises the following steps: performing hysteresis threshold segmentation on the point-by-point difference intensity map and the target skeleton difference intensity map to extract the target level change information.
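Hysteresis threshold segmentation, as used in this extraction step, can be sketched generically as follows (8-connectivity and the threshold names are illustrative assumptions, not details from the patent):

```python
import numpy as np
from collections import deque

def hysteresis(img, low, high):
    """Hysteresis segmentation: keep weak pixels (> low) only if they are
    8-connected, directly or transitively, to a strong pixel (> high)."""
    strong = img > high
    weak = img > low
    out = np.zeros_like(strong)
    q = deque(zip(*np.nonzero(strong)))  # seed the flood fill with strong pixels
    while q:
        y, x = q.popleft()
        if out[y, x] or not weak[y, x]:
            continue
        out[y, x] = True
        for dy in (-1, 0, 1):
            for dx in (-1, 0, 1):
                ny, nx = y + dy, x + dx
                if (0 <= ny < img.shape[0] and 0 <= nx < img.shape[1]
                        and weak[ny, nx] and not out[ny, nx]):
                    q.append((ny, nx))
    return out
```

Applying this to the two difference intensity maps keeps connected change regions anchored by high-confidence pixels while discarding isolated weak responses.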
The target level change detection method based on the dense matching of the optical image and the SAR image provided by the embodiment of the invention at least has the following beneficial effects:
the method converts the multi-temporal heterogeneous optical image and the SAR image into the common characteristic space with high similarity, avoids the positioning deviation and the radiation geometric inconsistency of the optical image and the SAR image caused by completely different imaging mechanisms, and avoids the initial positioning error of the image. And multi-temporal optics and SAR images are adopted as change detection objects, so that the problem that the optical images cannot be acquired due to the influence of cloud and rain weather after a disaster occurs can be solved.
The method comprises the steps of carrying out common feature dense matching based on a common feature space, carrying out multi-level propagation optimization on a global objective function in the matching process to obtain a point-by-point difference intensity map, then obtaining a skeleton difference intensity map based on significance detection, and finally obtaining target-level change information through threshold calculation by combining the target skeleton difference intensity map and the point-by-point difference intensity map.
Drawings
The above and other objects, features and advantages of the present invention will become more apparent from the following description of the embodiments of the present invention with reference to the accompanying drawings, in which:
fig. 1 schematically shows a flowchart of a target-level change detection method based on dense matching of an optical image and an SAR image according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is further described in detail with reference to the following embodiments and the accompanying drawings. It is to be understood that the embodiments described are only a few embodiments of the present invention, and not all embodiments. All other embodiments, which can be obtained by a person skilled in the art without making any creative effort based on the embodiments in the present invention, belong to the protection scope of the present invention.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. The terms "comprises," "comprising," and the like, as used herein, specify the presence of stated features, steps, operations, and/or components, but do not preclude the presence or addition of one or more other features, steps, operations, or components.
In the present invention, unless otherwise expressly specified or limited, the terms "mounted," "connected," "secured," and the like are to be construed broadly and can, for example, be fixedly connected, detachably connected, or integrally formed; may be mechanically, electrically or otherwise in communication with each other; they may be directly connected or indirectly connected through intervening media, or may be in communication within two elements or in interactive relationship between two elements. The specific meanings of the above terms in the present invention can be understood by those skilled in the art according to specific situations.
In the description of the present invention, it is to be understood that the terms "longitudinal", "length", "circumferential", "front", "back", "left", "right", "top", "bottom", "inner", "outer", and the like, indicate orientations or positional relationships based on the orientations or positional relationships shown in the drawings, are merely for convenience in describing the present invention and simplifying the description, and do not indicate or imply that the referenced subsystems or elements must have particular orientations, be constructed and operated in particular orientations, and thus, are not to be construed as limiting the present invention.
Throughout the drawings, like elements are represented by like or similar reference numerals. And conventional structures or constructions will be omitted when they may obscure the understanding of the present invention. And the shapes, sizes and positional relationships of the components in the drawings do not reflect the actual sizes, proportions and actual positional relationships. Furthermore, in the claims, any reference signs placed between parentheses shall not be construed as limiting the claim.
Similarly, in the above description of exemplary embodiments of the invention, various features of the invention are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the disclosure and aiding in the understanding of one or more of the various disclosed aspects. Reference to the description of the terms "one embodiment," "some embodiments," "an example," "a specific example," or "some examples," etc., means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the invention. In this specification, the schematic representations of the terms used above do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
Furthermore, the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include one or more of that feature. In the description of the present invention, "a plurality" means at least two, e.g., two, three, etc., unless specifically limited otherwise.
Fig. 1 schematically shows a flowchart of a target-level change detection method based on dense matching of an optical image and an SAR image according to an embodiment of the present invention.
As shown in FIG. 1, the target-level change detection method based on dense matching of optical images and SAR images may include operations S101-S105.
In operation S101, the optical image and the SAR image are converted into a common feature space, and a first pixel-by-pixel high-dimensional feature corresponding to the optical image and a second pixel-by-pixel high-dimensional feature of the SAR image are constructed.
In the embodiment of the present disclosure, operation S101 may include, for example: constructing a first dense high-dimensional feature of the optical image based on the multi-scale characteristics of the monogenic phase model; constructing a second dense high-dimensional feature of the SAR image according to the dense ratio signal features; calculating the consistency of the scale parameters of the first and second dense high-dimensional features, so that the SAR image and the optical image have pixel-by-pixel feature commonality; and performing a three-dimensional dilated (atrous) convolution operation on the first and second dense high-dimensional features respectively to obtain the first pixel-by-pixel high-dimensional feature and the second pixel-by-pixel high-dimensional feature.
Illustratively, in consideration of the radiometric and geometric differences between the optical image and the SAR image, the common feature construction method is designed to acquire their inherent common characteristics and reduce the obvious differences. Assume that the size of the optical image is W×H.
The two-dimensional monogenic signal model is composed of the original two-dimensional signal and its Riesz transform, as follows:

f_M(x, y) = [I(x, y),\; R_1\{I\}(x, y),\; R_2\{I\}(x, y)]^T, \quad R_1\{I\} + j\,R_2\{I\} = ifft\big(MF \odot fft(I)\big), \quad MF(u, v) = \frac{u + j v}{\sqrt{u^2 + v^2}}

wherein j is the imaginary unit and T denotes transposition.
Then, convolution calculation is carried out on the two-dimensional monogenic signals and the multi-scale orthogonal filter, and local phase responses in different directions are obtained as follows:
F_n = real\big(ifft(fft(I) \odot LG_n)\big), \quad H_n = real\big(ifft(fft(I) \odot LG_n \odot MF)\big), \quad V_n = imag\big(ifft(fft(I) \odot LG_n \odot MF)\big)
then the dense high-dimensional features of the optical image can be obtained as follows:
DenseF_{opt} = \bigcup_{n=1}^{N} [F_n, H_n, V_n]
wherein DenseF_opt represents the first dense high-dimensional feature of the optical image; F_n, H_n and V_n are the monogenic phase features of three different directions at the current scale; real and imag take the real and imaginary parts of a complex number, respectively; wl_n is the wavelength parameter of the log filter LG_n at the current scale; I is the original image; fft and ifft are the two-dimensional Fourier transform and inverse Fourier transform, respectively; MF is the monogenic signal filter; ⊙ denotes the matrix dot (element-wise) product; N is the total number of scales; n is the current scale index; and ∪ represents accumulating the multi-dimensional features along the scale dimension.
Assuming the total number of scales N = 4, a first dense high-dimensional feature of dimension W×H×12 is obtained.
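The multi-scale construction above can be sketched in numpy. The log-Gabor radial filter, the Riesz-transform frequency response and all names below are illustrative assumptions, not taken from the patent:

```python
import numpy as np

def monogenic_features(img, wavelengths=(4, 8, 16, 32), sigma_onf=0.55):
    """Pixel-wise multi-scale monogenic phase features (illustrative sketch).

    Returns an H x W x 3N array: for each of the N scales, the even
    (band-pass) response and the two odd (Riesz) responses, mirroring
    the W x H x 3N dense feature described in the text.
    """
    h, w = img.shape
    fy = np.fft.fftfreq(h)[:, None]
    fx = np.fft.fftfreq(w)[None, :]
    radius = np.hypot(fx, fy)
    radius[0, 0] = 1.0                      # avoid division by zero at DC
    mf = (1j * fx - fy) / radius            # Riesz / monogenic frequency response
    F = np.fft.fft2(img)
    feats = []
    for wl in wavelengths:
        # log-Gabor radial band-pass centred on wavelength wl
        lg = np.exp(-np.log(radius * wl) ** 2 / (2 * np.log(sigma_onf) ** 2))
        lg[0, 0] = 0.0
        band = np.fft.ifft2(F * lg)
        riesz = np.fft.ifft2(F * lg * mf)
        feats += [band.real, riesz.real, riesz.imag]
    return np.stack(feats, axis=-1)
```

With four scales this yields a feature volume of depth 12 per pixel, consistent with the W×H×12 example in the text.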
And then, constructing pixel-by-pixel high-dimensional features suitable for SAR characteristics according to the dense ratio signal features. Specifically, the convolution of the optical image and the filter bank is replaced by an operator based on a ratio signal, so that multiplicative speckle noise in the SAR image can be effectively inhibited; the calculation of the ratio signal is performed using a two-dimensional Gabor filter, as follows:
G(x, y) = \exp\left(-\frac{x^2 + y^2}{2\sigma^2}\right) \exp\big(j\,\omega\,(x\cos\theta + y\sin\theta)\big)

wherein σ is the standard deviation of the Gaussian kernel, ω is the frequency of the sinusoid (taken as 1.8/σ), and θ is the filter rotation angle.
At the current scale, the means of the convolution of the processing window with the original image are calculated for the horizontal (θ = 0) and vertical (θ = π/2) filters, yielding the convolution means of the horizontal filter with the image and the convolution means of the vertical filter with the image, respectively. In addition, the convolution of the isotropic filter is replaced by designing an isotropic conical filter.
by accumulating multi-direction local phase responses of different scales, a second dense high-dimensional feature of the SAR image can be constructedDenseF sar
Figure SMS_15
Wherein,f n h n v n the components of the second dense high-dimensional feature of the current scale corresponding to the ratio filters of the dense ratio in three different directions are respectively.
By calculating the consistency of the scale parameters of the high-dimensional features DenseF_opt and DenseF_sar, the actual convolution window sizes of the current-scale log filter and the two-dimensional Gabor filter are kept consistent, which guarantees the pixel-by-pixel feature commonality of the optical and SAR images. Finally, a three-dimensional dilated (atrous) convolution operation is performed on DenseF_opt and DenseF_sar to enhance the noise immunity of the pixel-by-pixel high-dimensional features and capture the structural characteristics of multi-scale targets, obtaining the first and second pixel-by-pixel high-dimensional features, as follows:

DenseF'_{opt} = conv3d(DenseF_{opt};\, size,\, dilation), \quad DenseF'_{sar} = conv3d(DenseF_{sar};\, size,\, dilation)

wherein conv3d is the three-dimensional convolution, size is the convolution kernel size (for example, 5), and dilation is the dilation factor (for example, 2).
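A naive numpy sketch of a three-dimensional dilated ("hole") convolution; in practice a deep-learning framework's conv3d would be used, and the kernel values here are illustrative:

```python
import numpy as np

def dilated_conv3d(x, kernel, dilation=2):
    """Naive 3-D dilated (atrous) convolution with 'same' output size.

    x: (D, H, W) feature volume; kernel: (k, k, k) with odd k.
    Each kernel tap is offset by `dilation` voxels, enlarging the
    receptive field without extra parameters.
    """
    k = kernel.shape[0]
    pad = dilation * (k // 2)
    xp = np.pad(x, pad)
    out = np.zeros_like(x, dtype=float)
    for a in range(k):
        for b in range(k):
            for c in range(k):
                da, db, dc = a * dilation, b * dilation, c * dilation
                out += kernel[a, b, c] * xp[da:da + x.shape[0],
                                            db:db + x.shape[1],
                                            dc:dc + x.shape[2]]
    return out
```

A kernel of size 5 with dilation 2, as in the example values above, covers a 9×9×9 neighborhood while using only 125 taps.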
In operation S102, common feature dimension dense matching is performed on the optical image and the SAR image based on the first pixel-by-pixel high-dimensional feature and the second pixel-by-pixel high-dimensional feature, the optical image and the SAR image are aligned pixel by pixel, and a point-by-point difference intensity map between the optical image and the SAR image is obtained.
In the embodiment of the present invention, the process of dense matching in the common feature dimension may be, for example: constructing a feature energy term according to the first and second pixel-by-pixel high-dimensional features, and introducing a spatial smoothing term and a noise filtering term to construct a global objective function for feature-dimension dense matching; performing multi-level propagation optimization on the global objective function to align the optical image and the SAR image pixel by pixel, where the multi-level propagation optimization comprises establishing a multi-level feature space for the common features of the optical image and the SAR image and passing dense matching results layer by layer from the coarse level to the fine level in the multi-level space; after the dense matching results have been propagated, obtaining the final dense matching result at the original-resolution level; and taking the optimized target energy as the point-by-point difference intensity map D_pixel.
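The multi-level propagation can be sketched as follows; `match_fn` stands in for the per-level optimization of the global objective, and the pyramid construction and upsampling scheme are illustrative assumptions:

```python
import numpy as np

def coarse_to_fine(match_fn, f1, f2, levels=3):
    """Propagate dense-matching offsets from coarse to fine levels.

    match_fn(a, b, init_uv) solves the matching at one level and returns
    a (2, H, W) offset field (u, v); init_uv is the upsampled result of
    the coarser level (None at the coarsest level).
    """
    pyramid = [(f1, f2)]
    for _ in range(levels - 1):
        f1, f2 = f1[::2, ::2], f2[::2, ::2]   # build the multi-level space
        pyramid.append((f1, f2))
    uv = None
    for a, b in reversed(pyramid):            # coarse -> fine
        if uv is not None:
            # upsample by 2 and double the offsets to the finer grid
            uv = 2.0 * np.kron(uv, np.ones((1, 2, 2)))
            uv = uv[:, :a.shape[0], :a.shape[1]]
        uv = match_fn(a, b, uv)
    return uv
```

The final call operates at the original resolution, so the returned field is the final dense matching result described above.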
Based on the consistency of the common features, the image matching problem is converted into a pixel-by-pixel offset estimation problem. Under the feature-constancy and spatial-smoothness assumptions, the global objective function for feature-dimension dense matching may be constructed as:

E(u,v) = \sum_{(x,y)} \rho_d\big(DenseF_{opt}(x,y) - DenseF_{sar}(x+u,\,y+v)\big) + \rho_s(\nabla u, \nabla v) + \gamma \sum_{(x',y') \in N_{x,y}} w_{(x,y)}\big(|u_{x,y} - u_{x',y'}| + |v_{x,y} - v_{x',y'}|\big)

wherein (x, y) are the coordinates of a pixel in the optical or SAR image; E(u, v) is the target energy model for common-feature dense matching; u and v are the horizontal and vertical coordinate offsets of the target energy; DenseF_opt(x, y) is the high-dimensional feature of pixel (x, y) in the optical image and DenseF_sar(x, y) that of pixel (x, y) in the SAR image; (DenseF_opt(x, y) − DenseF_sar(x+u, y+v)) is the feature energy term; ρ_d and ρ_s are the penalty functions of the feature energy term and of the spatial smoothing term in the target energy model; ∇u and ∇v form the spatial smoothing term; w_(x,y)(|u_{x,y} − u_{x',y'}| + |v_{x,y} − v_{x',y'}|) is the noise filtering term; N_{x,y} is the neighborhood over which the noise term is computed; (x', y') are pixel coordinates in N_{x,y}; u_{x,y} and v_{x,y} (resp. u_{x',y'} and v_{x',y'}) are the horizontal and vertical coordinate offsets at pixel (x, y) (resp. (x', y')); w_(x,y) is the weight of pixel (x, y); and γ is a regularization parameter.
The penalty function of the characteristic energy item and the penalty function of the space smoothing item can adopt L1 norm; in consideration of speckle noise which generally exists in SAR images, a noise filtering term is introduced into a target function to eliminate noise in the target function, and a weighted median filter in an L1 norm form is adopted as the noise filtering term. Considering that the pixels in the homogeneous region should be weighted more heavily, while the pixels of the heterogeneous region may depend on the consistency of the common characteristic, the coefficient of variation of the current pixel is taken as the weightw x y(,) The following were used:
w_{(x,y)} = \frac{\sqrt{var_{(x,y)}}}{m_{(x,y)}}

wherein var_(x,y) is the variance of the noise-term computation neighborhood, and m_(x,y) is the mean of the noise-term computation neighborhood.
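The coefficient-of-variation weight can be sketched as follows; the neighborhood size and the std/mean form are assumptions, since the patent text does not fix them here:

```python
import numpy as np

def variation_weight(img, radius=2, eps=1e-8):
    """Per-pixel coefficient of variation (std / mean) over a square
    neighborhood, used as the weight w(x, y) of the noise-filtering term."""
    p = np.pad(img.astype(float), radius, mode="reflect")
    k = 2 * radius + 1
    # all k x k windows at once: shape (H, W, k, k)
    win = np.lib.stride_tricks.sliding_window_view(p, (k, k))
    mean = win.mean(axis=(-2, -1))
    std = win.std(axis=(-2, -1))
    return std / (mean + eps)
```

Homogeneous neighborhoods produce a weight near zero, heterogeneous ones a larger weight, so the noise term adapts to local image content.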
In operation S103, a first target skeleton feature of the optical image and a second target skeleton feature of the SAR image are extracted.
In an embodiment of the present invention, extracting the first target skeleton feature of the optical image includes: and performing adaptive scale significance estimation on the optical image by using the cyclic energy estimation of the variational filter as a criterion of adaptive scale analysis to obtain a first significant target region. And extracting the characteristic of a target center line of the first salient target area, and acquiring the skeleton characteristic of the first salient target area as the first target skeleton characteristic.
Exemplarily, for the purpose of target level change analysis, in order to be applicable to multi-scale targets widely existing in remote sensing images, adaptive scale saliency estimation needs to be performed on optical images; the cyclic energy estimation of the variational filter is adopted as the criterion of the self-adaptive scale analysis, and the energy of a homogeneous region is smaller and corresponds to a large-scale target; the energy change of the heterogeneous region is obvious and corresponds to a small-scale target; the significance estimation of the adaptive scale is calculated as follows:
Figure SMS_19
wherein I is the original optical image, T_i is the variational filtering result of the i-th cycle, IN is the number of cycles, and TV is the variational filter, whose kernel function is:
Figure SMS_20
By accumulating the variation of the filtering result over all cycles, the optical-image saliency intensity map Sig can be obtained. The salient region can then be obtained by threshold segmentation, and the optical dense high-dimensional features DenseF_opt of the salient region are aggregated along the feature dimension to obtain the structural feature SF_opt of the salient region:
Figure SMS_21
SF_opt is then subjected to skeleton centerline estimation based on morphological processing and Hough transformation, and the position information of the skeleton centerline is extracted to obtain the first target skeleton feature of the salient target region of the optical image.
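The saliency accumulation above can be sketched as follows. A Gaussian blur stands in for one cycle of the variational filter, whose exact kernel is not reproduced in this text, and the cycle count, sigma, and threshold are illustrative assumptions; the map accumulates the change between consecutive filtering cycles, so homogeneous (large-scale) regions gain little energy while heterogeneous (small-scale) regions gain a lot:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def saliency_intensity(image, cycles=5, sigma=1.5):
    """Accumulate per-cycle filtering variation into a saliency intensity map Sig."""
    t_prev = image.astype(np.float64)
    sig = np.zeros_like(t_prev)
    for _ in range(cycles):                        # IN filtering cycles
        t_cur = gaussian_filter(t_prev, sigma)     # stand-in for the variational (TV) filter
        sig += np.abs(t_cur - t_prev)              # accumulate the variation of this cycle
        t_prev = t_cur
    return sig

def salient_region(image, threshold):
    """Threshold segmentation of the saliency intensity map."""
    return saliency_intensity(image) > threshold
```

A flat image accumulates no energy, while pixels near intensity discontinuities accumulate large energy, consistent with the homogeneous/heterogeneous behavior described in the text.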
In an embodiment of the present invention, extracting the second target skeleton feature of the SAR image includes: acquiring a second salient target region in the SAR image; extracting the annular ratio operator response of the second salient target region; smoothing the annular ratio operator response; calculating the geometric distance transform of the target from the smoothed annular ratio operator response, based on the geometric constraints of the target structure; and performing non-maximum suppression on the geometric distance transform of the target to obtain the second target skeleton feature of the target.
Illustratively, the position information of the strong-scattering-point structures of the target region in the SAR image is extracted based on the annular ratio operator, which can effectively suppress false alarms caused by the high-frequency components of noise. According to
Figure SMS_22
the annular ratio operator response CR is calculated.
The annular ratio operator response CR is then smoothed to enhance the geometric features of the SAR image; the structural features of the SAR image are calculated with a distance-transform method based on the geometric constraints of the target structure; and the second target skeleton feature of the SAR image is obtained after non-maximum suppression is applied to the distance-transform result.
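The SAR-side skeleton steps (smooth the operator response, distance-transform the salient region, suppress non-maxima) can be sketched as below. The annular-ratio response is taken as a given array; the region threshold, smoothing sigma, and 3×3 suppression window are assumptions, not values from the patent:

```python
import numpy as np
from scipy.ndimage import gaussian_filter, distance_transform_edt, maximum_filter

def sar_skeleton(cr_response, region_threshold=0.5, sigma=1.0):
    """Skeleton of the salient SAR region from a precomputed annular-ratio response."""
    cr_smooth = gaussian_filter(cr_response.astype(np.float64), sigma)
    region = cr_smooth > region_threshold        # second salient target region
    dist = distance_transform_edt(region)        # geometric distance transform
    # non-maximum suppression: keep only local maxima (ridge) of the distance map
    local_max = maximum_filter(dist, size=3)
    return (dist > 0) & (dist == local_max)
```

The ridge of the distance transform runs along the region's centerline, which is what the skeleton feature captures.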
In operation S104, a target skeleton difference intensity map between the optical image and the SAR image is determined according to the first target skeleton feature and the second target skeleton feature.
In an embodiment of the present invention, the target skeleton difference intensity map may be determined as follows: the similarity of target skeleton features between the optical image and the SAR image is calculated from the first target skeleton feature and the second target skeleton feature, and the similarity is taken as the change intensity of the first salient target region of the optical image and the second salient target region of the SAR image to determine the target skeleton difference intensity map.
Illustratively, in the similarity measurement strategy for the optical and SAR target skeleton features, each optically salient target locates its SAR region according to the dense matching result, and the maximum correlation value is searched within a local neighborhood to suppress structural position deviations caused by the difference in imaging models. The similarity (maximum correlation value) is calculated as follows:
NCC(O_i, S_i) = [ Σ_{(x,y)} (O_i(x,y) − u_O)(S_i(x,y) − u_S) ] / sqrt( Σ_{(x,y)} (O_i(x,y) − u_O)² · Σ_{(x,y)} (S_i(x,y) − u_S)² )
wherein NCC(O_i(x,y), S_i(x,y)) is the similarity between the first target skeleton feature and the second target skeleton feature corresponding to pixel (x,y); O_i(x,y) is the first target skeleton feature corresponding to pixel (x,y); S_i(x,y) is the second target skeleton feature corresponding to pixel (x,y); and u_O and u_S are the feature means of the first salient target region and the second salient target region, respectively.
Taking this similarity measure as the change intensity of the target region yields the target skeleton difference intensity map D_target.
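The maximum-correlation search can be sketched with standard normalized cross-correlation; how patches are extracted and the search radius are assumptions not specified above:

```python
import numpy as np

def ncc(a, b):
    """Normalized cross-correlation between two equally shaped patches."""
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return (a * b).sum() / denom if denom > 0 else 0.0

def max_ncc(opt_patch, sar_map, top_left, radius=2):
    """Max NCC of opt_patch against sar_map patches shifted within +-radius pixels."""
    h, w = opt_patch.shape
    y0, x0 = top_left
    best = -1.0
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            y, x = y0 + dy, x0 + dx
            if y < 0 or x < 0 or y + h > sar_map.shape[0] or x + w > sar_map.shape[1]:
                continue
            best = max(best, ncc(opt_patch, sar_map[y:y + h, x:x + w]))
    return best
```

Searching a small offset neighborhood and keeping the maximum correlation is what suppresses the residual structural position deviation between the imaging models.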
In operation S105, target-level variation information between the optical image and the SAR image is extracted from the point-by-point difference intensity map and the target skeleton difference intensity map.
In an embodiment of the invention, hysteresis threshold segmentation is performed on the point-by-point difference intensity map and the target skeleton difference intensity map to extract the target-level change information.
Illustratively, with the point-by-point difference intensity map D_pixel and the target skeleton difference intensity map D_target as the change-analysis criteria, joint threshold segmentation is performed according to the following hysteresis threshold method:
Figure SMS_24
wherein t_global is the global change threshold: regions where the point-by-point difference exceeds t_global are possible change regions, and within a possible change region, pixels where the target difference intensity is smaller than the local threshold t_local form the final changed target region, giving the final change-analysis result.
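The joint hysteresis rule can be sketched directly. Following the text, the target skeleton difference intensity carries the raw similarity, so values below the local threshold indicate change within a candidate region; the threshold values are illustrative:

```python
import numpy as np

def hysteresis_change_map(d_pixel, d_target, t_global, t_local):
    """Joint segmentation of the two difference maps into a target-level change mask."""
    candidates = d_pixel > t_global   # possible change regions from the point-by-point map
    confirmed = d_target < t_local    # low skeleton similarity within candidates => changed
    return candidates & confirmed
```

Only pixels that both exceed the global point-by-point threshold and fall below the local skeleton-similarity threshold survive, which is the two-stage acceptance the hysteresis method describes.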
It should be understood that many different methods may be used for change detection with remote sensing images; any method that takes densely matched optical and SAR images as the change-detection objects and target skeleton structural features as the change-detection reference falls within the protection scope of the present disclosure.
In summary, the target-level change detection method provided by the invention adopts multi-temporal optical and SAR images as the change-detection objects, and can overcome the problem that optical images cannot be acquired after a disaster due to cloudy and rainy weather. The method realizes integrated processing of dense matching and change detection for heterogeneous images, does not depend on a high-precision geometric correction of the images, and is favorable for practical engineering application. It can quickly obtain target-level change information and is particularly suitable for high-revisit-frequency multi-source satellite constellation systems.
The embodiments described above further detail the objects, technical solutions, and advantages of the present invention. It should be understood that they are merely examples of the present invention and are not intended to limit it; any modifications, equivalent substitutions, improvements, and the like made within the spirit and principles of the present invention shall fall within its protection scope.

Claims (10)

1. A target level change detection method based on dense matching of an optical image and an SAR image is characterized by comprising the following steps:
converting the optical image and the SAR image into a common feature space, and constructing a first pixel-by-pixel high-dimensional feature corresponding to the optical image and a second pixel-by-pixel high-dimensional feature of the SAR image;
carrying out common feature dimension dense matching on the optical image and the SAR image based on the first pixel-by-pixel high-dimensional feature and the second pixel-by-pixel high-dimensional feature, aligning the optical image and the SAR image pixel by pixel, and acquiring a point-by-point difference intensity map between the optical image and the SAR image;
extracting a first target skeleton characteristic of the optical image and a second target skeleton characteristic of the SAR image;
determining a target skeleton difference intensity map between the optical image and the SAR image according to the first target skeleton characteristic and the second target skeleton characteristic;
and extracting target-level change information between the optical image and the SAR image according to the point-by-point difference intensity map and the target skeleton difference intensity map.
2. The method for detecting target-level changes according to claim 1, wherein the transforming the optical image and the SAR image into a common feature space, and the constructing the first pixel-by-pixel high-dimensional feature corresponding to the optical image and the second pixel-by-pixel high-dimensional feature of the SAR image comprises:
constructing a first dense high-dimensional feature of the optical image based on the multi-scale characteristics of the monogenic phase model;
constructing a second dense high-dimensional feature of the SAR image according to the dense ratio signal feature;
calculating the consistency of scale parameters of the first dense high-dimensional feature and the second dense high-dimensional feature to ensure that the SAR image and the optical image have the pixel-by-pixel feature commonality;
and respectively carrying out three-dimensional cavity convolution operation on the first intensive high-dimensional feature and the second intensive high-dimensional feature to obtain the first pixel-by-pixel high-dimensional feature and the second pixel-by-pixel high-dimensional feature.
3. The target-level change detection method of claim 1, wherein the performing common feature dimension dense matching on the optical image and the SAR image based on the first pixel-by-pixel high-dimensional feature and the second pixel-by-pixel high-dimensional feature comprises:
constructing a characteristic energy item according to the first pixel-by-pixel high-dimensional characteristic and the second pixel-by-pixel high-dimensional characteristic, and introducing a spatial smoothing item and a noise filtering item to construct a global objective function with densely matched characteristic dimensions;
performing multi-level propagation optimization on the global objective function, and aligning the optical image and the SAR image pixel by pixel; the multilevel propagation optimization comprises the steps of establishing a multilevel characteristic space for the common characteristics of the optical image and the SAR image, and carrying out dense matching result transmission from a coarse level to a fine level layer by layer in the multilevel space; after the dense matching result is transmitted, acquiring a final dense matching result at the level of the original resolution;
and acquiring the optimized target energy as the point-by-point difference intensity map.
4. The target-level change detection method of claim 3, wherein the global objective function is:
E(u,v) = Σ_{(x,y)} [ ρ_d( DenseF_opt(x,y) − DenseF_sar(x+u, y+v) ) + ρ_s(∇u, ∇v) ] + γ Σ_{(x,y)} Σ_{(x',y') ∈ N_{x,y}} w(x,y) ( |u_{x,y} − u_{x',y'}| + |v_{x,y} − v_{x',y'}| )
wherein (x,y) are the coordinates of a pixel in the optical image or the SAR image; E(u,v) is the target energy model for common-feature-dimension dense matching; u and v denote the horizontal and vertical coordinate offsets of the target energy, respectively; DenseF_opt(x,y) represents the high-dimensional feature corresponding to pixel (x,y) in the optical image; DenseF_sar(x,y) represents the high-dimensional feature corresponding to pixel (x,y) in the SAR image; (DenseF_opt(x,y) − DenseF_sar(x+u, y+v)) represents the feature energy term; ρ_d and ρ_s are the penalty functions of the feature energy term and of the spatial smoothing term, respectively; in the target energy model, ∇u and ∇v represent the spatial smoothing term; w(x,y)(|u_{x,y} − u_{x',y'}| + |v_{x,y} − v_{x',y'}|) represents the noise filtering term; N_{x,y} denotes the noise-term computation neighborhood, and (x',y') the coordinates of a pixel in N_{x,y}; u_{x,y} and u_{x',y'} are the horizontal coordinate offsets of the energy at pixels (x,y) and (x',y'), respectively; v_{x,y} and v_{x',y'} are the vertical coordinate offsets of the energy at pixels (x,y) and (x',y'), respectively; w(x,y) is the weight of pixel (x,y); and γ is a regularization parameter.
5. The method of claim 1, wherein extracting the first target skeletal features of the optical image comprises:
adopting the cycle energy estimation of a variation filter as a criterion of self-adaptive scale analysis to carry out self-adaptive scale significance estimation on the optical image so as to obtain a first significant target area;
and extracting the characteristic of a target center line of the first salient target region, and acquiring the skeleton characteristic of the first salient target region as the first target skeleton characteristic.
6. The method of claim 5, wherein the adaptive scale significance estimate is calculated by:
Figure QLYQS_2
wherein I is the original optical image, T_i is the variational filtering result of the i-th cycle, IN is the number of cycles, and TV is the variational filter.
7. The method of claim 2, wherein extracting the second target skeleton feature of the SAR image comprises:
acquiring a second significant target area in the SAR image;
extracting the annular ratio operator response of the second significant target region;
smoothing the response of the annular ratio operator;
calculating the geometric distance transformation of the target according to the annular ratio operator response after the smoothing treatment based on the geometric constraint of the target structure;
and carrying out non-maximum suppression on the geometric distance transformation of the target to obtain the second target skeleton characteristic of the target.
8. The target-level change detection method of claim 7, wherein the annular ratio operator response CR is calculated according to
Figure QLYQS_3
wherein f_n, h_n, and v_n are respectively the components of the second dense high-dimensional feature at the current scale corresponding to the dense-ratio filters in three different directions.
9. The method of claim 1, wherein determining the target-skeleton difference intensity map between the optical image and the SAR image according to the first target-skeleton feature and the second target-skeleton feature comprises:
calculating the similarity of the target skeleton characteristics between the optical image and the SAR image according to the first target skeleton characteristics and the second target skeleton characteristics;
determining the target skeleton difference intensity map by taking the similarity as the change intensity of a first salient target region of the optical image and a second salient target region of the SAR image;
wherein, according to
NCC(O_i, S_i) = [ Σ_{(x,y)} (O_i(x,y) − u_O)(S_i(x,y) − u_S) ] / sqrt( Σ_{(x,y)} (O_i(x,y) − u_O)² · Σ_{(x,y)} (S_i(x,y) − u_S)² )
the similarity is calculated, wherein NCC(O_i(x,y), S_i(x,y)) is the similarity between the first target skeleton feature and the second target skeleton feature corresponding to pixel (x,y); O_i(x,y) is the first target skeleton feature corresponding to pixel (x,y); S_i(x,y) is the second target skeleton feature corresponding to pixel (x,y); and u_O and u_S are the feature means of the first salient target region and the second salient target region, respectively.
10. The method of claim 1, wherein the extracting target-level variation information between the optical image and the SAR image according to the point-by-point difference intensity map and the target skeleton difference intensity map comprises:
and carrying out hysteresis threshold segmentation on the point-by-point difference intensity map and the target skeleton difference intensity map so as to extract the target level change information.
CN202310076260.6A 2023-02-08 2023-02-08 Target level change detection method based on dense matching of optical image and SAR image Active CN115797796B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310076260.6A CN115797796B (en) 2023-02-08 2023-02-08 Target level change detection method based on dense matching of optical image and SAR image

Publications (2)

Publication Number Publication Date
CN115797796A true CN115797796A (en) 2023-03-14
CN115797796B CN115797796B (en) 2023-05-02

Family

ID=85430292

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310076260.6A Active CN115797796B (en) 2023-02-08 2023-02-08 Target level change detection method based on dense matching of optical image and SAR image

Country Status (1)

Country Link
CN (1) CN115797796B (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170019653A1 (en) * 2014-04-08 2017-01-19 Sun Yat-Sen University Non-feature extraction-based dense sfm three-dimensional reconstruction method
CN109300115A (en) * 2018-09-03 2019-02-01 河海大学 A kind of multispectral high-resolution remote sensing image change detecting method of object-oriented
CN111062972A (en) * 2019-12-25 2020-04-24 泉州装备制造研究所 Image tracking method based on image frequency domain conversion
CN114299397A (en) * 2021-12-30 2022-04-08 中国科学院空天信息创新研究院 Multi-temporal SAR image change detection method

Also Published As

Publication number Publication date
CN115797796B (en) 2023-05-02

Similar Documents

Publication Publication Date Title
CN110443836B (en) Point cloud data automatic registration method and device based on plane features
CN111882612B (en) Vehicle multi-scale positioning method based on three-dimensional laser detection lane line
CN103345757B (en) Optics under multilevel multi-feature constraint and SAR image autoegistration method
CN103679714B (en) A kind of optics and SAR automatic image registration method based on gradient cross-correlation
CN104091369B (en) Unmanned aerial vehicle remote-sensing image building three-dimensional damage detection method
CN109409292A (en) The heterologous image matching method extracted based on fining characteristic optimization
CN104318548A (en) Rapid image registration implementation method based on space sparsity and SIFT feature extraction
CN103886611A (en) Image matching method suitable for automatically detecting flight quality of aerial photography
CN106097430B (en) A kind of laser stripe center line extraction method of more gaussian signal fittings
CN116279592A (en) Method for dividing travelable area of unmanned logistics vehicle
Zhang et al. Multiple Saliency Features Based Automatic Road Extraction from High‐Resolution Multispectral Satellite Images
CN115797796B (en) Target level change detection method based on dense matching of optical image and SAR image
CN111368716B (en) Geological disaster damage cultivated land extraction method based on multi-source space-time data
CN117036600A (en) Human body modeling system and method based on integration of ToF camera and millimeter wave radar
CN109872353B (en) White light data and CT data registration method based on improved iterative closest point algorithm
CN116778266A (en) Multi-scale neighborhood diffusion remote sensing point cloud projection image processing method
CN115082561B (en) Calibration method, device, equipment and medium for roadside sensor
CN106355576A (en) SAR image registration method based on MRF image segmentation algorithm
CN116468760A (en) Multi-source remote sensing image registration method based on anisotropic diffusion description
CN113379710B (en) Underwater target sonar accurate measurement system and method
CN115588033A (en) Synthetic aperture radar and optical image registration system and method based on structure extraction
CN115100446A (en) Similarity measurement method for matching SAR and visible light remote sensing image
CN113138375B (en) Combined calibration method
CN113343747B (en) Multi-mode image robust matching VNS method
CN116385502B (en) Image registration method based on region search under geometric constraint

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant