CN115797796B - Target level change detection method based on dense matching of optical image and SAR image - Google Patents


Info

Publication number
CN115797796B
CN115797796B · CN202310076260.6A
Authority
CN
China
Prior art keywords
target
pixel
feature
optical image
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202310076260.6A
Other languages
Chinese (zh)
Other versions
CN115797796A (en)
Inventor
胡玉新 (Hu Yuxin)
向俞明 (Xiang Yuming)
王峰 (Wang Feng)
Current Assignee
Aerospace Information Research Institute of CAS
Original Assignee
Aerospace Information Research Institute of CAS
Priority date
Filing date
Publication date
Application filed by Aerospace Information Research Institute of CAS filed Critical Aerospace Information Research Institute of CAS
Priority to CN202310076260.6A priority Critical patent/CN115797796B/en
Publication of CN115797796A publication Critical patent/CN115797796A/en
Application granted granted Critical
Publication of CN115797796B publication Critical patent/CN115797796B/en

Landscapes

  • Radar Systems Or Details Thereof (AREA)

Abstract

The invention provides a target level change detection method based on dense matching of an optical image and a SAR image, which comprises the following steps: converting the optical image and the SAR image into a common feature space, and constructing a first pixel-by-pixel high-dimensional feature corresponding to the optical image and a second pixel-by-pixel high-dimensional feature corresponding to the SAR image; performing common-feature-dimension dense matching on the optical image and the SAR image based on the first and second pixel-by-pixel high-dimensional features, aligning the two images pixel by pixel, and obtaining a point-by-point difference intensity map between them; extracting a first target skeleton feature of the optical image and a second target skeleton feature of the SAR image; determining a target skeleton difference intensity map between the optical image and the SAR image according to the first and second target skeleton features; and extracting target level change information between the optical image and the SAR image according to the point-by-point difference intensity map and the target skeleton difference intensity map. The method does not depend on geometric correction and registration processing, and has high processing efficiency.

Description

Target level change detection method based on dense matching of optical image and SAR image
Technical Field
The invention relates to the technical field of heterogeneous remote sensing image processing, in particular to a target level change detection method based on dense matching of an optical image and an SAR image.
Background
Optical images and synthetic aperture radar (SAR) images are the two most common types of sensor imagery. Optical imaging is a passive remote sensing modality: the target reflects sunlight to the sensor, so the image faithfully records the color and brightness of ground objects, but the sensor cannot work at night or under cloud and snow conditions. SAR is an active microwave sensor: its images reflect the structural and electromagnetic scattering characteristics of ground targets and offer all-weather, day-and-night observation, but are seriously affected by speckle noise. Jointly exploiting the advantages of optical and SAR images therefore has important application requirements and value; for example, a high-quality pre-disaster optical image can be compared with a SAR image acquired in emergency after the disaster to determine the exact location and severity of the damage, enabling accurate and rapid rescue guidance.
Existing change detection methods require high-precision geometric correction and registration of the remote sensing images; their processing efficiency is low and their workflow is complex, so they cannot meet the change detection requirements of emergency situations. In particular, for a multi-source satellite constellation system, performing high-precision geometric correction scene by scene cannot satisfy the demand for high-frequency information acquisition.
In addition, because the imaging mechanisms of optical and SAR images are completely different, they reflect different characteristics of ground targets; moreover, limited by the accuracy of the satellite platform and payload, remote sensing images still carry a non-negligible positioning error. An automatic and robust method is therefore needed to handle both the matching and the change detection of optical and SAR images.
Disclosure of Invention
The invention provides a target level change detection method based on dense matching of an optical image and an SAR image, which is used for at least partially solving the technical problems.
Based on the above, the invention provides a target level change detection method based on dense matching of an optical image and a SAR image, which comprises the following steps: converting the optical image and the SAR image into a common feature space, and constructing a first pixel-by-pixel high-dimensional feature corresponding to the optical image and a second pixel-by-pixel high-dimensional feature corresponding to the SAR image; performing common-feature-dimension dense matching on the optical image and the SAR image based on the first and second pixel-by-pixel high-dimensional features, aligning the two images pixel by pixel, and obtaining a point-by-point difference intensity map between them; extracting a first target skeleton feature of the optical image and a second target skeleton feature of the SAR image; determining a target skeleton difference intensity map between the optical image and the SAR image according to the first and second target skeleton features; and extracting target level change information between the optical image and the SAR image according to the point-by-point difference intensity map and the target skeleton difference intensity map.
According to an embodiment of the present invention, converting the optical image and the SAR image into a common feature space and constructing the first pixel-by-pixel high-dimensional feature corresponding to the optical image and the second pixel-by-pixel high-dimensional feature corresponding to the SAR image includes: constructing a first dense high-dimensional feature of the optical image based on the multi-scale characteristics of the monogenic model; constructing a second dense high-dimensional feature of the SAR image from dense ratio signal features; enforcing consistency of the scale parameters of the first and second dense high-dimensional features, so that the SAR image and the optical image have pixel-by-pixel feature commonality; and performing a three-dimensional dilated (atrous) convolution on the first and second dense high-dimensional features, respectively, to obtain the first and second pixel-by-pixel high-dimensional features.
According to an embodiment of the present invention, performing the common-feature-dimension dense matching of the optical image and the SAR image based on the first and second pixel-by-pixel high-dimensional features comprises: constructing a feature energy term from the first and second pixel-by-pixel high-dimensional features, and introducing a spatial smoothing term and a noise filtering term to construct the global objective function of the feature-dimension dense matching; performing multi-level propagation optimization on the global objective function, aligning the optical image and the SAR image pixel by pixel, where the multi-level propagation optimization comprises establishing a multi-level feature space for the common features of the optical image and the SAR image and transferring the dense matching result layer by layer from the coarse level to the fine level; after the transfer, the final dense matching result is obtained at the original-resolution level; and taking the optimized target energy as the point-by-point difference intensity map.
According to an embodiment of the invention, the global objective function is:
$$E(u,v)=\sum_{(x,y)}\Big[\rho_d\big(DenseF_{opt}(x,y)-DenseF_{sar}(x+u,y+v)\big)+\rho_s(\nabla u,\nabla v)+\gamma\sum_{(x',y')\in N_{x,y}}w_{(x,y)}\big(|u_{x,y}-u_{x',y'}|+|v_{x,y}-v_{x',y'}|\big)\Big]$$
where (x, y) are the pixel coordinates in the optical image or SAR image; E(u, v) is the target energy model of the common-feature-dimension dense matching; u and v are the horizontal and vertical coordinate offsets of the target energy; DenseF_opt(x, y) is the high-dimensional feature corresponding to pixel (x, y) in the optical image and DenseF_sar(x, y) the high-dimensional feature corresponding to pixel (x, y) in the SAR image; (DenseF_opt(x, y) − DenseF_sar(x+u, y+v)) is the feature energy term; ρ_d and ρ_s are the penalty functions of the feature energy term and the spatial smoothing term; ∇u and ∇v form the spatial smoothing term; w_(x,y)(|u_{x,y} − u_{x',y'}| + |v_{x,y} − v_{x',y'}|) is the noise filtering term; N_{x,y} is the computation neighborhood of the noise term and (x', y') are the pixel coordinates within N_{x,y}; u_{x,y} and v_{x,y} are the horizontal and vertical offsets of the energy of pixel (x, y), and u_{x',y'} and v_{x',y'} those of pixel (x', y'); w_(x,y) is the weight of pixel (x, y); and γ is a regularization parameter.
According to an embodiment of the present invention, extracting the first target skeleton feature of the optical image includes: adopting the cyclic energy estimation of a variational filter as the criterion of adaptive scale analysis, and performing adaptive-scale saliency estimation on the optical image to obtain the first salient target region; and extracting the target centerline feature of the first salient target region, and acquiring the skeleton feature of the first salient target region as the first target skeleton feature.
According to an embodiment of the invention, the adaptive-scale saliency estimate is calculated as follows:
$$Sig=\sum_{i=1}^{IN}\left|T_i-T_{i-1}\right|,\qquad T_i=TV\!\left(T_{i-1}\right),\quad T_0=I$$
where I is the original optical image, T_i is the result of the i-th cycle of variational filtering, IN is the number of cycles, and TV is the variational filter.
According to an embodiment of the present invention, extracting a second target skeleton feature of a SAR image includes: acquiring a second significant target region in the SAR image; extracting the response of the annular ratio operator of the second significant target region; smoothing the response of the annular ratio operator; based on the geometric constraint of the target structure, calculating the geometric distance transformation of the target according to the response of the smoothed annular ratio operator; and performing non-maximum suppression on the geometric distance transformation of the target to obtain a second target skeleton characteristic of the target.
According to an embodiment of the invention, the annular ratio operator response CR is computed according to
Figure SMS_3
where f_n, h_n and v_n are the components of the second dense high-dimensional feature at the current scale, corresponding to the dense-ratio filters in three different directions.
According to an embodiment of the present invention, determining the target skeleton difference intensity map between the optical image and the SAR image from the first and second target skeleton features comprises: calculating the similarity of the target skeleton features between the optical image and the SAR image according to the first and second target skeleton features; and determining the target skeleton difference intensity map by taking the similarity as the change intensity of the first salient target region of the optical image and the second salient target region of the SAR image; wherein the similarity is calculated according to
$$NCC\big(O_i(x,y),S_i(x,y)\big)=\frac{\sum_{(x,y)}\big(O_i(x,y)-u_O\big)\big(S_i(x,y)-u_S\big)}{\sqrt{\sum_{(x,y)}\big(O_i(x,y)-u_O\big)^2\sum_{(x,y)}\big(S_i(x,y)-u_S\big)^2}}$$
where NCC(O_i(x, y), S_i(x, y)) is the similarity between the first target skeleton feature and the second target skeleton feature corresponding to pixel (x, y); O_i(x, y) is the first target skeleton feature and S_i(x, y) the second target skeleton feature corresponding to pixel (x, y); and u_O and u_S are the feature means of the first salient target region and the second salient target region, respectively.
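The skeleton similarity above is an ordinary zero-mean normalized cross-correlation over a region; a minimal sketch (the function name and patch handling are illustrative, not from the patent):

```python
import numpy as np

def ncc(o, s):
    """Zero-mean normalized cross-correlation of two equally sized feature
    patches: 1 for identical structure, -1 for inverted structure."""
    o = np.asarray(o, dtype=float)
    s = np.asarray(s, dtype=float)
    o = o - o.mean()          # subtract region mean u_O
    s = s - s.mean()          # subtract region mean u_S
    denom = np.sqrt((o * o).sum() * (s * s).sum())
    return float((o * s).sum() / denom) if denom > 0 else 0.0
```

A low NCC between the matched skeleton regions then signals a changed target, which is exactly how the similarity is turned into a change intensity above.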
According to an embodiment of the present invention, extracting target level change information between an optical image and an SAR image from a point-by-point difference intensity map and a target skeleton difference intensity map includes: and performing hysteresis threshold segmentation on the point-by-point difference intensity map and the target skeleton difference intensity map to extract target level change information.
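Hysteresis threshold segmentation keeps weak evidence only when it is connected to strong evidence, as in Canny edge linking; a sketch under the assumptions of a boolean output mask and 4-connectivity (both illustrative):

```python
import numpy as np
from collections import deque

def hysteresis(intensity, low, high):
    """Keep pixels >= low only if they are 4-connected (through other
    weak pixels) to a seed pixel >= high."""
    strong = intensity >= high
    weak = intensity >= low
    keep = np.zeros(intensity.shape, dtype=bool)
    q = deque(zip(*np.nonzero(strong)))
    for y, x in q:                       # seeds are always kept
        keep[y, x] = True
    while q:                             # flood-fill through weak pixels
        y, x = q.popleft()
        for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ny, nx = y + dy, x + dx
            if (0 <= ny < keep.shape[0] and 0 <= nx < keep.shape[1]
                    and weak[ny, nx] and not keep[ny, nx]):
                keep[ny, nx] = True
                q.append((ny, nx))
    return keep
```

Applied to the point-by-point and skeleton difference intensity maps, this suppresses isolated weak differences while keeping weak pixels that extend a strongly changed target.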
The target level change detection method based on dense matching of the optical image and the SAR image provided by the embodiment of the invention at least comprises the following beneficial effects:
the method converts the multi-time heterogeneous optical image and SAR image into the common feature space with high similarity, avoids the positioning deviation and the radiation geometric inconsistency caused by completely different imaging mechanisms of the optical image and the SAR image, and avoids the initial positioning error of the image. And the multi-temporal optics and SAR images are used as change detection objects, so that the problem that the optical images cannot be acquired due to the influence of cloud and rain weather after disaster is solved.
The method performs dense matching of the common features in the common feature space, applies multi-level propagation optimization to the global objective function during matching to obtain the point-by-point difference intensity map, obtains the skeleton difference intensity map based on saliency detection, and finally combines the target skeleton difference intensity map with the point-by-point difference intensity map to obtain the target level change information through threshold calculation.
Drawings
The above and other objects, features and advantages of the present invention will become more apparent from the following description of embodiments of the present invention with reference to the accompanying drawings, in which:
fig. 1 schematically shows a flowchart of a target level change detection method based on dense matching of an optical image and an SAR image according to an embodiment of the present invention.
Detailed Description
The present invention will be further described in detail below with reference to specific embodiments and with reference to the accompanying drawings, in order to make the objects, technical solutions and advantages of the present invention more apparent. It will be apparent that the described embodiments are some, but not all, embodiments of the invention. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. The terms "comprises," "comprising," and/or the like, as used herein, specify the presence of stated features, steps, operations, and/or components, but do not preclude the presence or addition of one or more other features, steps, operations, or components.
In the present invention, unless explicitly specified and limited otherwise, the terms "mounted," "connected," "secured," and the like are to be construed broadly and include, for example, either permanently connected, removably connected, or integrally formed therewith; may be mechanically connected, may be electrically connected or may communicate with each other; can be directly connected or indirectly connected through an intermediate medium, and can be communicated with the inside of two elements or the interaction relationship of the two elements. The specific meaning of the above terms in the present invention can be understood by those of ordinary skill in the art according to the specific circumstances.
In the description of the present invention, it should be understood that the terms "longitudinal," "length," "circumferential," "front," "rear," "left," "right," "top," "bottom," "inner," "outer," and the like indicate an orientation or a positional relationship based on that shown in the drawings, merely for convenience in describing the present invention and simplifying the description, and do not indicate or imply that the subsystem or element in question must have a specific orientation, be configured and operated in a specific orientation, and thus should not be construed as limiting the present invention.
Like elements are denoted by like or similar reference numerals throughout the drawings. Conventional structures or constructions will be omitted when they may cause confusion in the understanding of the invention. And the shape, size and position relation of each component in the figure do not reflect the actual size, proportion and actual position relation. In addition, in the claims, any reference signs placed between parentheses shall not be construed as limiting the claim.
Similarly, in the foregoing description of exemplary embodiments of the invention, various features of the invention are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the disclosure and aiding in the understanding of one or more of the various disclosed aspects. The description of the terms "one embodiment," "some embodiments," "example," "specific examples," or "some examples," etc., means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the invention. In this specification, schematic representations of the above terms do not necessarily refer to the same embodiments or examples. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
Furthermore, the terms "first," "second," and the like, are used for descriptive purposes only and are not to be construed as indicating or implying a relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defining "a first" or "a second" may explicitly or implicitly include one or more such feature. In the description of the present invention, the meaning of "plurality" means at least two, for example, two, three, etc., unless specifically defined otherwise.
Fig. 1 schematically shows a flowchart of a target level change detection method based on dense matching of an optical image and an SAR image according to an embodiment of the present invention.
As shown in fig. 1, the target level change detection method based on dense matching of the optical image and the SAR image may include operations S101 to S105.
In operation S101, the optical image and the SAR image are converted into a common feature space, and a first pixel-by-pixel high-dimensional feature corresponding to the optical image and a second pixel-by-pixel high-dimensional feature of the SAR image are constructed.
In the embodiment of the present disclosure, operation S101 may include, for example: constructing a first dense high-dimensional feature of the optical image based on the multi-scale characteristics of the monogenic model; constructing a second dense high-dimensional feature of the SAR image from dense ratio signal features; enforcing consistency of the scale parameters of the first and second dense high-dimensional features, so that the SAR image and the optical image have pixel-by-pixel feature commonality; and performing a three-dimensional dilated (atrous) convolution on the first and second dense high-dimensional features, respectively, to obtain the first and second pixel-by-pixel high-dimensional features.
Illustratively, considering the radiometric and geometric differences between optical and SAR images, the commonality feature construction method is designed to extract their intrinsic common characteristics while suppressing the obvious differences. Assume the optical image has size W×H.
The two-dimensional monogenic model consists of the original two-dimensional signal and its Riesz transform:
$$f_M(x,y)=\big[f(x,y),\ R_1\{f\}(x,y),\ R_2\{f\}(x,y)\big]^T,\qquad H_k(\omega)=\frac{j\,\omega_k}{\|\omega\|},\ k=1,2$$
where j is the imaginary unit, T denotes the transpose, and H_k are the frequency responses of the Riesz transform. The two-dimensional monogenic signal is then convolved with multi-scale orthogonal filters to obtain the local phase responses in different directions:
Figure SMS_6
the dense high-dimensional characteristics of the optical image can be obtained as follows:
Figure SMS_7
where DenseF_opt is the first dense high-dimensional feature of the optical image; F_n, H_n and V_n are the monogenic features in three different directions at the current scale; real and imag take the real part and the imaginary part of a complex number; wl_n is the wavelength parameter of the log filter at the current scale; I is the original image; fft and ifft are the two-dimensional Fourier transform and its inverse; MF is the monogenic signal filter; · is the element-wise (matrix dot) product; N is the total number of scales; n is the current scale; and ∪ denotes accumulating the multi-dimensional features over the scale dimension.
Assuming the total number of scales is N = 4, a first dense high-dimensional feature of dimension W×H×12 is obtained.
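The exact filter formulas above survive only as images in the source, but the construction can be sketched with a standard frequency-domain log-Gabor bank and Riesz transfer functions, stacking three components per scale so that N = 4 scales give the stated W×H×12 stack (all parameter values below are assumptions, not the patent's):

```python
import numpy as np

def log_gabor_bank(h, w, n_scales=4, wl0=6.0, mult=2.0, sigma_ratio=0.55):
    """Radial log-Gabor filters in the frequency domain, one per scale."""
    fy = np.fft.fftfreq(h)[:, None]
    fx = np.fft.fftfreq(w)[None, :]
    radius = np.hypot(fx, fy)
    radius[0, 0] = 1.0                       # avoid log(0) at the DC bin
    bank = []
    for n in range(n_scales):
        wl = wl0 * mult ** n                 # wavelength parameter of scale n
        g = np.exp(-(np.log(radius * wl) ** 2) / (2 * np.log(sigma_ratio) ** 2))
        g[0, 0] = 0.0                        # zero DC response
        bank.append(g)
    return bank

def dense_features_optical(img, n_scales=4):
    """Monogenic-style stack: per scale, the even (band-pass) response and
    the two odd (Riesz) responses -> H x W x (3 * n_scales)."""
    h, w = img.shape
    F = np.fft.fft2(img)
    fy = np.fft.fftfreq(h)[:, None]
    fx = np.fft.fftfreq(w)[None, :]
    mag = np.hypot(fx, fy)
    mag[0, 0] = 1.0
    H1, H2 = 1j * fx / mag, 1j * fy / mag    # Riesz transfer functions
    feats = []
    for g in log_gabor_bank(h, w, n_scales):
        feats.append(np.real(np.fft.ifft2(F * g)))        # even part
        feats.append(np.real(np.fft.ifft2(F * g * H1)))   # odd part, direction 1
        feats.append(np.real(np.fft.ifft2(F * g * H2)))   # odd part, direction 2
    return np.stack(feats, axis=-1)
```

The per-scale even/odd triplet is what the text's real/imag components of F_n, H_n, V_n correspond to in this reading.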
Next, pixel-by-pixel high-dimensional features suited to SAR characteristics are constructed from dense ratio signal features. Specifically, replacing the convolution of the image with a filter bank by an operator based on the ratio signal effectively suppresses the multiplicative speckle noise in the SAR image; the ratio signal is calculated using a two-dimensional Gabor filter as follows:
Figure SMS_8
where σ is the standard deviation of the Gaussian kernel, ω is the frequency of the sinusoidal factor, taken as 1.8/σ, and θ is the rotation angle of the filter.
At the current scale, the means of the convolution of the processing window with the original image in the horizontal (θ = 0) and vertical (θ = π) directions are calculated as follows:
Figure SMS_9
where Figure SMS_10 and Figure SMS_11 are the convolution means of the horizontal filter with the image, respectively, and Figure SMS_12 and Figure SMS_13 are the convolution means of the vertical filter with the image, respectively. Furthermore, an isotropic conical filter is designed to replace the convolution with the isotropic filter, as follows:
Figure SMS_14
by accumulating multi-directional local phase responses of different scales, a second dense high-dimensional characteristic of the SAR image can be constructedDenseF sar
Figure SMS_15
where f_n, h_n and v_n are the components of the second dense high-dimensional feature at the current scale, corresponding to the dense-ratio filters in three different directions.
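A ratio-of-local-means response is the classic speckle-robust replacement for linear edge filtering on SAR imagery; a minimal single-scale, single-direction sketch (the window size and the 1 − min-ratio form are illustrative assumptions, not the patent's exact operator):

```python
import numpy as np

def ratio_response(img, half=3, axis=0):
    """Ratio of the means of the two half-windows on either side of each
    pixel; near 0 in homogeneous speckle, large across structural edges."""
    h, w = img.shape
    pad = np.pad(img, half, mode='edge')
    out = np.zeros((h, w), dtype=float)
    for y in range(h):
        for x in range(w):
            win = pad[y:y + 2 * half + 1, x:x + 2 * half + 1]
            if axis == 0:                     # left / right half-windows
                m1, m2 = win[:, :half].mean(), win[:, half + 1:].mean()
            else:                             # top / bottom half-windows
                m1, m2 = win[:half, :].mean(), win[half + 1:, :].mean()
            m1, m2 = max(m1, 1e-9), max(m2, 1e-9)
            out[y, x] = 1.0 - min(m1 / m2, m2 / m1)
    return out
```

Because the statistic is a ratio of means rather than a difference, a uniform multiplicative speckle factor cancels out, which is the property the text relies on.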
The scale parameters of DenseF_opt and DenseF_sar are kept consistent, so that the actual convolution window sizes of the current-scale log filter and the two-dimensional Gabor filter agree, which guarantees the pixel-by-pixel feature commonality of the optical and SAR images. Finally, a three-dimensional dilated (atrous) convolution is applied to DenseF_opt and DenseF_sar to enhance the noise immunity of the pixel-by-pixel high-dimensional features and to capture the structural characteristics of multi-scale targets, yielding the first and second pixel-by-pixel high-dimensional features as follows:
Figure SMS_16
where conv3d is a three-dimensional convolution, size is the convolution kernel size (for example, 5), and dilation is the dilation factor (for example, 2).
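A naive dense implementation shows what the size-5, dilation-2 three-dimensional convolution does to an H×W×C feature stack (edge padding and a 'same'-size output are assumptions; a deep-learning framework's conv3d would normally be used instead):

```python
import numpy as np

def conv3d_dilated(feat, kernel, dilation=2):
    """Dilated 3-D convolution: kernel taps are spaced `dilation` apart,
    enlarging the receptive field without extra weights."""
    k = kernel.shape[0]                      # cubic kernel of side k
    span = (k - 1) * dilation // 2           # half-extent of the dilated kernel
    pad = np.pad(feat, span, mode='edge')
    H, W, C = feat.shape
    out = np.zeros((H, W, C), dtype=float)
    for dz in range(k):
        for dy in range(k):
            for dx in range(k):
                wgt = kernel[dz, dy, dx]
                if wgt == 0.0:
                    continue
                out += wgt * pad[dz * dilation:dz * dilation + H,
                                 dy * dilation:dy * dilation + W,
                                 dx * dilation:dx * dilation + C]
    return out
```

With size 5 and dilation 2 each output value aggregates a 9×9×9 neighborhood of the feature stack, which is how the multi-scale structural context is captured.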
In operation S102, a common feature dimension dense matching is performed on the optical image and the SAR image based on the first pixel-by-pixel high-dimensional feature and the second pixel-by-pixel high-dimensional feature, and the optical image and the SAR image are aligned pixel by pixel, so as to obtain a point-by-point difference intensity map between the optical image and the SAR image.
In an embodiment of the present invention, the process of the common-feature-dimension dense matching may be, for example: constructing a feature energy term from the first and second pixel-by-pixel high-dimensional features, and introducing a spatial smoothing term and a noise filtering term to build the global objective function of the feature-dimension dense matching; performing multi-level propagation optimization on the global objective function to align the optical image and the SAR image pixel by pixel, where the multi-level propagation optimization establishes a multi-level feature space for the common features of the two images and transfers the dense matching result layer by layer from the coarse level to the fine level; after the transfer, the final dense matching result is obtained at the original-resolution level. The optimized target energy is taken as the point-by-point difference intensity map D_pixel.
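The multi-level propagation can be sketched as a pyramid loop that upsamples the offset field between levels and doubles its magnitude (`match_at_level` stands in for one round of global-objective optimization and is purely illustrative):

```python
import numpy as np

def coarse_to_fine(levels, match_at_level):
    """levels: image pairs ordered coarsest -> finest. The offset field
    (u, v) solved at one level initializes the next finer level, with both
    its spatial size and its offset magnitude doubled."""
    uv = None
    for pair in levels:
        if uv is not None:
            # nearest-neighbor 2x upsampling of u and v, offsets rescaled
            uv = 2.0 * np.kron(uv, np.ones((1, 2, 2)))
        uv = match_at_level(pair, uv)        # refine at this level
    return uv
```

Initializing each level from the coarser solution is what lets the global objective absorb the large initial positioning error mentioned in the background while staying convex enough locally.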
Based on the consistency of the common features, the image matching problem is converted into a pixel-by-pixel offset estimation problem; under the feature-constancy assumption and the spatial smoothing assumption, the global objective function of the feature-dimension dense matching can be constructed as:
$$E(u,v)=\sum_{(x,y)}\Big[\rho_d\big(DenseF_{opt}(x,y)-DenseF_{sar}(x+u,y+v)\big)+\rho_s(\nabla u,\nabla v)+\gamma\sum_{(x',y')\in N_{x,y}}w_{(x,y)}\big(|u_{x,y}-u_{x',y'}|+|v_{x,y}-v_{x',y'}|\big)\Big]$$
where (x, y) are the pixel coordinates in the optical image or SAR image; E(u, v) is the target energy model of the common-feature-dimension dense matching; u and v are the horizontal and vertical coordinate offsets of the target energy; DenseF_opt(x, y) is the high-dimensional feature corresponding to pixel (x, y) in the optical image and DenseF_sar(x, y) the high-dimensional feature corresponding to pixel (x, y) in the SAR image; (DenseF_opt(x, y) − DenseF_sar(x+u, y+v)) is the feature energy term; ρ_d and ρ_s are the penalty functions of the feature energy term and the spatial smoothing term; ∇u and ∇v form the spatial smoothing term; w_(x,y)(|u_{x,y} − u_{x',y'}| + |v_{x,y} − v_{x',y'}|) is the noise filtering term; N_{x,y} is the computation neighborhood of the noise term and (x', y') are the pixel coordinates within N_{x,y}; u_{x,y} and v_{x,y} are the horizontal and vertical offsets of the energy of pixel (x, y), and u_{x',y'} and v_{x',y'} those of pixel (x', y'); w_(x,y) is the weight of pixel (x, y); and γ is a regularization parameter.
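With ρ_d and ρ_s taken as L1 norms (as the following paragraph suggests), the objective can be evaluated directly for a candidate offset field; this sketch uses nearest-pixel warping with border clamping, omits the noise-filtering term, and uses an illustrative smoothness weight `lam` (not the γ of the text) — all simplifications:

```python
import numpy as np

def matching_energy(f_opt, f_sar, u, v, lam=0.1):
    """L1 feature-energy term plus L1 smoothness on the offset field."""
    H, W, _ = f_opt.shape
    ys, xs = np.mgrid[0:H, 0:W]
    ty = np.clip(ys + np.rint(v).astype(int), 0, H - 1)   # y + v, clamped
    tx = np.clip(xs + np.rint(u).astype(int), 0, W - 1)   # x + u, clamped
    data = np.abs(f_opt - f_sar[ty, tx]).sum()            # rho_d term
    smooth = sum(np.abs(np.diff(a, axis=ax)).sum()        # rho_s(grad u, grad v)
                 for a in (u, v) for ax in (0, 1))
    return data + lam * smooth
```

An offset field that actually aligns the common features drives the data term toward zero, which is why the optimized residual energy can double as the point-by-point difference intensity map.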
The penalty functions of the feature energy term and the spatial smoothing term may employ the L1 norm. Considering the speckle noise commonly present in SAR images, a noise filtering term is introduced into the objective function to eliminate the noise, implemented as a weighted median filter in L1-norm form. Since pixels in homogeneous regions should be weighted more heavily, while pixels in heterogeneous regions can rely on the consistency of the common features, the coefficient of variation of the current pixel is taken as the weight w_(x,y), as follows:
Figure SMS_18
where var_(x,y) is the variance of the noise-term computation neighborhood and m_(x,y) is the mean of that neighborhood.
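The L1-form weighted median has a simple closed procedure: sort the neighborhood values and take the first one at which the cumulative weight reaches half the total. A pure-Python sketch (neighborhood gathering omitted):

```python
def weighted_median(values, weights):
    """Minimizer of sum_i w_i * |x - v_i| over x in `values` — the
    weighted median used by the L1-form noise-filtering term."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    total = float(sum(weights))
    acc = 0.0
    for i in order:                 # walk values in ascending order
        acc += weights[i]
        if acc >= total / 2.0:      # half the weight mass reached
            return values[i]
    return values[order[-1]]
```

Down-weighting noisy neighbors this way lets an outlier offset be overruled by its neighborhood without blurring genuine discontinuities, which a plain mean filter would do.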
In operation S103, a first target skeleton feature of the optical image and a second target skeleton feature of the SAR image are extracted.
In an embodiment of the present invention, extracting the first target skeleton feature of the optical image includes: adopting the cyclic energy estimation of the variational filter as the criterion of adaptive scale analysis, and performing adaptive-scale saliency estimation on the optical image to obtain the first salient target region; and extracting the target centerline feature of the first salient target region, and acquiring the skeleton feature of the first salient target region as the first target skeleton feature.
For example, for the purpose of target level change analysis, adaptive scale saliency estimation of an optical image is required in order to be applicable to multi-scale targets widely existing in remote sensing images; the cyclic energy estimation of the variational filter is adopted as a criterion of the self-adaptive scale analysis, and the energy of the homogeneous region is smaller and corresponds to a large-scale target; and the energy change of the heterogeneous region is obvious and corresponds to a small-scale target; the significance estimate for the adaptive scale is calculated as follows:
Sig = Σ_{i=1}^{IN} |T_i − T_{i−1}|,  with T_0 = I and T_i = TV(T_{i−1})
wherein I is the original optical image, T_i is the variational filtering result of the i-th cycle, IN is the number of cycles, and TV is the variational filter, whose kernel function is:
[Equation image: kernel function of the variational filter (not reproduced)]
By accumulating the filter-result changes over all cycles, the optical image saliency intensity map Sig can be obtained. The salient region is then obtained by threshold segmentation, and the optical dense high-dimensional feature DenseF_opt of the salient region is aggregated along the feature dimension to obtain the structural feature SF_opt of the salient region:
[Equation image: feature-dimension aggregation of DenseF_opt into SF_opt (not reproduced)]
Skeleton center-line estimation based on morphological processing and the Hough transform is then performed on SF_opt, and the position information of the skeleton center line is extracted to obtain the first target skeleton feature of the salient target region of the optical image.
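As a minimal sketch of the cyclic saliency accumulation described above, the following Python code accumulates the per-cycle change of a variational filter; the explicit total-variation (curvature) flow step used as TV, the step size tau and the cycle count IN are illustrative assumptions, since the actual kernel is not reproduced here:

```python
import numpy as np

def tv_step(u, tau=0.125, eps=1e-8):
    """One explicit total-variation (curvature) flow step: a simple stand-in
    for the variational filter TV whose exact kernel is not given here."""
    gx = np.roll(u, -1, axis=1) - u                    # forward differences
    gy = np.roll(u, -1, axis=0) - u
    mag = np.sqrt(gx ** 2 + gy ** 2 + eps)
    px, py = gx / mag, gy / mag                        # normalized gradient field
    div = (px - np.roll(px, 1, axis=1)) + (py - np.roll(py, 1, axis=0))
    return u + tau * div

def adaptive_saliency(image, IN=10):
    """Accumulate |T_i - T_(i-1)| over IN filtering cycles, starting from
    T_0 = image: homogeneous (large-scale) regions barely change, while
    heterogeneous (small-scale) structures accumulate large energy."""
    T = image.astype(np.float64)
    sig = np.zeros_like(T)
    for _ in range(IN):
        T_next = tv_step(T)
        sig += np.abs(T_next - T)                      # per-cycle filter change
        T = T_next
    return sig
```

Thresholding the resulting Sig map then yields the salient regions.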
In an embodiment of the present invention, extracting the second target skeleton feature of the SAR image includes: acquiring a second salient target region in the SAR image; extracting the annular ratio operator response of the second salient target region; smoothing the annular ratio operator response; calculating the geometric distance transform of the target from the smoothed annular ratio operator response under the geometric constraints of the target structure; and performing non-maximum suppression on the geometric distance transform of the target to obtain the second target skeleton feature of the target.
Illustratively, the structural position information of the strong scattering points of the target region in the SAR image is extracted based on the annular ratio operator, which effectively suppresses false alarms caused by high-frequency noise components. The annular ratio operator response CR can be computed according to
[Equation image: annular ratio operator response CR (not reproduced)]
The annular ratio operator response CR is then smoothed to enhance the geometric characteristics of the SAR image; the structural feature of the target is calculated by a distance transform method under the geometric constraints of the target structure, and the second target skeleton feature of the SAR image is obtained after non-maximum suppression of the distance transform result.
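The smoothing / distance-transform / non-maximum-suppression chain can be sketched as follows; the Gaussian smoothing, the fixed threshold and the plain Euclidean distance transform are illustrative stand-ins for the structure-constrained operations of the method, with the annular ratio operator response CR taken as a given input:

```python
import numpy as np
from scipy.ndimage import gaussian_filter, distance_transform_edt, maximum_filter

def skeleton_from_response(cr, thr=0.5, sigma=1.0):
    """Smooth the annular ratio operator response, threshold it into a target
    mask, compute the distance to the background, and keep ridge pixels of the
    distance map (non-maximum suppression) as skeleton pixels."""
    smooth = gaussian_filter(cr.astype(np.float64), sigma)   # enhance geometric features
    mask = smooth > thr                                      # candidate target region
    dist = distance_transform_edt(mask)                      # geometric distance transform
    return (dist == maximum_filter(dist, size=3)) & mask     # distance ridge = skeleton
```

For an elongated target region this keeps the center line of the response and discards its flanks.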
In operation S104, a target skeleton difference intensity map between the optical image and the SAR image is determined from the first target skeleton feature and the second target skeleton feature.
In the embodiment of the invention, the target skeleton difference intensity map may be determined as follows: calculating the target skeleton feature similarity between the optical image and the SAR image according to the first target skeleton feature and the second target skeleton feature, and determining the target skeleton difference intensity map by taking the similarity as the change intensity of the first salient target region of the optical image and the second salient target region of the SAR image.
Illustratively, in the similarity measurement strategy for optical and SAR target skeleton features, each optical salient target locates its SAR region according to the dense matching result, and a local neighborhood is searched for the maximum correlation value to suppress the structural position deviation introduced by the difference in imaging models. The similarity (maximum correlation value) is calculated as follows:
NCC(O_i(x, y), S_i(x, y)) = Σ_{(x,y)} (O_i(x, y) − u_O)(S_i(x, y) − u_S) / √( Σ_{(x,y)} (O_i(x, y) − u_O)² · Σ_{(x,y)} (S_i(x, y) − u_S)² )
wherein NCC(O_i(x, y), S_i(x, y)) is the similarity between the first target skeleton feature and the second target skeleton feature corresponding to pixel (x, y), O_i(x, y) is the first target skeleton feature corresponding to pixel (x, y), S_i(x, y) is the second target skeleton feature corresponding to pixel (x, y), and u_O and u_S are the feature means of the first salient target region and the second salient target region, respectively.
The similarity measure is taken as the change intensity of the target region, namely the target skeleton difference intensity map D_target.
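A sketch of the similarity search: for each optical salient target, the normalized cross-correlation is evaluated over a small neighborhood around the dense-matching location in the SAR skeleton map and the maximum is kept; the search radius is an assumed parameter:

```python
import numpy as np

def ncc(a, b, eps=1e-12):
    """Normalized cross-correlation between two skeleton-feature patches."""
    a = a - a.mean()
    b = b - b.mean()
    return float((a * b).sum() / (np.sqrt((a ** 2).sum() * (b ** 2).sum()) + eps))

def max_local_ncc(opt_patch, sar_img, center, radius=2):
    """Search a (2*radius+1)^2 neighborhood around the dense-matching location
    `center` in the SAR skeleton map and keep the maximum NCC, suppressing the
    small structural offset left by the imaging-model difference."""
    h, w = opt_patch.shape
    best = -1.0
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            y, x = center[0] + dy, center[1] + dx
            patch = sar_img[y:y + h, x:x + w]
            if patch.shape == opt_patch.shape:       # skip out-of-bounds windows
                best = max(best, ncc(opt_patch, patch))
    return best
```

When the SAR skeleton contains a slightly shifted copy of the optical skeleton, the neighborhood search recovers a correlation close to 1.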
In operation S105, target level change information between the optical image and the SAR image is extracted from the point-by-point difference intensity map and the target skeleton difference intensity map.
In the embodiment of the invention, hysteresis threshold segmentation is performed on the point-by-point difference intensity map and the target skeleton difference intensity map to extract the target level change information.
Illustratively, the point-by-point difference intensity map D_pixel and the target skeleton difference intensity map D_target are taken as the change analysis criteria, and joint threshold segmentation is performed according to a hysteresis threshold method, as follows:
changed(x, y) = 1, if D_pixel(x, y) > t_global and D_target(x, y) < t_local;  changed(x, y) = 0, otherwise
wherein t_global is the global change threshold: the region where D_pixel is greater than t_global is the possible change region; within the possible change region, pixels whose target difference intensity D_target is smaller than the local threshold t_local form the final changed target area, which gives the final change analysis result.
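The joint hysteresis segmentation can be sketched as follows; the comparison directions follow the description above (a point-wise difference above the global threshold marks a possible change, and a target skeleton difference intensity, which stores a similarity, below the local threshold confirms it):

```python
import numpy as np

def joint_hysteresis(d_pixel, d_target, t_global, t_local):
    """Candidate pixels exceed the global point-wise difference threshold;
    a candidate is confirmed when the target skeleton difference intensity
    (a similarity, so low values indicate change) falls below t_local."""
    candidate = d_pixel > t_global        # possible change region
    return candidate & (d_target < t_local)
```

The returned boolean map is the target level change mask.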
It should be understood that many different methods exist for change detection with remote sensing images; any method that performs change detection using an optical and SAR image dense matching algorithm and takes target skeleton structural features as the change detection reference falls within the protection scope of the present disclosure.
In summary, the target level change detection method provided by the invention uses multi-temporal optical and SAR images as the change detection objects, which overcomes the problem that optical images cannot be acquired after a disaster because of cloudy and rainy weather. The method integrates dense matching of heterogeneous images with change detection, does not depend on high-precision geometric correction of the images, and thus lends itself to practical engineering applications. It obtains target level change information quickly and is particularly suitable for high-revisit multi-source satellite constellation systems.
The foregoing describes specific embodiments of the present invention; it should be understood that the description is merely illustrative and does not limit the scope of the invention, and that any modifications, equivalent substitutions or improvements made within the spirit and principles of the invention fall within its protection scope.

Claims (8)

1. The target level change detection method based on dense matching of the optical image and the SAR image is characterized by comprising the following steps of:
converting an optical image and an SAR image into a common feature space, and constructing a first pixel-by-pixel high-dimensional feature corresponding to the optical image and a second pixel-by-pixel high-dimensional feature of the SAR image, wherein the method comprises the following steps: constructing a first dense high-dimensional feature of the optical image based on the multi-scale characteristics of the single-phase model; constructing a second dense high-dimensional characteristic of the SAR image according to the dense ratio signal characteristic; calculating the consistency of scale parameters of the first dense high-dimensional feature and the second dense high-dimensional feature so that the SAR image and the optical image share pixel-by-pixel common features; performing three-dimensional cavity convolution operation on the first dense high-dimensional feature and the second dense high-dimensional feature to obtain the first pixel-by-pixel high-dimensional feature and the second pixel-by-pixel high-dimensional feature;
performing common feature dimension dense matching on the optical image and the SAR image based on the first pixel-by-pixel high-dimensional feature and the second pixel-by-pixel high-dimensional feature, aligning the optical image and the SAR image pixel by pixel, and obtaining a point-by-point difference intensity map between the optical image and the SAR image, wherein the method comprises the following steps: constructing a feature energy term according to the first pixel-by-pixel high-dimensional feature and the second pixel-by-pixel high-dimensional feature, and introducing a space smoothing term and a noise filtering term to construct a global objective function with densely matched feature dimensions; performing multi-level propagation optimization on the global objective function, and aligning the optical image and the SAR image pixel by pixel; the multi-level propagation optimization comprises the steps of establishing a multi-level feature space for the common features of the optical image and the SAR image, and carrying out dense matching result transfer from a coarse level to a fine level layer by layer in the multi-level space; after the result of dense matching is transferred, a final dense matching result is obtained at the level of the original resolution; acquiring optimized target energy as the point-by-point difference intensity map;
extracting a first target skeleton feature of the optical image and a second target skeleton feature of the SAR image;
determining a target skeleton difference intensity map between the optical image and the SAR image according to the first target skeleton feature and the second target skeleton feature;
and extracting target level change information between the optical image and the SAR image according to the point-by-point difference intensity map and the target skeleton difference intensity map.
2. The target level change detection method according to claim 1, wherein the global objective function is:
E(u, v) = Σ_{(x,y)} [ ρ_d( DenseF_opt(x, y) − DenseF_sar(x + u, y + v) ) + ρ_s( ∇u, ∇v ) + γ · Σ_{(x′,y′)∈N_{x,y}} w(x, y) ( |u_{x,y} − u_{x′,y′}| + |v_{x,y} − v_{x′,y′}| ) ]
wherein (x, y) are the pixel coordinates in the optical image or the SAR image, E(u, v) is the target energy model of the common-feature-dimension dense matching, u and v respectively denote the horizontal and vertical coordinate offsets of the target energy, DenseF_opt(x, y) denotes the high-dimensional feature corresponding to pixel (x, y) in the optical image, DenseF_sar(x, y) denotes the high-dimensional feature corresponding to pixel (x, y) in the SAR image, (DenseF_opt(x, y) − DenseF_sar(x + u, y + v)) denotes the feature energy term, ρ_d and ρ_s respectively denote the penalty functions of the feature energy term and of the spatial smoothing term in the target energy model, ∇u and ∇v denote the spatial smoothing term, w(x, y)(|u_{x,y} − u_{x′,y′}| + |v_{x,y} − v_{x′,y′}|) denotes the noise filtering term, N_{x,y} denotes the noise-term calculation neighborhood, (x′, y′) denotes the pixel coordinates within N_{x,y}, u_{x,y} denotes the horizontal coordinate offset of the energy of pixel (x, y), u_{x′,y′} denotes the horizontal coordinate offset of the energy of pixel (x′, y′), v_{x,y} denotes the vertical coordinate offset of the energy of pixel (x, y), v_{x′,y′} denotes the vertical coordinate offset of the energy of pixel (x′, y′), w(x, y) is the weight of pixel (x, y), and γ is a regularization parameter.
3. The method of claim 1, wherein extracting the first target skeleton feature of the optical image comprises:
the cyclic energy estimation of the variational filter is adopted as a criterion of adaptive scale analysis, and the saliency estimation of the adaptive scale is carried out on the optical image so as to obtain a first salient target area;
and extracting the target central line characteristic of the first remarkable target area, and acquiring the skeleton characteristic of the first remarkable target area as the first target skeleton characteristic.
4. A method of target level change detection according to claim 3, wherein the saliency estimation of the adaptive scale is calculated by:
Sig = Σ_{i=1}^{IN} |T_i − T_{i−1}|,  with T_0 = I and T_i = TV(T_{i−1})
wherein I is the original optical image, T_i is the variational filtering result of the i-th cycle, IN is the number of cycles, and TV is the variational filter.
5. The target level change detection method according to claim 1, wherein extracting the second target skeleton feature of the SAR image comprises:
acquiring a second significant target region in the SAR image;
extracting the response of the annular ratio operator of the second significant target region;
smoothing the response of the annular ratio operator;
based on the geometric constraint of the target structure, calculating the geometric distance transformation of the target according to the response of the smoothed annular ratio operator;
and performing non-maximum suppression on the geometric distance transformation of the target to obtain the second target skeleton characteristic of the target.
6. The target level change detection method according to claim 5, wherein the annular ratio operator response CR is calculated according to
[Equation image: annular ratio operator response CR in terms of f_n, h_n and v_n (not reproduced)]
wherein f_n, h_n and v_n are respectively the components of the second dense high-dimensional feature at the current scale corresponding to the dense ratio filters in three different directions.
7. The method of claim 1, wherein determining a target skeleton difference intensity map between the optical image and the SAR image from the first target skeleton feature and the second target skeleton feature comprises:
calculating target skeleton feature similarity between the optical image and the SAR image according to the first target skeleton feature and the second target skeleton feature;
the similarity is used as the change intensity of a first significant target area of the optical image and a second significant target area of the SAR image, and the target skeleton difference intensity map is determined;
wherein the similarity is calculated according to
NCC(O_i(x, y), S_i(x, y)) = Σ_{(x,y)} (O_i(x, y) − u_O)(S_i(x, y) − u_S) / √( Σ_{(x,y)} (O_i(x, y) − u_O)² · Σ_{(x,y)} (S_i(x, y) − u_S)² )
wherein NCC(O_i(x, y), S_i(x, y)) is the similarity between the first target skeleton feature and the second target skeleton feature corresponding to pixel (x, y), O_i(x, y) is the first target skeleton feature corresponding to pixel (x, y), S_i(x, y) is the second target skeleton feature corresponding to pixel (x, y), and u_O and u_S are the feature means of the first salient target region and the second salient target region, respectively.
8. The method according to claim 1, wherein the extracting target level change information between the optical image and the SAR image from the point-wise difference intensity map and the target skeleton difference intensity map comprises:
and performing hysteresis threshold segmentation on the point-by-point difference intensity map and the target skeleton difference intensity map to extract the target level change information.
CN202310076260.6A 2023-02-08 2023-02-08 Target level change detection method based on dense matching of optical image and SAR image Active CN115797796B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310076260.6A CN115797796B (en) 2023-02-08 2023-02-08 Target level change detection method based on dense matching of optical image and SAR image

Publications (2)

Publication Number Publication Date
CN115797796A (en) 2023-03-14
CN115797796B (en) 2023-05-02

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant