CN117475310A - SAR image-based human activity change detection method and system


Info

Publication number
CN117475310A
CN117475310A (application number CN202311483336.3A)
Authority
CN
China
Prior art keywords
image
sar
sar image
change detection
human activity
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311483336.3A
Other languages
Chinese (zh)
Inventor
蔡明勇
邰文飞
解明礼
申文明
王丽霞
张新胜
马万栋
史园莉
肖桐
毕晓玲
陈绪慧
史雪威
吴玲
申振
任致华
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Satellite Application Center for Ecology and Environment of MEE
Original Assignee
Satellite Application Center for Ecology and Environment of MEE
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Satellite Application Center for Ecology and Environment of MEE
Priority to CN202311483336.3A
Publication of CN117475310A
Legal status: Pending


Classifications

    • G06V20/10 Terrestrial scenes
    • G06T7/33 Determination of transform parameters for the alignment of images (image registration) using feature-based methods
    • G06V10/44 Local feature extraction by analysis of parts of the pattern, e.g. edges, contours, corners; connectivity analysis
    • G06V10/75 Organisation of the matching processes, e.g. coarse-fine or multi-scale approaches; context analysis; selection of dictionaries
    • G06V10/764 Image or video recognition or understanding using machine-learning classification
    • G06V10/806 Fusion of extracted features at the sensor, preprocessing, feature extraction or classification level
    • G06T17/05 Geographic models
    • G06T2207/10012 Stereo images
    • G06T2207/10032 Satellite or aerial image; remote sensing
    • G06T2207/30181 Earth observation


Abstract

The invention discloses a human activity change detection method and system based on SAR images, relating to the technical field of ecological and environmental remote sensing information. The method comprises the following steps: acquiring SAR images of a target area from different periods, and performing spatial registration on the SAR images of the different periods based on DEM data; performing change detection on the spatially registered SAR images based on multi-scale depth feature fusion to obtain an SAR image change detection map; comparing the SAR image change detection map against a human activity feature data set to obtain suspected ecological damage map spots; and performing boundary modification and attribute assignment on the suspected ecological damage map spots to obtain the final ecological damage map spots. The invention solves the problem of insufficient optical image coverage in cloudy and rainy areas, expands the application of SAR images in the field of human activity supervision, and can meet the all-round, high-precision and short-cycle supervision requirements of the ecological protection red line.

Description

SAR image-based human activity change detection method and system
Technical Field
The invention relates to the technical field of ecological and remote sensing information, in particular to a human activity change detection method and system based on SAR images.
Background
At present, ecological protection red line human activity supervision mainly relies on manual visual interpretation and automatic change detection based on high-resolution optical remote sensing images. Optical images, however, are strongly affected by weather: in cloudy and rainy areas, optical data are often difficult to acquire or of poor quality once acquired, so the accuracy and timeliness of remote sensing monitoring results are hard to guarantee.
Therefore, how to overcome the low monitoring precision and poor timeliness of optical remote sensing images for human activity change in cloudy and rainy regions, and how to construct a radar-optical spatio-temporal coupling method for identifying typical human disturbance targets in such regions, are problems that those skilled in the art need to solve.
Disclosure of Invention
In view of the above, the invention provides a human activity change detection method and system based on SAR images, which are used for solving the problems of low human activity change monitoring precision and poor timeliness of optical remote sensing images in cloudy and rainy areas.
In order to achieve the above purpose, the present invention adopts the following technical scheme:
the invention firstly discloses a human activity change detection method based on SAR images, as shown in figure 1, comprising the following steps:
acquiring SAR images of different periods of a target area, and performing spatial registration on the SAR images of different periods based on DEM data;
performing change detection on the SAR image subjected to spatial registration based on multi-scale depth feature fusion to obtain an SAR image change detection map;
comparing and analyzing the SAR image change detection map with the human activity characteristic data set to obtain suspected ecological damage map spots;
and carrying out boundary modification and attribute assignment on the suspected ecological damage pattern spots to obtain the final ecological damage pattern spots.
Further, the spatial registration of the SAR images of different periods based on the DEM data specifically comprises the following steps:
S1.1: acquiring the two SAR images S1 and S2 of the target area from different periods;
S1.2: obtaining DEM data of the target area, and selecting control points from the DEM data;
S1.3: transferring the control points of the DEM data to a simulated image;
S1.4: establishing a polynomial transformation between the simulated image and each SAR image;
S1.5: converting the control points of the simulated image into each SAR image using the vector field consensus (VFC) point-set matching algorithm;
S1.6: establishing a polynomial transformation model between the two SAR images S1 and S2;
S1.7: resampling to realize the spatial registration of the SAR images, obtaining the two spatially registered SAR images I1 and I2.
Further, in step S1.3, transferring the control points of the DEM data to the simulated image specifically comprises:
selecting the Range-Doppler (R-D) model and establishing a lookup table associating the DEM data with the simulated image, namely (X, Y, Z)_DEM ↔ (i, j)_SIM, where (X, Y, Z) denotes the DEM spatial coordinates and (i, j) denotes the spatial coordinates of the simulated image;
matching the simulated image with the SAR image to obtain the coordinate correspondence between the simulated image SIM and the SAR image, namely (i, j)_SIM ↔ (i1, j1)_SAR, where (i1, j1) denotes the spatial coordinates of the SAR image.
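The two correspondences above chain into a direct DEM-to-SAR mapping. A minimal sketch of that chaining (the table entries below are invented placeholders; in practice they come from the R-D simulation and the image matching):

```python
def compose_lookup(dem_to_sim, sim_to_sar):
    """Chain (X, Y, Z)_DEM -> (i, j)_SIM and (i, j)_SIM -> (i1, j1)_SAR into a
    direct DEM -> SAR correspondence, keeping only points matched in both tables."""
    return {dem: sim_to_sar[sim]
            for dem, sim in dem_to_sim.items() if sim in sim_to_sar}

# Hypothetical toy tables (real ones come from the R-D model and image matching).
dem_to_sim = {(100.0, 200.0, 35.0): (10, 12), (101.0, 200.0, 36.0): (11, 12)}
sim_to_sar = {(10, 12): (410, 512), (11, 12): (411, 512)}
dem_to_sar = compose_lookup(dem_to_sim, sim_to_sar)
```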
Further, establishing the polynomial transformation between the simulated image and each SAR image comprises the following polynomial transformations:
i1 = a0 + a1·i + a2·j + a3·i·j + a4·i² + a5·j²;
j1 = b0 + b1·i + b2·j + b3·i·j + b4·i² + b5·j²;
where a0 to a5 and b0 to b5 all denote polynomial coefficients.
Further, establishing the polynomial transformation model between the two SAR images S1 and S2 comprises the following polynomials:
X_S1 = a0 + a1·X_S2 + a2·Y_S2 + a3·X_S2·Y_S2 + a4·X_S2² + a5·Y_S2²;
Y_S1 = b0 + b1·X_S2 + b2·Y_S2 + b3·X_S2·Y_S2 + b4·X_S2² + b5·Y_S2²;
where (X_S2, Y_S2) denote the feature point coordinates of image S2; (X_S1, Y_S1) denote the feature point coordinates of image S1; and a0 to a5, b0 to b5 denote polynomial coefficients analogous to those in step S1.4.
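Each coordinate of the models in steps S1.4 and S1.6 has six coefficients, so the transformation can be estimated from six or more matched control points by least squares. A minimal numpy sketch (the function names and the least-squares solver are our assumptions; the patent does not specify how the coefficients are obtained):

```python
import numpy as np

def poly2_design(x, y):
    # design-matrix columns match the model: 1, x, y, x*y, x^2, y^2
    return np.column_stack([np.ones_like(x), x, y, x * y, x ** 2, y ** 2])

def fit_poly2(src, dst):
    """Least-squares estimate of the coefficients a0..a5 and b0..b5 from
    matched control points; src and dst are (N, 2) arrays with N >= 6."""
    A = poly2_design(src[:, 0], src[:, 1])
    a, *_ = np.linalg.lstsq(A, dst[:, 0], rcond=None)
    b, *_ = np.linalg.lstsq(A, dst[:, 1], rcond=None)
    return a, b

def apply_poly2(a, b, pts):
    # map points through the fitted second-order polynomial
    A = poly2_design(pts[:, 0], pts[:, 1])
    return np.column_stack([A @ a, A @ b])
```

With the coefficients fitted between S2 and S1 coordinates, every pixel of S2 can be mapped into the S1 frame and the image resampled there (step S1.7).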
Further, the change detection of the spatially registered SAR images based on multi-scale depth feature fusion specifically comprises the following steps:
S2.1: performing a logarithmic operation on the spatially registered SAR images I1 and I2 to obtain a log-ratio difference map;
S2.2: performing a three-layer synchrosqueezed wavelet transform decomposition on the log-ratio difference map obtained in step S2.1 to obtain the low-frequency component and the high-frequency components of each layer after decomposition;
where I_LR denotes the log-ratio difference map obtained from the logarithmic operation on the spatially registered SAR images I1 and I2; L1 denotes the low-frequency component of the first layer, and H1, V1, D1 its three high-frequency components; L2 and H2, V2, D2 denote the corresponding components of the second layer; L3 and H3, V3, D3 those of the third layer;
S2.3: performing the inverse two-dimensional stationary wavelet transform on the decomposed low-frequency and high-frequency components of each layer, and obtaining the multi-scale difference maps through independent reconstruction;
S2.4: dividing the multi-scale difference maps into two classes with the fuzzy c-means (FCM) clustering algorithm, taking the classified maps as pseudo-labels for training a CNN, and selecting reliable training samples from the multi-scale difference maps;
S2.5: inputting the training samples into the CNN model, training the CNN with the back-propagation (BP) algorithm using stochastic gradient descent, and generating the change detection result map of the two SAR images I1 and I2 with the trained CNN model.
Further, in step S2.1, the logarithmic operation on the spatially registered SAR images I1 and I2 specifically comprises the following expression:
I_LR(x, y) = |log I2(x, y) - log I1(x, y)|;
where I_LR(x, y) denotes the pixel value of the log-ratio difference map, and I1(x, y), I2(x, y) denote the corresponding pixel values of images I1 and I2.
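A minimal numpy sketch of this log-ratio operator (the small eps guard against log(0) on dark pixels is our addition, not part of the patent):

```python
import numpy as np

def log_ratio(i1, i2, eps=1e-6):
    """Log-ratio difference map |log I2 - log I1| of two co-registered SAR
    images; eps (an assumed value) avoids taking log of zero-valued pixels."""
    i1 = np.asarray(i1, dtype=float)
    i2 = np.asarray(i2, dtype=float)
    return np.abs(np.log(i2 + eps) - np.log(i1 + eps))
```

Taking the ratio rather than a plain difference suppresses the multiplicative speckle common to both acquisition dates, which is why log-ratio operators are standard for SAR change detection.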
Further, in step S2.3, the inverse two-dimensional stationary wavelet transform reconstructs the decomposed components layer by layer, where L3' denotes the reconstructed low-frequency component of the third layer; H3', V3', D3' denote the reconstructed high-frequency components of the third layer; L2' denotes the reconstructed low-frequency component of the second layer; H2', V2', D2' denote the reconstructed high-frequency components of the second layer; and L1' denotes the reconstructed low-frequency component of the first layer.
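Step S2.4 above clusters the multi-scale difference maps with FCM to produce pseudo-labels for CNN training. A numpy sketch of that pre-classification (a sketch under the assumption that pixels are clustered on intensity alone; the fuzziness m, iteration count and labeling convention are our choices, and the CNN training of step S2.5 is omitted):

```python
import numpy as np

def fcm_two_class(x, m=2.0, iters=100, seed=0):
    """Fuzzy c-means with c=2 clusters on a flattened difference map x.
    Returns the (2, N) membership matrix and the two cluster centers."""
    rng = np.random.default_rng(seed)
    u = rng.random((2, x.size))
    u /= u.sum(axis=0)
    for _ in range(iters):
        w = u ** m
        centers = (w @ x) / w.sum(axis=1)            # weighted cluster means
        d = np.abs(x[None, :] - centers[:, None]) + 1e-12
        inv = d ** (-2.0 / (m - 1.0))
        u = inv / inv.sum(axis=0)                    # membership update
    return u, centers

def pseudo_labels(x, **kw):
    # the cluster with the larger center is taken as the "changed" class (1)
    u, centers = fcm_two_class(x, **kw)
    return (np.argmax(u, axis=0) == int(np.argmax(centers))).astype(int)
```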
Further, the human activity feature data set is acquired through the following steps:
Step 1: obtaining typical ground feature samples, the samples covering six categories: mineral resource development, industrial development and construction, energy development and construction, tourism development and construction, transportation development and construction, and other development and construction;
Step 2: analyzing the optical image characteristics of the typical ground feature samples, including spatial, attribute and texture characteristics;
Step 3: analyzing the SAR image characteristics of the typical ground feature samples, including imaging scattering, geometric characteristics and basic image characteristics;
Step 4: analyzing the appearance of the typical ground feature targets on the SAR image in terms of tone, texture, geometric shape and contextual characteristics;
Step 5: matching the optical image of the typical ground feature samples with the corresponding SAR image;
Step 6: calculating eight feature statistics of the target ground feature samples based on the gray-level co-occurrence matrix, namely entropy, mean, variance, contrast, correlation, dissimilarity, homogeneity and angular second moment;
Step 7: constructing the feature data set of the target ground features from the eight feature statistics and the gray-level characteristics of the optical image of the target ground features.
In addition, the invention also discloses a human activity change detection system based on the SAR image, which comprises a plurality of mutually connected computer modules, wherein the computer modules can realize the human activity change detection method based on the SAR image.
Compared with the prior art, the human activity change detection method and system based on SAR images disclosed by the invention have the following beneficial effects:
The invention establishes a set of intelligent automatic extraction procedures for ecological protection red line human activity change combining optical and SAR images, which solves the problem of insufficient optical image coverage in cloudy and rainy areas, expands the application of SAR images in the field of human activity supervision, and can meet the all-round, high-precision and short-cycle supervision requirements of the ecological protection red line.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings that are required to be used in the embodiments or the description of the prior art will be briefly described below, and it is obvious that the drawings in the following description are only embodiments of the present invention, and that other drawings can be obtained according to the provided drawings without inventive effort for a person skilled in the art.
Fig. 1 is a schematic overall flow chart of a human activity change detection method based on SAR images according to an embodiment of the present invention.
Fig. 2 is a flowchart of performing two SAR image registration based on an analog image according to an embodiment of the present invention.
FIG. 3 is a schematic diagram of the three-layer decomposition and reconstruction of the log-ratio difference map provided by an embodiment of the present invention.
Fig. 4 is a flowchart of a SAR image change detection algorithm according to an embodiment of the present invention.
Fig. 5 is a flowchart of extracting ground object information of an SAR image driven by an optical image according to an embodiment of the present invention.
Detailed Description
The following description of the embodiments of the present invention will be made clearly and completely with reference to the accompanying drawings, in which it is apparent that the embodiments described are only some embodiments of the present invention, but not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
The embodiment of the invention discloses a human activity change detection method based on SAR images, which is shown in figure 1 and comprises the following steps:
s1, acquiring SAR images of different periods of a target area, and performing spatial registration on the SAR images of different periods based on DEM data;
s2, performing change detection on the SAR image subjected to spatial registration based on multi-scale depth feature fusion, and obtaining an SAR image change detection map;
s3, comparing and analyzing the SAR image change detection map with a human activity characteristic data set to obtain suspected ecological damage map spots;
s4, carrying out boundary modification and attribute assignment on the suspected ecological damage pattern spots to obtain final ecological damage pattern spots.
Each step is further described below.
As shown in fig. 2, in step S1, the DEM-based spatial registration of the SAR images may specifically include the following steps:
S1.1: acquiring the two SAR images S1 and S2 of the target area from different periods;
S1.2: obtaining DEM data of the target area, and selecting control points from the DEM data;
S1.3: transferring the control points of the DEM data to a simulated image, which specifically comprises:
selecting the Range-Doppler (R-D) model and establishing a lookup table associating the DEM data with the simulated image, namely (X, Y, Z)_DEM ↔ (i, j)_SIM, where (X, Y, Z) denotes the DEM spatial coordinates and (i, j) denotes the spatial coordinates of the simulated image;
matching the simulated image with the SAR image to obtain the coordinate correspondence between the simulated image SIM and the SAR image, namely (i, j)_SIM ↔ (i1, j1)_SAR, where (i1, j1) denotes the spatial coordinates of the SAR image;
S1.4: establishing a polynomial transformation between the simulated image and each SAR image:
i1 = a0 + a1·i + a2·j + a3·i·j + a4·i² + a5·j²;
j1 = b0 + b1·i + b2·j + b3·i·j + b4·i² + b5·j²;
where a0 to a5 and b0 to b5 all denote polynomial coefficients;
S1.5: converting the control points of the simulated image into each SAR image using the vector field consensus (VFC) point-set matching algorithm;
S1.6: establishing a polynomial transformation model between the two SAR images S1 and S2:
X_S1 = a0 + a1·X_S2 + a2·Y_S2 + a3·X_S2·Y_S2 + a4·X_S2² + a5·Y_S2²;
Y_S1 = b0 + b1·X_S2 + b2·Y_S2 + b3·X_S2·Y_S2 + b4·X_S2² + b5·Y_S2²;
where (X_S2, Y_S2) denote the feature point coordinates of image S2; (X_S1, Y_S1) denote the feature point coordinates of image S1; and a0 to a5, b0 to b5 denote polynomial coefficients analogous to those in step S1.4;
S1.7: resampling to realize the spatial registration of the SAR images, obtaining the two spatially registered SAR images I1 and I2.
In step S2, the change detection of the spatially registered SAR images based on multi-scale depth feature fusion may specifically include the following steps:
S2.1: performing a logarithmic operation on the spatially registered SAR images I1 and I2 to obtain the log-ratio difference map, computed as:
I_LR(x, y) = |log I2(x, y) - log I1(x, y)|;
where I_LR(x, y) denotes the pixel value of the log-ratio difference map, and I1(x, y), I2(x, y) denote the corresponding pixel values of images I1 and I2.
S2.2: performing a three-layer synchrosqueezed wavelet transform decomposition on the log-ratio difference map obtained in step S2.1 to obtain the low-frequency component and the high-frequency components of each layer after decomposition, where L1 denotes the low-frequency component of the first layer, and H1, V1, D1 its three high-frequency components; L2 and H2, V2, D2 denote the corresponding components of the second layer; L3 and H3, V3, D3 those of the third layer.
S2.3: performing the inverse two-dimensional stationary wavelet transform on the decomposed low-frequency and high-frequency components of each layer, and obtaining the multi-scale difference maps through independent reconstruction; the reconstruction proceeds layer by layer, where L3' denotes the reconstructed low-frequency component of the third layer; H3', V3', D3' denote the reconstructed high-frequency components of the third layer; L2' denotes the reconstructed low-frequency component of the second layer; H2', V2', D2' denote the reconstructed high-frequency components of the second layer; and L1' denotes the reconstructed low-frequency component of the first layer.
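Steps S2.2 and S2.3 can be illustrated with a simplified stationary (undecimated) wavelet scheme. The sketch below uses an à trous Haar filter and keeps one combined high-frequency band per layer instead of the three directional bands of the patent's transform, a deliberate simplification for brevity; adding all bands back reconstructs the input exactly, and adding back only selected detail layers yields the multi-scale difference maps:

```python
import numpy as np

def atrous_blur(img, step):
    # separable undecimated Haar low-pass with dilation `step` (circular borders)
    out = 0.5 * (img + np.roll(img, step, axis=0))
    return 0.5 * (out + np.roll(out, step, axis=1))

def stationary_decompose(img, levels=3):
    """Three-layer stationary decomposition: returns the low-frequency residual
    and one high-frequency (detail) band per layer."""
    approx = np.asarray(img, dtype=float)
    details = []
    for k in range(levels):
        low = atrous_blur(approx, 2 ** k)
        details.append(approx - low)     # high-frequency band of layer k + 1
        approx = low
    return approx, details

def reconstruct(approx, details):
    # inverse transform: summing all retained bands restores the image
    return approx + sum(details)
```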
In the above process, the overall flow of the three-layer decomposition and reconstruction of the log-ratio difference map is shown in fig. 3, and fig. 4 shows the flowchart of the SAR image change detection algorithm provided in this embodiment.
As shown in fig. 5, the human activity feature data set in this embodiment is obtained through the following steps:
Step 1: obtaining typical ground feature samples, the samples covering six categories: mineral resource development, industrial development and construction, energy development and construction, tourism development and construction, transportation development and construction, and other development and construction;
Step 2: analyzing the optical image characteristics of the typical ground feature samples, including spatial, attribute and texture characteristics;
Step 3: analyzing the SAR image characteristics of the typical ground feature samples, including imaging scattering, geometric characteristics and basic image characteristics;
Step 4: analyzing the appearance of the typical ground feature targets on the SAR image in terms of tone, texture, geometric shape and contextual characteristics;
Step 5: matching the optical image of the typical ground feature samples with the corresponding SAR image;
Step 6: calculating eight feature statistics of the target ground feature samples based on the gray-level co-occurrence matrix, namely entropy, mean, variance, contrast, correlation, dissimilarity, homogeneity and angular second moment;
Step 7: constructing the feature data set of the target ground features from the eight feature statistics and the gray-level characteristics of the optical image of the target ground features.
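The eight GLCM statistics of Step 6 can be sketched in pure numpy (a minimal sketch: the co-occurrence offset, quantization levels and windowing are unspecified in the patent, so the parameters below are assumptions; the matrix is symmetrized so that row and column means coincide in the correlation term):

```python
import numpy as np

def glcm(q, levels, dx=1, dy=0):
    """Normalized symmetric gray-level co-occurrence matrix of a quantized
    image q (integers in [0, levels)) for the pixel offset (dy, dx)."""
    P = np.zeros((levels, levels))
    h, w = q.shape
    for r in range(h - dy):
        for c in range(w - dx):
            P[q[r, c], q[r + dy, c + dx]] += 1
    P = P + P.T                  # symmetrize so row/column statistics coincide
    return P / P.sum()

def glcm_features(P):
    i, j = np.indices(P.shape)
    mu = (i * P).sum()
    var = ((i - mu) ** 2 * P).sum()
    nz = P[P > 0]
    return {
        "mean": mu,
        "variance": var,
        "contrast": ((i - j) ** 2 * P).sum(),
        "dissimilarity": (np.abs(i - j) * P).sum(),
        "homogeneity": (P / (1.0 + (i - j) ** 2)).sum(),
        "asm": (P ** 2).sum(),                  # angular second moment
        "entropy": -(nz * np.log(nz)).sum(),
        # for a symmetric GLCM the row and column means/variances are equal
        "correlation": ((i - mu) * (j - mu) * P).sum() / var if var > 0 else 1.0,
    }
```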
In addition, the embodiment of the invention also discloses a human activity change detection system based on SAR images, which comprises a plurality of mutually connected computer modules, wherein the computer modules can realize the human activity change detection method based on SAR images.
In the present specification, each embodiment is described in a progressive manner, and each embodiment is mainly described in a different point from other embodiments, and identical and similar parts between the embodiments are all enough to refer to each other. For the device disclosed in the embodiment, since it corresponds to the method disclosed in the embodiment, the description is relatively simple, and the relevant points refer to the description of the method section.
The previous description of the disclosed embodiments is provided to enable any person skilled in the art to make or use the present invention. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the invention. Thus, the present invention is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.

Claims (10)

1. The human activity change detection method based on SAR image is characterized by comprising the following steps:
acquiring SAR images of different periods of a target area, and performing spatial registration on the SAR images of different periods based on DEM data;
performing change detection on the SAR image subjected to spatial registration based on multi-scale depth feature fusion to obtain an SAR image change detection map;
comparing and analyzing the SAR image change detection map with a human activity characteristic data set to obtain suspected ecological damage map spots;
and carrying out boundary modification and attribute assignment on the suspected ecological damage pattern spots to obtain final ecological damage pattern spots.
2. The SAR image-based human activity change detection method of claim 1, wherein the spatial registration of the SAR images of different periods based on DEM data comprises the following steps:
S1.1: acquiring the two SAR images S1 and S2 of the target area from different periods;
S1.2: obtaining DEM data of the target area, and selecting control points from the DEM data;
S1.3: transferring the control points of the DEM data to a simulated image;
S1.4: establishing a polynomial transformation between the simulated image and each SAR image;
S1.5: converting the control points of the simulated image into each SAR image using a vector-field-consensus point-set matching algorithm;
S1.6: establishing a polynomial transformation model between the two SAR images S1 and S2;
S1.7: resampling to realize the spatial registration of the SAR images, obtaining the two spatially registered SAR images I1 and I2.
3. The SAR image-based human activity change detection method according to claim 2, wherein the step S1.3 of transferring the control points of the DEM data to the simulated image comprises:
selecting the Range-Doppler (R-D) model and establishing a lookup table associating the DEM data with the simulated image, namely (X, Y, Z)_DEM ↔ (i, j)_SIM, where (X, Y, Z) denotes the DEM spatial coordinates and (i, j) denotes the spatial coordinates of the simulated image;
matching the simulated image with the SAR image to obtain the coordinate correspondence between the simulated image SIM and the SAR image, namely (i, j)_SIM ↔ (i1, j1)_SAR, where (i1, j1) denotes the spatial coordinates of the SAR image.
4. The SAR image-based human activity change detection method according to claim 3, wherein establishing the polynomial transformation between the simulated image and each SAR image comprises:
i1 = a0 + a1·i + a2·j + a3·i·j + a4·i² + a5·j²;
j1 = b0 + b1·i + b2·j + b3·i·j + b4·i² + b5·j²;
where a0 to a5 and b0 to b5 all denote polynomial coefficients.
5. The SAR image-based human activity change detection method according to claim 4, wherein establishing the polynomial transformation model between the two SAR images S1 and S2 comprises the following polynomials:
X_S1 = a0 + a1·X_S2 + a2·Y_S2 + a3·X_S2·Y_S2 + a4·X_S2² + a5·Y_S2²;
Y_S1 = b0 + b1·X_S2 + b2·Y_S2 + b3·X_S2·Y_S2 + b4·X_S2² + b5·Y_S2²;
where (X_S2, Y_S2) denote the feature point coordinates of image S2; (X_S1, Y_S1) denote the feature point coordinates of image S1; and a0 to a5, b0 to b5 denote polynomial coefficients analogous to those in step S1.4.
6. The SAR image-based human activity change detection method according to claim 5, wherein performing change detection on the spatially registered SAR images based on multi-scale depth feature fusion comprises the following steps:

S2.1, performing a logarithmic operation on the spatially registered SAR images I₁, I₂ to obtain a log-ratio difference map;

S2.2, performing a three-level synchrosqueezing wavelet transform decomposition on the log-ratio difference map obtained in step S2.1 to obtain the low-frequency component and the high-frequency components of each decomposition level;

wherein I_LR denotes the log-ratio difference map obtained by the logarithmic operation on the spatially registered SAR images I₁, I₂; A₁ denotes the low-frequency component of the first level and H₁, V₁, D₁ its three high-frequency components; A₂ denotes the low-frequency component of the second level and H₂, V₂, D₂ its three high-frequency components; A₃ denotes the low-frequency component of the third level and H₃, V₃, D₃ its three high-frequency components;

S2.3, performing the inverse two-dimensional stationary wavelet transform on the decomposed low-frequency and high-frequency components of each level and reconstructing each level independently to obtain multi-scale difference maps;

S2.4, classifying the multi-scale difference maps into two classes with the fuzzy c-means (FCM) clustering algorithm, using the classified multi-scale difference maps as pseudo-labels for training the CNN, and selecting reliable training samples from them;

S2.5, feeding the training samples into the CNN model, training the CNN with the back-propagation (BP) algorithm using stochastic gradient descent, and generating the change detection result map of the two SAR images I₁, I₂ with the trained CNN model.
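The pseudo-labelling of step S2.4 can be illustrated with a minimal two-class fuzzy c-means, run here on a 1-D list of difference-map pixel values rather than a full image; the function name and parameters are illustrative sketches, not from the patent.

```python
def fcm_two_classes(values, m=2.0, iters=50):
    # Initialise the two cluster centres at the extreme pixel values
    # (low = unchanged, high = changed).
    centers = [min(values), max(values)]
    u = []
    for _ in range(iters):
        # Membership update: u_ik = 1 / sum_j (d_ik / d_jk)^(2/(m-1)).
        u = []
        for v in values:
            d = [abs(v - c) + 1e-12 for c in centers]
            u.append([1.0 / sum((d[i] / d[j]) ** (2.0 / (m - 1.0))
                                for j in range(2)) for i in range(2)])
        # Centre update: c_i = sum_k u_ik^m v_k / sum_k u_ik^m.
        centers = [sum((u[k][i] ** m) * values[k] for k in range(len(values))) /
                   sum(u[k][i] ** m for k in range(len(values)))
                   for i in range(2)]
    # Hard pseudo-label: index of the largest membership per pixel.
    return [0 if row[0] >= row[1] else 1 for row in u]
```

The resulting 0/1 map would serve as the pseudo-label image from which reliable (high-membership) pixels are picked as CNN training samples.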
7. The SAR image-based human activity change detection method according to claim 6, wherein the logarithmic operation on the spatially registered SAR images I₁, I₂ in step S2.1 is specifically the following expression:

I_LR(x, y) = |log I₂(x, y) − log I₁(x, y)|;

wherein I_LR(x, y) denotes the pixel value of the log-ratio difference image, and I₁(x, y), I₂(x, y) denote the corresponding pixel values of the spatially registered SAR images I₁, I₂, respectively.
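The log-ratio operator of claim 7 can be sketched directly; the images are nested lists here, and `eps` is an illustrative guard against log(0) that the claim itself does not mention.

```python
import math

def log_ratio(img1, img2, eps=1e-10):
    # I_LR(x, y) = |log I2(x, y) - log I1(x, y)|, applied per pixel
    # to two co-registered intensity images of equal size.
    return [[abs(math.log(img2[r][c] + eps) - math.log(img1[r][c] + eps))
             for c in range(len(img1[0]))]
            for r in range(len(img1))]
```

The logarithm turns the multiplicative speckle noise of SAR intensities into an additive term, which is the usual motivation for preferring the log-ratio over a plain difference.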
8. The SAR image-based human activity change detection method according to claim 6, wherein in step S2.3 the inverse two-dimensional stationary wavelet transform reconstructs each level from its components; wherein Â₃ denotes the reconstructed low-frequency component of the third level; Ĥ₃, V̂₃, D̂₃ denote the reconstructed high-frequency components of the third level; Â₂ denotes the reconstructed low-frequency component of the second level; Ĥ₂, V̂₂, D̂₂ denote the reconstructed high-frequency components of the second level; and Â₁ denotes the reconstructed low-frequency component of the first level.
9. The SAR image-based human activity change detection method of claim 1, wherein the human activity feature dataset is obtained by:
Step 1, acquiring typical ground-feature samples, the samples covering six categories: mineral resource development, industrial development and construction, energy development and construction, tourism development and construction, transport development and construction, and other development and construction;

Step 2, analyzing the optical image characteristics of the typical ground-feature samples, including spatial, attribute and texture characteristics;

Step 3, analyzing the SAR image features of the typical ground-feature samples, including imaging scattering, geometric features and basic image features;

Step 4, analyzing how the typical ground-feature sample targets appear on the SAR image in terms of tone, texture, geometric shape and contextual characteristics;

Step 5, matching the optical image of each typical ground-feature sample with the corresponding SAR image;

Step 6, calculating eight characteristic statistical parameters of the target ground-feature samples based on the gray-level co-occurrence matrix, namely entropy, mean, variance, contrast, correlation, dissimilarity, homogeneity and angular second moment;

Step 7, constructing the feature dataset of the target ground features from the eight characteristic statistical parameters and the gray-level characteristics of the optical images of the target ground features.
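Steps 6–7 rest on gray-level co-occurrence matrix (GLCM) statistics. A pure-Python sketch for a single horizontal offset follows; the function name and the symmetric-normalisation choice are illustrative assumptions, since the claim fixes neither the offset nor the normalisation.

```python
import math

def glcm_features(img, levels):
    # Symmetric, normalised GLCM for the horizontal offset (0, 1).
    P = [[0.0] * levels for _ in range(levels)]
    total = 0
    for row in img:
        for a, b in zip(row, row[1:]):
            P[a][b] += 1
            P[b][a] += 1
            total += 2
    P = [[v / total for v in row] for row in P]
    idx = [(i, j) for i in range(levels) for j in range(levels)]
    mean = sum(i * P[i][j] for i, j in idx)
    var = sum((i - mean) ** 2 * P[i][j] for i, j in idx)
    # The eight statistics named in the claim.
    return {
        "entropy": -sum(p * math.log(p) for row in P for p in row if p > 0),
        "mean": mean,
        "variance": var,
        "contrast": sum((i - j) ** 2 * P[i][j] for i, j in idx),
        "correlation": (sum((i - mean) * (j - mean) * P[i][j]
                            for i, j in idx) / var if var > 0 else 1.0),
        "dissimilarity": sum(abs(i - j) * P[i][j] for i, j in idx),
        "homogeneity": sum(P[i][j] / (1.0 + (i - j) ** 2) for i, j in idx),
        "asm": sum(p * p for row in P for p in row),  # angular second moment
    }
```

A real pipeline would quantise the SAR amplitudes to a small number of gray levels first and average the statistics over several offsets and directions; this sketch only shows how the eight parameters derive from one co-occurrence matrix.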
10. A SAR image-based human activity change detection system, comprising a plurality of interconnected computer modules, which are operable to implement the SAR image-based human activity change detection method of any one of claims 1-9.
CN202311483336.3A 2023-11-09 2023-11-09 SAR image-based human activity change detection method and system Pending CN117475310A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311483336.3A CN117475310A (en) 2023-11-09 2023-11-09 SAR image-based human activity change detection method and system


Publications (1)

Publication Number Publication Date
CN117475310A true CN117475310A (en) 2024-01-30

Family

ID=89630914



Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109946670A (en) * 2019-03-25 2019-06-28 河海大学 A kind of polarization radar information extracting method of optical image driving
CN113033401A (en) * 2021-03-25 2021-06-25 生态环境部卫星环境应用中心 Human activity change recognition and supervision method for ecological protection red line


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
LIU Menglan; YANG Xuezhi; JIA Lu; WANG Jun: "SAR image change detection fusing multi-scale depth features", Journal of Hefei University of Technology (Natural Science), no. 06, 28 June 2020 (2020-06-28) *
ZHOU Rongrong; DING Jifeng; SUI Lichun; PAN Dongfeng; YANG Chao: "A SAR image registration method suitable for mountainous areas", Journal of Geomatics Science and Technology, no. 05, 15 October 2018 (2018-10-15) *

Similar Documents

Publication Publication Date Title
Hamzah et al. Improvement of stereo matching algorithm for 3D surface reconstruction
CN108830819B (en) Image fusion method and device for depth image and infrared image
Sun et al. Aerial 3D building detection and modeling from airborne LiDAR point clouds
CN112488210A (en) Three-dimensional point cloud automatic classification method based on graph convolution neural network
CN106384383A (en) RGB-D and SLAM scene reconfiguration method based on FAST and FREAK feature matching algorithm
CN110796694A (en) Fruit three-dimensional point cloud real-time acquisition method based on KinectV2
CN110992366A (en) Image semantic segmentation method and device and storage medium
CN110598564A (en) OpenStreetMap-based high-spatial-resolution remote sensing image transfer learning classification method
Liang et al. Maximum likelihood classification of soil remote sensing image based on deep learning
CN113033432A (en) Remote sensing image residential area extraction method based on progressive supervision
CN116310095A (en) Multi-view three-dimensional reconstruction method based on deep learning
CN108615401A (en) The non-homogeneous light parking situation recognition methods in interior based on deep learning
Aishwarya et al. An image fusion framework using novel dictionary based sparse representation
CN112734683B (en) Multi-scale SAR and infrared image fusion method based on target enhancement
CN102609721B (en) Remote sensing image clustering method
CN112017259A (en) Indoor positioning and image building method based on depth camera and thermal imager
CN110363863B (en) Input data generation method and system of neural network
CN109215122B (en) Streetscape three-dimensional reconstruction system and method and intelligent trolley
CN117475310A (en) SAR image-based human activity change detection method and system
CN112507826B (en) End-to-end ecological variation monitoring method, terminal, computer equipment and medium
CN113192204B (en) Three-dimensional reconstruction method for building in single inclined remote sensing image
CN109118576A (en) Large scene three-dimensional reconstruction system and method for reconstructing based on BDS location-based service
CN107590829A (en) A kind of seed point pick-up method for being applied to the intensive cloud data registration of various visual angles
CN114283258A (en) CNN-based method for generating three-dimensional point cloud from single image
CN112766032A (en) SAR image saliency map generation method based on multi-scale and super-pixel segmentation

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination