CN114419465A - Method, device and equipment for detecting change of remote sensing image and storage medium


Info

Publication number
CN114419465A
Authority
CN
China
Prior art keywords: remote sensing, sub, change, determining, image
Prior art date
Legal status
Granted
Application number
CN202210328943.1A
Other languages
Chinese (zh)
Other versions
CN114419465B (en)
Inventor
周波
田欣兴
苗瑞
邹小刚
梁书玉
Current Assignee
Shenzhen Haiqing Zhiyuan Technology Co.,Ltd.
Original Assignee
Shenzhen HQVT Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen HQVT Technology Co Ltd
Priority to CN202210328943.1A
Publication of CN114419465A
Application granted
Publication of CN114419465B
Status: Active
Anticipated expiration


Classifications

    • G06F 18/24: Pattern recognition; analysing; classification techniques (G Physics; G06 Computing, calculating or counting; G06F Electric digital data processing)
    • G06F 18/25: Pattern recognition; analysing; fusion techniques (G Physics; G06 Computing, calculating or counting; G06F Electric digital data processing)

Landscapes

  • Engineering & Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Image Analysis (AREA)

Abstract

The application provides a method, a device, equipment and a storage medium for detecting changes of remote sensing images. The method comprises the following steps: respectively extracting frequency domain characteristics of the first time phase remote sensing image and the second time phase remote sensing image by adopting a non-subsampled Contourlet transform algorithm to obtain a plurality of first characteristic sub-images and a plurality of second characteristic sub-images; determining a plurality of difference image maps according to the plurality of first characteristic sub-maps and the plurality of second characteristic sub-maps; determining a layer change area and a layer non-change area corresponding to each difference image map by adopting a first Gaussian mixture model; and fusing the layer change region and the layer non-change region corresponding to each difference image map by adopting a hidden Markov model to obtain the change region and the non-change region of the second time phase remote sensing image relative to the first time phase remote sensing image. According to the method, a relatively accurate change detection result can be obtained, and the change of the remote sensing image can be better represented.

Description

Method, device and equipment for detecting change of remote sensing image and storage medium
Technical Field
The present application relates to the field of image processing technologies, and in particular, to a method, an apparatus, a device, and a storage medium for detecting changes in remote sensing images.
Background
Change detection detects how the ground features of an area change over time by analyzing multiple remote sensing images of the same area acquired at different times. With the development of remote sensing technology and information technology, multi-temporal remote sensing image change detection has become an important direction of current remote sensing image analysis and research, and is widely applied in many social and economic fields, such as disaster monitoring and assessment, analysis of land use, investigation of water resource quality and geographical distribution, and urban planning and layout.
In the research of the multi-temporal remote sensing image change detection method, a common detection method is to classify different surface features in the remote sensing image, construct difference images for the different surface features, and determine a change class and a non-change class by using a threshold value method.
The existing classify-then-compare approach places low requirements on threshold selection, but it suffers from the accumulation of classification errors: image quality introduces errors into the classification result, and these errors reduce the accuracy of change detection. Moreover, because the remote sensing images contain noise, the generated difference image also contains noise, so the difference image cannot faithfully express the change between the remote sensing images, and the final change detection result is inaccurate.
Disclosure of Invention
The application provides a method, a device, equipment and a storage medium for detecting changes of remote sensing images, which are used for solving the problem that the change detection result obtained by the existing remote sensing image detection mode is inaccurate.
In a first aspect, the present application provides a method for detecting a change in a remote sensing image, including:
acquiring a first time-phase remote sensing image and a second time-phase remote sensing image which are acquired in the same region at different times;
respectively extracting frequency domain characteristics of the first time phase remote sensing image and the second time phase remote sensing image by adopting a non-subsampled Contourlet transform algorithm so as to respectively obtain a plurality of first characteristic sub-images and a plurality of second characteristic sub-images;
determining a plurality of difference image maps according to the plurality of first characteristic sub-maps and the plurality of second characteristic sub-maps;
for each difference image map, determining a layer change region and a layer non-change region corresponding to the difference image map by adopting a first Gaussian mixture model;
and fusing the layer change region and the layer non-change region corresponding to each difference image map by adopting a hidden Markov model to obtain the change region and the non-change region of the second time-phase remote sensing image relative to the first time-phase remote sensing image.
In a second aspect, the present application provides a remote sensing image change detection apparatus, including:
the acquisition unit is used for acquiring a first time phase remote sensing image and a second time phase remote sensing image which are acquired in the same region at different times;
the extraction unit is used for respectively extracting the frequency domain characteristics of the first time phase remote sensing image and the second time phase remote sensing image by adopting a non-subsampled Contourlet transformation algorithm so as to respectively obtain a plurality of first characteristic subgraphs and a plurality of second characteristic subgraphs;
the determining unit is used for determining a plurality of difference image maps according to the plurality of first characteristic subgraphs and the plurality of second characteristic subgraphs;
the determining unit is further used for determining a layer change area and a layer non-change area corresponding to each difference image map by adopting a first Gaussian mixture model;
and the fusion unit is used for fusing the layer change region and the layer non-change region corresponding to each difference image map by adopting a hidden Markov model so as to obtain the change region and the non-change region of the second time phase remote sensing image relative to the first time phase remote sensing image.
In a third aspect, the present invention provides an electronic device comprising: a processor, and a memory communicatively coupled to the processor;
the memory stores computer-executable instructions;
the processor executes computer-executable instructions stored by the memory, causing the processor to perform the method of the first aspect.
In a fourth aspect, the present invention provides a computer-readable storage medium having stored thereon computer-executable instructions for implementing the method according to the first aspect when executed by a processor.
According to the method, the device, the equipment and the storage medium for detecting the change of the remote sensing image, a first time phase remote sensing image and a second time phase remote sensing image which are acquired in the same region at different times are obtained; respectively extracting frequency domain characteristics of the first time phase remote sensing image and the second time phase remote sensing image by adopting a non-subsampled Contourlet transform algorithm so as to respectively obtain a plurality of first characteristic sub-images and a plurality of second characteristic sub-images; determining a plurality of difference image maps according to the plurality of first characteristic sub-maps and the plurality of second characteristic sub-maps; for each difference image map, determining a layer change region and a layer non-change region corresponding to the difference image map by adopting a first Gaussian mixture model; and fusing the layer change region and the layer non-change region corresponding to each difference image map by adopting a hidden Markov model to obtain the change region and the non-change region of the second time-phase remote sensing image relative to the first time-phase remote sensing image.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present application and together with the description, serve to explain the principles of the application.
FIG. 1 is a schematic diagram of a network architecture of a remote sensing image change detection method provided by the invention;
FIG. 2 is a schematic flow chart of a method for detecting changes in remote sensing images according to an embodiment of the present invention;
FIG. 3 is a schematic diagram of feature sub-image extraction of a remote sensing image change detection method according to a second embodiment of the present invention;
FIG. 4 is a schematic diagram of feature sub-image extraction of a remote sensing image change detection method provided by the third embodiment of the present invention;
FIG. 5 is a schematic flow chart of a method for detecting changes in remote sensing images according to a fourth embodiment of the present invention;
fig. 6 is a schematic flow chart of a remote sensing image change detection method according to a fifth embodiment of the present invention;
fig. 7 is a schematic flow chart of a method for detecting changes in remote sensing images according to a sixth embodiment of the present invention;
fig. 8 is a schematic flow chart of a remote sensing image change detection method according to a seventh embodiment of the present invention;
fig. 9 is a schematic flow chart of a method for detecting changes in remote sensing images according to an eighth embodiment of the present invention;
FIG. 10 is a schematic flow chart of a method for detecting changes in remote sensing images according to a ninth embodiment of the present invention;
FIG. 11 is a gray level histogram of a method for detecting changes in a remote sensing image according to an embodiment of the present invention;
fig. 12 is a schematic structural diagram of a remote sensing image change detection apparatus according to an embodiment of the present invention;
fig. 13 is a block diagram of an electronic device for implementing the method for detecting changes in remote sensing images according to the embodiment of the present invention.
With the above figures, there are shown specific embodiments of the present application, which will be described in more detail below. These drawings and written description are not intended to limit the scope of the inventive concepts in any manner, but rather to illustrate the inventive concepts to those skilled in the art by reference to specific embodiments.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The embodiments described in the following exemplary embodiments do not represent all embodiments consistent with the present application. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the present application, as detailed in the appended claims.
For a clear understanding of the technical solutions of the present application, a detailed description of the prior art solutions is first provided.
With the development of remote sensing technology and information technology, multi-temporal remote sensing image change detection has become an important direction for current remote sensing image analysis and research, and is widely applied to various fields of social economy, such as disaster monitoring and assessment, analysis of land use conditions, investigation of water resource quality and geographical distribution conditions, planning and layout of cities and the like. Change Detection (CD) plays a very important role in accurately understanding surface changes in the use of existing remote sensing image data. The method can accurately and timely detect the change area from the remote sensing image, and has important significance for various works such as city management and planning, urbanization evaluation, post-disaster reconstruction and the like. And the newly-built buildings in the detected areas are combined with the areas where the buildings are located, so that the illegal buildings can be effectively detected, and the urban standardized management is facilitated. Analyzing the change density in the remote sensing image facilitates understanding of the development speed of each area and the density area of building changes, which is helpful for estimating the development process of the area and evaluating the development degree of the area. By positioning the area with severe change, whether the area suffering from natural disasters such as earthquake is rebuilt according to a planning mode can be judged, and the process of post-disaster reconstruction is supervised. In the research of the multi-temporal remote sensing image change detection method, a common detection method is to classify different surface features in the remote sensing image, construct difference images for the different surface features, and determine a change class and a non-change class by using a threshold value method. The other method is based on manual marking, and two remote sensing images are compared in a manual mode, and a changed area is marked.
The existing classify-then-compare approach places low requirements on threshold selection, but it suffers from the accumulation of classification errors: image quality introduces errors into the classification result, and these errors reduce the accuracy of change detection. Moreover, because the remote sensing images contain noise, the generated difference image also contains noise, so the difference image cannot faithfully express the change between the remote sensing images, and the final change detection result is inaccurate. In the manual labeling approach, buildings are densely distributed, so inspectors must compare every area for changes; because remote sensing images have low contrast, urban buildings are densely distributed and the area to be examined is large, manually labeling the changes in the remote sensing images consumes a great deal of time and effort, and the manual labels themselves can be inaccurate.
Therefore, aiming at the problem that the change detection result obtained by the prior-art remote sensing image detection approach is inaccurate, research led to the following scheme: a first time-phase remote sensing image and a second time-phase remote sensing image acquired in the same area at different times are obtained; the frequency domain features of the two images are extracted with a non-subsampled Contourlet transform algorithm to obtain a plurality of first feature sub-graphs and a plurality of second feature sub-graphs; a plurality of difference image maps are determined from the first and second feature sub-graphs; for each difference image map, a layer change region and a layer non-change region are determined with a first Gaussian mixture model; and the layer change regions and layer non-change regions of all difference image maps are fused with a hidden Markov model to obtain the change region and non-change region of the second time-phase remote sensing image relative to the first time-phase remote sensing image. Compared with the prior art, this scheme yields a more accurate change detection result and better characterizes the change of the remote sensing image.
Therefore, the inventor proposes a technical scheme of the embodiment of the invention based on the above creative discovery. The network architecture and the application scenario of the remote sensing image change detection method provided by the embodiment of the invention are introduced below.
As shown in fig. 1, a network architecture corresponding to the method for detecting a change in a remote sensing image according to an embodiment of the present invention includes: electronic device 1 and server 2. The electronic device 1 is in communication connection with the server 2. A user clicks a change detection key in an operation interface of the electronic equipment 1, so that a change detection instruction is triggered, and the electronic equipment 1 acquires a first time phase remote sensing image and a second time phase remote sensing image which are acquired in the same area at different times from the server 2 according to the change detection instruction; respectively extracting frequency domain characteristics of the first time phase remote sensing image and the second time phase remote sensing image by adopting a non-subsampled Contourlet transform algorithm so as to respectively obtain a plurality of first characteristic sub-images and a plurality of second characteristic sub-images; determining a plurality of difference image maps according to the plurality of first characteristic sub-maps and the plurality of second characteristic sub-maps; determining a layer change area and a layer non-change area corresponding to each difference image map by adopting a first Gaussian mixture model; and fusing the layer change region and the layer non-change region corresponding to each difference image map by adopting a hidden Markov model to obtain the change region and the non-change region of the second time-phase remote sensing image relative to the first time-phase remote sensing image. Compared with the prior art, the method can obtain a relatively accurate change detection result, and can better represent the change of the remote sensing image.
Embodiments of the present invention will be described in detail below with reference to the accompanying drawings.
Example one
Fig. 2 is a schematic flow chart of a remote sensing image change detection method according to an embodiment of the present invention, and as shown in fig. 2, an execution subject of the remote sensing image change detection method according to the present embodiment is a remote sensing image change detection device, and the remote sensing image change detection device is located in an electronic device, the remote sensing image change detection method according to the present embodiment includes the following steps:
step 101, acquiring a first time phase remote sensing image and a second time phase remote sensing image acquired in different time in the same region.
In this embodiment, time-phase remote sensing data of the same area at different times is acquired. Time-phase remote sensing data is an image data set formed regularly over different time periods, such as daily, ten-day, monthly, seasonal or yearly periods; remote sensing data with this added time dimension is called time-phase remote sensing data. A first time-phase remote sensing image and a second time-phase remote sensing image acquired in the same area at different times are obtained from the time-phase remote sensing data; they are remote sensing images of the same location at different times. For example, if a flood occurs in a certain area, then in order to analyze the area affected by the flood, the first time-phase remote sensing image acquired before the flood and the second time-phase remote sensing image acquired after the flood are obtained.
And 102, respectively extracting the frequency domain characteristics of the first time phase remote sensing image and the second time phase remote sensing image by adopting a non-subsampled Contourlet transform algorithm to respectively obtain a plurality of first characteristic subgraphs and a plurality of second characteristic subgraphs.
In this embodiment, feature extraction is performed on the first time-phase remote sensing image and the second time-phase remote sensing image using a non-subsampled Contourlet transform algorithm, which consists of a non-subsampled pyramid filter bank (NSPFB) and a non-subsampled directional filter bank (NSDFB). The NSDFB is based on a fan-shaped directional filter bank: the decomposed subband images are filtered in fan-shaped regions of the frequency domain, so the image is decomposed along more precise directions in the corresponding frequency domain images and detail features in all directions are retained. The NSPFB iteratively upsamples the previous-level filters and filters the previous low-frequency subband image again, so that each decomposition level produces one low-frequency subband image and one high-frequency subband image. The first time-phase remote sensing image is decomposed with the non-subsampled Contourlet transform algorithm to obtain a plurality of first feature sub-graphs, and the second time-phase remote sensing image is decomposed in the same way to obtain a plurality of second feature sub-graphs. A simplified sketch of this kind of decomposition is given below.
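As a rough illustration of this decomposition, the following Python sketch splits an image into same-sized low-frequency and high-frequency subbands over two levels using a simple à trous binomial filter. It is an illustrative assumption only, not the patent's implementation, and it omits the directional (NSDFB) filtering stage.

```python
import numpy as np
from scipy import ndimage

def pyramid_level(image, level):
    """One non-subsampled pyramid-style decomposition level (illustrative only):
    smooth with a dilated ("a trous") lowpass kernel and keep the residual as the
    high-frequency subband. Both outputs keep the resolution of the input."""
    base = np.array([1.0, 2.0, 1.0])                 # 1-D binomial lowpass kernel
    kernel = np.zeros(2 * (2 ** level) + 1)
    kernel[:: 2 ** level] = base                     # holes grow with the level, so no downsampling
    kernel /= kernel.sum()
    low = ndimage.convolve1d(image, kernel, axis=0, mode="reflect")
    low = ndimage.convolve1d(low, kernel, axis=1, mode="reflect")
    high = image - low                               # high-frequency subband
    return low, high

# Two decomposition levels, as in Fig. 3: each level further decomposes the previous low band.
img = np.random.rand(256, 256)                       # stand-in for one time-phase image
low1, high1 = pyramid_level(img, level=0)
low2, high2 = pyramid_level(low1, level=1)
feature_subgraphs = [high1, high2, low2]             # analogue of a set of feature sub-graphs
```

Because no downsampling is performed, every subband keeps the size of the input image, which is what later allows per-pixel difference images to be formed directly.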
Step 103, determining a plurality of difference images according to the plurality of first characteristic subgraphs and the plurality of second characteristic subgraphs.
In this embodiment, a plurality of difference images are determined according to the plurality of first characteristic sub-images and the plurality of second characteristic sub-images, wherein the difference images can be represented by gray-scale histograms.
And step 104, determining a layer change area and a layer non-change area corresponding to each difference image map by adopting a first Gaussian mixture model.
In this embodiment, a layer change region and a layer non-change region corresponding to each difference image map are determined by using a first gaussian mixture model, where each difference image map includes a corresponding layer change region, a corresponding layer non-change region, and a region to be detected, and the region to be detected is a region that cannot be directly determined as a layer change region or a layer non-change region.
And 105, fusing the layer change region and the layer non-change region corresponding to each difference image map by using a hidden Markov model to obtain the change region and the non-change region of the second time-phase remote sensing image relative to the first time-phase remote sensing image.
In this embodiment, a hidden markov model is used to fuse the layer change region and the layer non-change region corresponding to each difference image map, and a hidden markov tree model may be used to fuse the layer change region and the layer non-change region to obtain a change region and a non-change region of the second time-phase remote sensing image relative to the first time-phase remote sensing image.
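The patent performs this fusion with a hidden Markov (tree) model over the subband layers. As a much simpler stand-in that only illustrates the data flow of step 105, the sketch below combines the per-layer boolean change masks by per-pixel majority vote; the voting rule and names are assumptions, not the patent's fusion method.

```python
import numpy as np

def fuse_layer_masks(layer_change_masks):
    """Majority-vote fusion of per-subband boolean change masks.
    A simplified stand-in for the hidden Markov model fusion of step 105."""
    stack = np.stack(layer_change_masks, axis=0)     # shape: (n_layers, H, W)
    votes = stack.sum(axis=0)                        # how many layers flag each pixel as changed
    changed = votes > stack.shape[0] / 2             # final change region
    return changed, ~changed                         # change region, non-change region

# Stand-in: three per-subband layer change masks
masks = [np.random.rand(256, 256) > 0.5 for _ in range(3)]
change_region, non_change_region = fuse_layer_masks(masks)
```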
In this embodiment, a first time-phase remote sensing image and a second time-phase remote sensing image acquired in the same area at different times are obtained; the frequency domain features of the two images are extracted with a non-subsampled Contourlet transform algorithm to obtain a plurality of first feature sub-graphs and a plurality of second feature sub-graphs; a plurality of difference image maps are determined from the first and second feature sub-graphs; for each difference image map, a layer change region and a layer non-change region are determined with a first Gaussian mixture model; and the layer change regions and layer non-change regions of all difference image maps are fused with a hidden Markov model to obtain the change region and non-change region of the second time-phase remote sensing image relative to the first time-phase remote sensing image. Compared with the prior art, a more accurate change detection result can be obtained, and the change of the remote sensing image can be better characterized.
Example two
Fig. 3 is a schematic diagram of feature subgraph extraction of the remote sensing image change detection method provided in the second embodiment of the present invention, and on the basis of the remote sensing image change detection method provided in the first embodiment of the present invention, step 102 is further refined, specifically including the following steps:
and 1021, respectively extracting high-frequency features and low-frequency features corresponding to the two time-phase remote sensing images by adopting NSPFB (non-subsampled pulse frequency domain) to respectively obtain a first high-frequency sub-band feature sub-image and a first low-frequency sub-band feature sub-image.
In this embodiment, the non-downsampling Contourlet transform algorithm includes a non-downsampling pyramid filter bank NSPFB and a non-downsampling direction filter bank NSDFB, as shown in fig. 3, the NSPFB is used to filter the first time-phase remote sensing image and the second time-phase remote sensing image, respectively extract the high-frequency features and the low-frequency features corresponding to the two time-phase remote sensing images, obtain a first high-frequency sub-band feature sub-graph and a first low-frequency sub-band feature sub-graph corresponding to the first time-phase remote sensing image, and obtain a first high-frequency sub-band feature sub-graph and a first low-frequency sub-band feature sub-graph corresponding to the second time-phase remote sensing image.
And 1022, respectively extracting the high-frequency features of the first high-frequency sub-band feature sub-images corresponding to the two time-phase remote sensing images by using the NSDFB to respectively obtain first high-frequency direction sub-band feature sub-images.
In this embodiment, the NSDFB is used for filtering: the high-frequency features of the first high-frequency sub-band feature sub-graphs corresponding to the two time-phase remote sensing images are respectively extracted to obtain the first high-frequency direction sub-band feature sub-graph corresponding to the first time-phase remote sensing image and the first high-frequency direction sub-band feature sub-graph corresponding to the second time-phase remote sensing image.
And 1023, respectively extracting the high-frequency features and low-frequency features of the first low-frequency sub-band feature sub-graphs corresponding to the two time-phase remote sensing images by adopting the NSPFB (non-subsampled pyramid filter bank) to respectively obtain a second high-frequency sub-band feature sub-graph and a second low-frequency sub-band feature sub-graph.
In this embodiment, NSPFB is used to perform filtering processing, and high-frequency features and low-frequency features of a first low-frequency sub-band feature sub-image corresponding to two time-phase remote sensing images are respectively extracted to obtain a second high-frequency sub-band feature sub-image and a second low-frequency sub-band feature sub-image corresponding to the first time-phase remote sensing image, and to obtain a second high-frequency sub-band feature sub-image and a second low-frequency sub-band feature sub-image corresponding to the second time-phase remote sensing image.
And 1024, respectively extracting the high-frequency features of the second high-frequency sub-band feature sub-graphs corresponding to the two time-phase remote sensing images by adopting the NSDFB so as to respectively obtain second high-frequency direction sub-band feature sub-graphs.
In this embodiment, the NSDFB is adopted for filtering: the high-frequency features of the second high-frequency sub-band feature sub-graphs corresponding to the two time-phase remote sensing images are respectively extracted to obtain the second high-frequency direction sub-band feature sub-graph corresponding to the first time-phase remote sensing image and the second high-frequency direction sub-band feature sub-graph corresponding to the second time-phase remote sensing image.
And 1025, determining the first high-frequency direction sub-band characteristic subgraph, the second high-frequency direction sub-band characteristic subgraph and the second low-frequency sub-band characteristic subgraph corresponding to the first time-phase remote sensing image into a plurality of first characteristic subgraphs.
In this embodiment, as shown in fig. 3, a first high-frequency direction sub-band feature sub-graph, a second high-frequency direction sub-band feature sub-graph, and a second low-frequency sub-band feature sub-graph corresponding to the first time-phase remote sensing image are taken as a plurality of first feature sub-graphs.
And step 1026, determining the first high-frequency direction sub-band feature subgraph, the second high-frequency direction sub-band feature subgraph and the second low-frequency sub-band feature subgraph corresponding to the second time-phase remote sensing image as a plurality of second feature subgraphs.
In this embodiment, the first high-frequency direction sub-band feature subgraph, the second high-frequency direction sub-band feature subgraph and the second low-frequency sub-band feature subgraph corresponding to the second time-phase remote sensing image are taken as a plurality of second feature subgraphs.
In this embodiment, by performing filtering processing for multiple times, noise can be effectively reduced, and interference factors can be reduced.
EXAMPLE III
Fig. 4 is a schematic diagram of feature subgraph extraction of the remote sensing image change detection method provided by the third embodiment of the invention, and on the basis of the remote sensing image change detection method provided by the first embodiment of the invention, step 102 is further refined, specifically including the following steps:
step 102a, respectively extracting high-frequency features and low-frequency features corresponding to the two time-phase remote sensing images by using the NSPFB (non-subsampled pyramid filter bank) to respectively obtain a first high-frequency sub-band feature sub-image and a first low-frequency sub-band feature sub-image.
In this embodiment, as shown in fig. 4 (which illustrates the decomposition of the first time-phase remote sensing image), the non-subsampled Contourlet transform algorithm includes a non-subsampled pyramid filter bank NSPFB and a non-subsampled directional filter bank NSDFB. The NSPFB is used to filter the first time-phase remote sensing image and the second time-phase remote sensing image respectively, and the high-frequency features and low-frequency features corresponding to the two time-phase remote sensing images are respectively extracted to obtain a first high-frequency sub-band feature sub-graph and a first low-frequency sub-band feature sub-graph corresponding to the first time-phase remote sensing image, and a first high-frequency sub-band feature sub-graph and a first low-frequency sub-band feature sub-graph corresponding to the second time-phase remote sensing image.
And step 102b, respectively extracting the high-frequency characteristics of the first high-frequency sub-band characteristic subgraph corresponding to the two time-phase remote sensing images by adopting the NSDFB so as to respectively obtain first high-frequency direction sub-band characteristic subgraphs.
In this embodiment, the NSDFB is used for filtering: the high-frequency features of the first high-frequency sub-band feature sub-graphs corresponding to the two time-phase remote sensing images are respectively extracted to obtain the first high-frequency direction sub-band feature sub-graph corresponding to the first time-phase remote sensing image and the first high-frequency direction sub-band feature sub-graph corresponding to the second time-phase remote sensing image.
And step 102c, respectively extracting the high-frequency characteristic and the low-frequency characteristic of the first low-frequency sub-band characteristic subgraph corresponding to the two time-phase remote sensing images by adopting the NSPFB (non-subsampled pyramid filter bank) to respectively obtain a second high-frequency sub-band characteristic subgraph and a second low-frequency sub-band characteristic subgraph.
In this embodiment, NSPFB is used to perform filtering processing, and a high-frequency feature and a low-frequency feature of a first low-frequency sub-band feature sub-image corresponding to two time-phase remote sensing images are respectively extracted to obtain a second high-frequency sub-band feature sub-image and a second low-frequency sub-band feature sub-image corresponding to the first time-phase remote sensing image, and to obtain a second high-frequency sub-band feature sub-image and a second low-frequency sub-band feature sub-image corresponding to the second time-phase remote sensing image.
And step 102d, respectively extracting the high-frequency characteristics of the second high-frequency sub-band characteristic subgraphs corresponding to the two time-phase remote sensing images by adopting the NSDFB so as to respectively obtain second high-frequency direction sub-band characteristic subgraphs.
In this embodiment, the NSDFB is adopted for filtering: the high-frequency features of the second high-frequency sub-band feature sub-graphs corresponding to the two time-phase remote sensing images are respectively extracted to obtain the second high-frequency direction sub-band feature sub-graph corresponding to the first time-phase remote sensing image and the second high-frequency direction sub-band feature sub-graph corresponding to the second time-phase remote sensing image.
And step 102e, respectively extracting the high-frequency characteristics and low-frequency characteristics of the second low-frequency sub-band characteristic sub-graph corresponding to the two time-phase remote sensing images by adopting the NSPFB (non-subsampled pyramid filter bank) to respectively obtain a third high-frequency sub-band characteristic sub-graph and a third low-frequency sub-band characteristic sub-graph.
In this embodiment, NSPFB is used to perform filtering processing, and high-frequency features and low-frequency features of a second low-frequency sub-band feature sub-image corresponding to two time-phase remote sensing images are respectively extracted to obtain a third high-frequency sub-band feature sub-image and a third low-frequency sub-band feature sub-image corresponding to a first time-phase remote sensing image, and obtain a third high-frequency sub-band feature sub-image and a third low-frequency sub-band feature sub-image corresponding to a second time-phase remote sensing image.
And step 102f, respectively extracting the high-frequency characteristics of a third high-frequency sub-band characteristic sub-graph corresponding to the two time-phase remote sensing images by adopting NSDFB so as to respectively obtain a third high-frequency direction sub-band characteristic sub-graph.
In this embodiment, the NSDFB is adopted to perform filtering processing, and high-frequency features of third high-frequency subband feature subgraphs corresponding to the two time-phase remote sensing images are respectively extracted, so that a third high-frequency directional subband feature subgraph corresponding to the first time-phase remote sensing image is obtained, and a third high-frequency directional subband feature subgraph corresponding to the second time-phase remote sensing image is obtained.
And step 102g, determining a first high-frequency direction sub-band characteristic subgraph, a second high-frequency direction sub-band characteristic subgraph, a third high-frequency direction sub-band characteristic subgraph and a third low-frequency sub-band characteristic subgraph corresponding to the first time-phase remote sensing image as a plurality of first characteristic subgraphs.
In this embodiment, a first high-frequency direction sub-band feature sub-graph, a second high-frequency direction sub-band feature sub-graph, a third high-frequency direction sub-band feature sub-graph and a third low-frequency sub-band feature sub-graph corresponding to the first time-phase remote sensing image are taken as a plurality of first feature sub-graphs.
And step 102h, determining a first high-frequency direction sub-band feature subgraph, a second high-frequency direction sub-band feature subgraph, a third high-frequency direction sub-band feature subgraph and a third low-frequency sub-band feature subgraph corresponding to the second time-phase remote sensing image as a plurality of second feature subgraphs.
In this embodiment, the first high-frequency direction sub-band feature sub-graph, the second high-frequency direction sub-band feature sub-graph, the third high-frequency direction sub-band feature sub-graph and the third low-frequency sub-band feature sub-graph corresponding to the second time-phase remote sensing image are taken as the plurality of second feature sub-graphs.
In this embodiment, by performing filtering processing for multiple times, noise can be effectively reduced, and interference factors can be reduced.
Example four
Fig. 5 is a schematic flow chart of a remote sensing image change detection method provided by the fourth embodiment of the present invention, and as shown in fig. 5, on the basis of the remote sensing image change detection method provided by the first embodiment of the present invention, step 103 is further refined, which specifically includes the following steps:
step 1031, dividing the plurality of first feature subgraphs and the plurality of second feature subgraphs into a plurality of groups of feature subgraphs according to the corresponding relationship.
In this embodiment, the plurality of first feature sub-graphs and the plurality of second feature sub-graphs are divided into a plurality of groups of feature sub-graphs according to their correspondence. As shown in fig. 3, the first feature sub-graphs corresponding to the first time-phase remote sensing image include its first high-frequency direction sub-band feature sub-graph, second high-frequency direction sub-band feature sub-graph and second low-frequency sub-band feature sub-graph, and the second feature sub-graphs corresponding to the second time-phase remote sensing image include the same three sub-graphs for the second image. The first high-frequency direction sub-band feature sub-graphs of the two images are placed in one group, the second high-frequency direction sub-band feature sub-graphs of the two images are placed in one group, and the second low-frequency sub-band feature sub-graphs of the two images are placed in one group.
Optionally, as shown in fig. 4, the first feature sub-graphs corresponding to the first time-phase remote sensing image include its first, second and third high-frequency direction sub-band feature sub-graphs and its third low-frequency sub-band feature sub-graph, and the second feature sub-graphs corresponding to the second time-phase remote sensing image include the same four sub-graphs for the second image. The corresponding sub-graphs of the two images are likewise grouped pairwise: the first high-frequency direction sub-band feature sub-graphs form one group, the second high-frequency direction sub-band feature sub-graphs form one group, the third high-frequency direction sub-band feature sub-graphs form one group, and the third low-frequency sub-band feature sub-graphs form one group. A sketch of this pairing is given below.
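A minimal sketch of this grouping, assuming the decomposition of both time-phase images yields their feature sub-graphs in the same subband order (variable names are illustrative):

```python
import numpy as np

# Stand-ins for the feature sub-graphs of the two time-phase images, listed in the same
# subband order, e.g. [first high-freq direction, second high-freq direction, second low-freq].
first_feature_subgraphs = [np.random.rand(256, 256) for _ in range(3)]
second_feature_subgraphs = [np.random.rand(256, 256) for _ in range(3)]

# One group per subband: the phase-1 sub-graph paired with its matching phase-2 sub-graph.
feature_groups = list(zip(first_feature_subgraphs, second_feature_subgraphs))
```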
Step 1032, for each group of feature sub-images, calculating a pixel difference value between the first feature sub-image and the corresponding second feature sub-image, and determining a difference image corresponding to the group of feature sub-images according to the pixel difference value.
In this embodiment, for each group of feature sub-graphs, the pixel difference value between the first feature sub-graph and the corresponding second feature sub-graph is calculated: the image parameters of the first feature sub-graph and of the second feature sub-graph are substituted into formula (1) to obtain the image parameters of the difference image, and the difference image is generated from those parameters. In formula (1), X¹_{l,k} and X²_{l,k} denote the image parameters of the first and the second feature sub-graph respectively, i.e. the non-subsampled Contourlet transform coefficients of the k-th subband of the l-th layer; D_{l,k} denotes the difference of the transform coefficients of the k-th subband of the l-th layer; R_{l,k} denotes the ratio of the transform coefficients of the k-th subband of the l-th layer; and μ_{l,k} and σ_{l,k} denote the gray-level mean and standard deviation of the k-th subband of the l-th layer. The image parameters of the difference image are computed from these quantities according to formula (1).
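The exact combination used in formula (1) is reproduced only as an image in the original publication; the numpy sketch below merely illustrates, under assumptions, how a per-subband difference map could be built from the coefficient difference and coefficient ratio defined above. The combining rule (a normalized absolute difference plus a log-ratio term) and the names are illustrative, not the patent's formula.

```python
import numpy as np

def subband_difference_map(x1, x2, eps=1e-6):
    """Per-subband difference image from the transform coefficients x1, x2 of the
    two time-phase feature sub-graphs. The combination below is an illustrative
    assumption, not the patent's formula (1)."""
    d = np.abs(x1 - x2)                          # difference of transform coefficients
    r = (x1 + eps) / (x2 + eps)                  # ratio of transform coefficients
    mu, sigma = d.mean(), d.std() + eps          # subband gray-level mean / standard deviation
    return (d - mu) / sigma + np.abs(np.log(np.abs(r) + eps))

# One pair of corresponding feature sub-graphs (stand-in data)
x1 = np.random.rand(256, 256)
x2 = np.random.rand(256, 256)
difference_map = subband_difference_map(x1, x2)
```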
EXAMPLE five
Fig. 6 is a schematic flow chart of a remote sensing image change detection method provided in the fifth embodiment of the present invention, and as shown in fig. 6, on the basis of the remote sensing image change detection method provided in the first embodiment of the present invention, step 104 is further refined, which specifically includes the following steps:
step 1041, for each difference image map, constructing a first difference distribution function corresponding to the difference image map, where the first difference distribution function is expressed in a form of a first mixture gaussian model corresponding to the two gaussian models.
In this embodiment, for each difference image map, a first difference distribution function corresponding to the difference image map is constructed, and actually, a difference image map boundary distribution curve is regarded as being formed by superimposing two gaussian models, the first difference distribution function is expressed in a form of a first mixed gaussian model corresponding to the two gaussian models, and the first difference distribution function is expressed as:
p_{l,k}(x) = π₁·N(x; μ₁, σ₁²) + π₂·N(x; μ₂, σ₂²), with π₁ + π₂ = 1,
where x is the feature value of the difference image, N(x; μ_m, σ_m²) is the m-th component (m = 1, 2) of the Gaussian mixture model for the k-th subband of the l-th layer, μ_m is the mean, σ_m² is the variance, and π₁ and π₂ are the coefficients of the two Gaussian models.
And 1042, calculating parameters of the first Gaussian mixture model by adopting a maximum expectation EM algorithm.
In this embodiment, the parameters of the first Gaussian mixture model are calculated with the EM algorithm, an iterative optimization strategy. Each iteration consists of two steps, an expectation step (E step) and a maximization step (M step), which is why the algorithm is called the EM algorithm (Expectation-Maximization algorithm). The E step is expressed as:
γ_n(m) = π_m·N(x_n; μ_m, σ_m²) / Σ_{j=1,2} π_j·N(x_n; μ_j, σ_j²), m = 1, 2,
where γ_n(m) is the posterior probability that sample x_n belongs to the m-th component of the Gaussian mixture model for the k-th subband of the l-th layer (m = 1, 2), μ_m is the mean, σ_m² is the variance, and π_m is the coefficient of the corresponding Gaussian model.
The M step is expressed as:
N_m = Σ_{n=1}^{N} γ_n(m), μ_m = (1/N_m)·Σ_{n=1}^{N} γ_n(m)·x_n, σ_m² = (1/N_m)·Σ_{n=1}^{N} γ_n(m)·(x_n − μ_m)², π_m = N_m / N,
where N is the number of sample points. The E and M steps are repeated until the algorithm converges, yielding the mean μ_{l,k,m}, the variance σ²_{l,k,m} and the Gaussian model coefficient π_{l,k,m} of the k-th subband of the l-th layer; the mean, the variance and the Gaussian model coefficient are the parameters of the first Gaussian mixture model.
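A compact numpy sketch of the EM iteration described above, fitting a two-component Gaussian mixture to the gray levels of one difference image. The initialisation and stopping rule are assumptions; this is a sketch, not the patent's implementation.

```python
import numpy as np

def fit_two_component_gmm(x, n_iter=100, tol=1e-6):
    """EM for a 1-D mixture of two Gaussians; x holds the gray levels of one difference image."""
    x = np.asarray(x, dtype=np.float64).ravel()
    mu = np.array([np.percentile(x, 25), np.percentile(x, 75)])   # crude initial means
    var = np.full(2, x.var() + 1e-6)
    pi = np.array([0.5, 0.5])
    prev_ll = -np.inf
    for _ in range(n_iter):
        # E step: posterior probability of each component for every sample
        dens = pi * np.exp(-(x[:, None] - mu) ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var)
        total = dens.sum(axis=1, keepdims=True) + 1e-300
        gamma = dens / total
        # M step: re-estimate means, variances and mixing coefficients
        n_m = gamma.sum(axis=0)
        mu = (gamma * x[:, None]).sum(axis=0) / n_m
        var = (gamma * (x[:, None] - mu) ** 2).sum(axis=0) / n_m + 1e-12
        pi = n_m / x.size
        log_likelihood = np.log(total).sum()
        if abs(log_likelihood - prev_ll) < tol:   # stop when the likelihood no longer improves
            break
        prev_ll = log_likelihood
    return mu, var, pi

# Stand-in difference-image gray levels drawn from two populations
gray_levels = np.concatenate([np.random.normal(0.2, 0.05, 5000),
                              np.random.normal(0.8, 0.10, 5000)])
means, variances, coefficients = fit_two_component_gmm(gray_levels)
```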
And 1043, determining a layer change region and a layer non-change region corresponding to the difference image map according to the parameters of the first Gaussian mixture model.
In this embodiment, an image rough-division threshold corresponding to the difference image map is determined according to the parameters of the first Gaussian mixture model; the significant change region, the significant non-change region and the region to be detected in the corresponding difference image map are determined according to the image rough-division threshold; the analysis change region and the analysis non-change region within the region to be detected are then determined; the analysis change region and the significant change region are determined as the corresponding layer change region, and the analysis non-change region and the significant non-change region are determined as the corresponding layer non-change region.
EXAMPLE six
Fig. 7 is a schematic flow chart of a remote sensing image change detection method provided in the sixth embodiment of the present invention, and as shown in fig. 7, on the basis of the remote sensing image change detection method provided in the fifth embodiment of the present invention, step 1043 is further refined, which specifically includes the following steps:
step 1043a, determining an image rough-dividing threshold in the corresponding difference image map according to the parameter of the first Gaussian mixture model.
In this embodiment, the parameters of the first Gaussian mixture model include the means, the variances and the Gaussian model coefficients, and the image rough-division threshold in the corresponding difference image map is obtained from these means, variances and Gaussian model coefficients.
And step 1043b, determining a variation threshold and a non-variation threshold in the corresponding difference image map according to the rough image classification threshold.
In this embodiment, the change threshold in the corresponding difference image map is determined according to the rough image classification threshold and the preset threshold of the area to be detected, and the non-change threshold in the corresponding difference image map is determined according to the rough image classification threshold and the preset threshold of the area to be detected, where the preset threshold of the area to be detected is an empirical value and can be set as needed.
And step 1043c, determining a significant variation region, a significant non-variation region and a region to be detected in the corresponding difference image map according to the variation threshold and the non-variation threshold.
In this embodiment, the significant change region in the corresponding difference image map is determined according to the change threshold: the region whose values are less than or equal to the change threshold is determined as the significant change region. The significant non-change region is determined according to the non-change threshold: the region whose values are greater than or equal to the non-change threshold is determined as the significant non-change region. The region to be detected is determined according to the change threshold and the non-change threshold: the region whose values are greater than the change threshold and smaller than the non-change threshold is determined as the region to be detected.
Step 1043d, determining the type of each pixel point in the area to be detected in the difference image map to determine an analysis change area and an analysis non-change area in the area to be detected.
In this embodiment, each pixel point category in the area to be detected in the differential image map includes a non-change pixel and a change pixel, and if a certain pixel point category in the area to be detected is a change pixel, the pixel point is classified as an analysis change area in the area to be detected; if the type of a certain pixel point in the region to be detected is a non-change pixel, classifying the pixel point into an analysis non-change region in the region to be detected.
Step 1043e, determining the significant variation area and the analysis variation area in the difference image map as corresponding layer variation areas, and determining the significant non-variation area and the analysis non-variation area in the difference image map as corresponding layer non-variation areas.
In this embodiment, the types of the pixel points in the regions to be detected in the differential image map are determined, so as to further determine an analysis change region and an analysis non-change region in the regions to be detected, determine the significant change region and the analysis change region in the regions to be detected in the differential image map as corresponding layer change regions, and determine the significant non-change region and the analysis non-change region in the regions to be detected in the differential image map as corresponding layer non-change regions.
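Putting the thresholds of this embodiment together, a short numpy sketch of the coarse partition might look as follows; the masks and names are illustrative, and the per-pixel classification of the region to be detected (step 1043d) is not shown.

```python
import numpy as np

def coarse_partition(diff_map, t_rough, alpha):
    """Split one difference image into significant-change, significant-non-change and
    to-be-detected regions using the rough-division threshold t_rough and the
    pre-configured margin alpha (boolean masks, illustrative only)."""
    t_change = t_rough - alpha                    # change threshold, cf. formula (6)
    t_unchanged = t_rough + alpha                 # non-change threshold, cf. formula (7)
    significant_change = diff_map <= t_change
    significant_unchanged = diff_map >= t_unchanged
    to_be_detected = ~significant_change & ~significant_unchanged
    return significant_change, significant_unchanged, to_be_detected

diff_map = np.random.rand(256, 256)               # stand-in difference image
sig_change, sig_unchanged, to_detect = coarse_partition(diff_map, t_rough=0.5, alpha=0.1)
```

Pixels in the to-be-detected mask would then be classified individually into the analysis change and analysis non-change regions before being merged with the significant regions into the layer change and layer non-change regions.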
EXAMPLE seven
Fig. 8 is a schematic flow chart of a remote sensing image change detection method provided by the seventh embodiment of the present invention, and as shown in fig. 8, on the basis of the remote sensing image change detection method provided by the sixth embodiment of the present invention, step 1043a is further refined, which specifically includes the following steps:
and 1043a1, substituting the parameters of the first Gaussian mixture model into the corresponding Gaussian model, and calculating the intersection point of the two Gaussian models.
In this embodiment, the means, variances and Gaussian model coefficients of the first Gaussian mixture model are substituted into formula (5) to obtain the intersection of the two Gaussian models, where formula (5) is expressed as:
π₁·N(x_T; μ₁, σ₁²) = π₂·N(x_T; μ₂, σ₂²),
where x_T is the intersection point, μ₁ and μ₂ are the means, σ₁² and σ₂² are the variances, and π₁ and π₂ are the Gaussian model coefficients.
Step 1043a2, determining the intersection point as the image rough-dividing threshold in the corresponding difference image map.
In this embodiment, the intersection point of the two Gaussian models is determined as the image rough-dividing threshold, and the variation threshold and the non-variation threshold are further determined based on this rough-dividing threshold.
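As an informal sketch (not the patent's own implementation), the intersection point of formula (5) could be computed numerically as follows; the function name and the example parameter values are illustrative assumptions:

```python
import numpy as np

def gaussian_intersection(mu1, var1, w1, mu2, var2, w2):
    """Solve w1*N(x; mu1, var1) = w2*N(x; mu2, var2) for x.

    Taking logarithms turns the equation into a quadratic a*x^2 + b*x + c = 0;
    the real root lying between the two means is preferred when it exists.
    """
    a = 0.5 * (1.0 / var1 - 1.0 / var2)
    b = mu2 / var2 - mu1 / var1
    c = (0.5 * (mu1**2 / var1 - mu2**2 / var2)
         + 0.5 * np.log(var1 / var2)
         + np.log(w2 / w1))
    if abs(a) < 1e-12:                     # equal variances: the equation is linear
        return np.array([-c / b])
    roots = np.roots([a, b, c])
    roots = roots[np.isreal(roots)].real
    between = roots[(roots >= min(mu1, mu2)) & (roots <= max(mu1, mu2))]
    return between if between.size else roots

# Illustrative rough-dividing threshold for one difference image map
T = float(gaussian_intersection(0.2, 0.01, 0.6, 0.8, 0.04, 0.4)[0])
```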
Example eight
Fig. 9 is a schematic flow chart of the method for detecting a change in a remote sensing image according to the eighth embodiment of the present invention, and as shown in fig. 9, on the basis of the method for detecting a change in a remote sensing image according to the sixth embodiment of the present invention, step 1043b is further refined, which specifically includes the following steps:
step 1043b1, acquiring a pre-configured threshold of the region to be detected.
In this embodiment, a pre-configured threshold of the area to be detected is obtained, where the threshold of the area to be detected is an empirical value and can be set as required.
Step 1043b2, calculating the difference between the image rough-dividing threshold and the threshold of the region to be detected, and determining the difference as the variation threshold in the corresponding difference image map.
In this embodiment, the variation threshold in the corresponding difference image map is determined according to the image rough-dividing threshold and the pre-configured threshold of the region to be detected. Specifically, the image rough-dividing threshold and the pre-configured threshold of the region to be detected are substituted into formula (6) to calculate the variation threshold in the corresponding difference image map, where formula (6) is expressed as:

$$T_{c}=T-\alpha \qquad (6)$$

wherein $T_{c}$ is the variation threshold in the difference image map, $T$ is the image rough-dividing threshold, and $\alpha$ is the pre-configured threshold of the region to be detected.
And step 1043b3, summing the image rough-dividing threshold and the threshold of the region to be detected, and determining the summation result as the non-variation threshold in the corresponding difference image map.
In this embodiment, the non-variation threshold in the corresponding difference image map is determined according to the image rough-dividing threshold and the pre-configured threshold of the region to be detected. Specifically, the image rough-dividing threshold and the pre-configured threshold of the region to be detected are substituted into formula (7) to calculate the non-variation threshold in the corresponding difference image map, where formula (7) is expressed as:

$$T_{u}=T+\alpha \qquad (7)$$

wherein $T_{u}$ is the non-variation threshold in the difference image map, $T$ is the image rough-dividing threshold, and $\alpha$ is the pre-configured threshold of the region to be detected.
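A minimal sketch of formulas (6) and (7), assuming the notation T, T_c, T_u and alpha used above; the numeric values in the usage line are illustrative only:

```python
def rough_to_thresholds(T, alpha):
    """Formulas (6) and (7): variation threshold T_c and non-variation threshold T_u."""
    T_c = T - alpha   # formula (6)
    T_u = T + alpha   # formula (7)
    return T_c, T_u

T_c, T_u = rough_to_thresholds(T=0.435, alpha=0.05)   # illustrative values
```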
Example nine
Fig. 10 is a schematic flow chart of the remote sensing image change detection method provided by the ninth embodiment of the present invention, and as shown in fig. 10, on the basis of the remote sensing image change detection method provided by the sixth embodiment of the present invention, a step 1043c is further refined, which specifically includes the following steps:
Step 1043c1, determining the region whose values are less than or equal to the variation threshold as the significant variation region in the corresponding difference image map.
In this embodiment, the difference image map can be represented by a gray level histogram. Referring to Fig. 11, which is the gray level histogram corresponding to a certain difference image map, $T_{c}$ denotes the variation threshold of the gray level histogram corresponding to the difference image map. The region whose gray values in the gray level histogram are less than or equal to the variation threshold is determined as the significant variation region in the corresponding difference image map; the pixel points in the significant variation region have changed, which indicates that these pixel points differ between the first characteristic sub-map and the second characteristic sub-map.
Step 1043c2, determining the region whose values are greater than or equal to the non-variation threshold as the significant non-variation region in the corresponding difference image map.
In this embodiment, the region whose gray values in the gray level histogram are greater than or equal to the non-variation threshold is determined as the significant non-variation region in the corresponding difference image map; the pixel points in the significant non-variation region have not changed, which indicates that these pixel points are the same in the first characteristic sub-map and the second characteristic sub-map.
Step 1043c3, determining the region whose values are greater than the variation threshold and less than the non-variation threshold as the region to be detected in the corresponding difference image map.
In this embodiment, the region whose gray values in the gray level histogram are greater than the variation threshold and less than the non-variation threshold is determined as the region to be detected in the corresponding difference image map. For the pixel points in the region to be detected, it is not yet determined whether they have changed, and the analysis change region and the analysis non-change region in the region to be detected need to be determined according to the category of each pixel point.
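For illustration, the three-way partition of step 1043c can be sketched with boolean masks as follows; the function name, array size and threshold values are assumptions made for the example:

```python
import numpy as np

def partition_difference_map(diff, T_c, T_u):
    """Split one difference image map into the three regions of step 1043c.

    Returns boolean masks: significant variation region (values <= T_c),
    significant non-variation region (values >= T_u), and the region to be
    detected (values strictly between T_c and T_u).
    """
    sig_change = diff <= T_c
    sig_nonchange = diff >= T_u
    to_detect = ~sig_change & ~sig_nonchange
    return sig_change, sig_nonchange, to_detect

# Illustrative usage on a random stand-in for a difference image map
diff = np.random.rand(64, 64)
sig_change, sig_nonchange, to_detect = partition_difference_map(diff, T_c=0.38, T_u=0.48)
```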
Example ten
On the basis of the remote sensing image change detection method provided by the sixth embodiment of the invention, a step 1043d is further refined, and the method specifically comprises the following steps:
step 1043d1, respectively constructing a difference distribution function for a change region and a non-change region in each difference image map, wherein the second difference distribution function of the change region is represented in the form of a second mixed gaussian model corresponding to the two gaussian models; the third difference distribution function of the unchanged region is expressed in the form of a third mixture gaussian model corresponding to the two gaussian models.
In this embodiment, a difference distribution function is respectively constructed for a change region and a non-change region in each difference image map, and the change region and the non-change region are respectively regarded as being formed by two gaussian models. And the second difference distribution function of the changed area is represented in the form of a second Gaussian mixture model corresponding to the two Gaussian models, and the third difference distribution function of the unchanged area is represented in the form of a third Gaussian mixture model corresponding to the two Gaussian models.
Step 1043d2, calculating parameters of the second Gaussian mixture model and parameters of the third Gaussian mixture model by using an error back propagation BP algorithm to determine a second difference distribution function and a third difference distribution function.
In this embodiment, the parameters of the second gaussian mixture model are calculated by using an error back propagation BP algorithm, and the parameters of the second gaussian mixture model include a corresponding mean value, a corresponding variance, and a corresponding proportional parameter corresponding to a corresponding pixel point, so as to determine a second difference distribution function. And calculating parameters of a third Gaussian mixture model by using an error Back Propagation (BP) algorithm, wherein the parameters of the third Gaussian mixture model comprise a corresponding mean value, a corresponding variance and a corresponding proportion parameter, so as to determine a third difference distribution function.
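The following sketch fits a two-component Gaussian mixture to the pixel values of each region; note that it uses scikit-learn's EM-based GaussianMixture only as a stand-in, whereas the patent describes fitting these parameters with an error back-propagation (BP) algorithm, which is not reproduced here. All names and data are illustrative:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def fit_two_component_gmm(values):
    """Fit a two-component 1-D Gaussian mixture to the pixel values of one region.

    EM fitting is used here as a stand-in for the BP fitting described in the text.
    """
    gmm = GaussianMixture(n_components=2, covariance_type="full", random_state=0)
    gmm.fit(values.reshape(-1, 1))
    return gmm

# Illustrative pixel values drawn from the change / non-change regions of one difference map
change_vals = np.random.normal(0.2, 0.10, 500)
nonchange_vals = np.random.normal(0.7, 0.15, 500)
gmm_change = fit_two_component_gmm(change_vals)        # second difference distribution function
gmm_nonchange = fit_two_component_gmm(nonchange_vals)  # third difference distribution function
```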
Step 1043d3, for the to-be-detected region in each difference image map, respectively calculating the class membership degree of the second differential distribution function corresponding to each pixel corresponding to the to-be-detected region and the class membership degree of the corresponding third differential distribution function.
In this embodiment, for the to-be-detected region in each differential image map, the class membership degree of the second differential distribution function corresponding to each pixel corresponding to the to-be-detected region is calculated, and the class membership degree of the third differential distribution function corresponding to each pixel corresponding to the to-be-detected region is calculated, where the second differential distribution function is a function corresponding to a changed region, the corresponding class membership degree of the second differential distribution function is a changed class membership degree, the third differential distribution function is a function corresponding to a non-changed region, and the corresponding class membership degree of the third differential distribution function is a non-changed class membership degree. The second differential distribution function and the third differential distribution function are expressed as:
$$p_{k}^{\,n,m}(x)=\sum_{j=1}^{2} w_{k,j}^{\,n,m}\,\mathcal{N}\!\left(x \mid \mu_{k,j}^{\,n,m},\left(\sigma_{k,j}^{\,n,m}\right)^{2}\right),\qquad k\in\{\omega_{c},\omega_{u}\} \qquad (9)$$

wherein $p_{\omega_{c}}^{\,n,m}(x)$ and $p_{\omega_{u}}^{\,n,m}(x)$ denote the second difference distribution function and the third difference distribution function of a sample pixel $x$ in the $m$-th subband of the $n$-th layer, $\omega_{c}$ denotes the change class and $\omega_{u}$ the non-change class, $w_{k,j}^{\,n,m}$ is the proportion of the $j$-th Gaussian model in the corresponding Gaussian mixture model, and $\mathcal{N}(x \mid \mu,\sigma^{2})$ represents a Gaussian model with mean $\mu$ and variance $\sigma^{2}$. The parameters $w$, $\mu$ and $\sigma$ of the Gaussian mixture models of the change region and the non-change region are respectively fitted by the BP algorithm, and the fitted parameters are substituted into formula (10) to calculate, for each pixel point $x$ corresponding to the region to be detected, the change-class membership degree $u_{\omega_{c}}^{\,n,m}(x)$ and the non-change-class membership degree $u_{\omega_{u}}^{\,n,m}(x)$ in the $m$-th subband of the $n$-th layer, where formula (10) is expressed as:

$$u_{k}^{\,n,m}(x)=\frac{p_{k}^{\,n,m}(x)}{p_{\omega_{c}}^{\,n,m}(x)+p_{\omega_{u}}^{\,n,m}(x)},\qquad k\in\{\omega_{c},\omega_{u}\} \qquad (10)$$
and step 1043d4, determining the category of each pixel point of the region to be detected according to the category membership degree of each pixel in the second differential distribution function and the third differential distribution function.
In this embodiment, the category of each pixel point of the area to be detected is determined according to the category membership degree of each pixel in the second differential distribution function and the category membership degree of each pixel in the third differential distribution function.
Example eleven
On the basis of the method for detecting the change of the remote sensing image provided by the tenth embodiment of the invention, the step 1043d4 is further refined, and the method specifically comprises the following steps:
step 1001, if it is determined that the class membership degree of a certain pixel point in the second differential distribution function is greater than or equal to the class membership degree in the third differential distribution function, determining that the pixel point class is a change class in the analysis change region.
In this embodiment, if it is determined that the class membership degree (change-class membership degree) of a certain pixel point in the second difference distribution function is greater than or equal to the class membership degree (non-change-class membership degree) in the third difference distribution function, it indicates that the pixel point has changed between the first characteristic sub-map and the corresponding second characteristic sub-map, and the category of the pixel point is the change class, belonging to the analysis change region. The hidden state of a pixel point in the $m$-th subband of the $n$-th layer is defined as $s^{\,n,m}(x)\in\{\omega_{c},\omega_{u}\}$, the two values representing the change class and the non-change class respectively, and the relation between the membership degrees and the hidden state (change class) is as follows:

$$s^{\,n,m}(x)=\omega_{c}\quad \text{if}\ \ u_{\omega_{c}}^{\,n,m}(x)\ge u_{\omega_{u}}^{\,n,m}(x)$$

wherein $u_{\omega_{c}}^{\,n,m}(x)$ is the change-class membership degree of the pixel point corresponding to the second difference distribution function, and $u_{\omega_{u}}^{\,n,m}(x)$ is the non-change-class membership degree corresponding to the third difference distribution function.
In step 1002, if it is determined that the class membership degree of a certain pixel point in the second differential distribution function is smaller than the corresponding class membership degree in the third differential distribution function, the pixel point class is determined to be a non-change class in the analysis non-change region.
In this embodiment, if it is determined that the class membership degree (change-class membership degree) of a certain pixel point in the second difference distribution function is smaller than the class membership degree (non-change-class membership degree) corresponding to the third difference distribution function, the category of the pixel point is the non-change class, belonging to the analysis non-change region. The relation between the membership degrees and the hidden state (non-change class) is as follows:

$$s^{\,n,m}(x)=\omega_{u}\quad \text{if}\ \ u_{\omega_{c}}^{\,n,m}(x)< u_{\omega_{u}}^{\,n,m}(x)$$

wherein $u_{\omega_{c}}^{\,n,m}(x)$ is the change-class membership degree of the pixel point corresponding to the second difference distribution function, and $u_{\omega_{u}}^{\,n,m}(x)$ is the non-change-class membership degree corresponding to the third difference distribution function.
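A small sketch of the membership computation and decision rule of examples ten and eleven, assuming the normalized form of the reconstructed formula (10); the mixture parameters and pixel values are illustrative assumptions:

```python
import numpy as np
from scipy.stats import norm

def mixture_density(x, weights, means, sigmas):
    """Two-component 1-D Gaussian mixture density, as in the reconstructed formula (9)."""
    return sum(w * norm.pdf(x, loc=m, scale=s) for w, m, s in zip(weights, means, sigmas))

def classify_to_detect(pixels, change_params, nonchange_params):
    """Membership degrees (formula (10), normalized form assumed) and the decision
    rule of example eleven: the change class wins ties (u_c >= u_u)."""
    p_c = mixture_density(pixels, *change_params)
    p_u = mixture_density(pixels, *nonchange_params)
    u_c = p_c / (p_c + p_u)          # change-class membership degree
    u_u = 1.0 - u_c                  # non-change-class membership degree
    return u_c >= u_u                # True -> analysis change region

# Illustrative parameters (weights, means, std devs) of the two fitted mixtures
change_params = ([0.6, 0.4], [0.15, 0.30], [0.05, 0.08])
nonchange_params = ([0.5, 0.5], [0.60, 0.85], [0.10, 0.12])
pixels = np.array([0.20, 0.45, 0.70])
is_change = classify_to_detect(pixels, change_params, nonchange_params)
```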
Example twelve
On the basis of the remote sensing image change detection method provided by the embodiment of the invention, the step 105 is further refined, and the method specifically comprises the following steps:
step 1051, fusing the membership degree of each pixel point of the layer change area and the membership degree of each pixel point of the layer non-change area corresponding to each difference image map by using a hidden Markov model, determining the actual pixel point category corresponding to each pixel point, and obtaining a corresponding binary image according to the actual category of each pixel point so as to represent the change area and the non-change area of the second time-phase remote sensing image relative to the first time-phase remote sensing image by the binary image.
In this embodiment, the plurality of difference image maps may be described by a hidden Markov tree model, where the initial hidden Markov tree model is represented as:

$$\theta=\{\pi, A, \mu, C\}$$

wherein $\pi$ is the joint probability, $A$ is the state transition probability, $\mu$ is the Gaussian mean, and $C$ is the covariance. These four initial parameters of $\theta$ need to be adjusted.
In this embodiment, the parameters of the hidden Markov tree model are trained with the EM algorithm, and the E step uses an "up-down algorithm", in which the upward process calculates the conditional probability function by feeding information from the finer scale back to the coarser scale, and the downward process calculates the joint probability function by transferring information from the coarser scale to the finer scale.
Wherein, step E includes:
wherein the state transition probability is expressed as:

$$\varepsilon_{o',o}^{\,n}=P\!\left(s^{\,n}=o \mid s^{\,n-1}=o'\right)$$

wherein $\varepsilon_{o',o}^{\,n}$ represents the probability that an $n$-th layer child node is in state $o$ when its parent node in the $(n-1)$-th layer is in state $o'$.
Wherein the joint probability of the hidden Markov tree model is expressed in terms of the membership degree of the difference image map when the category of a certain pixel point at the $n$-th layer is $o$, the initial hidden Markov tree model $\theta$, and the bottom-up joint probability function $\beta^{\,n}(o)$ and the conditional probability function $\alpha^{\,n}(o)$.
Wherein the joint probability density function $\beta^{\,n}(o)$ is calculated from the low-frequency layer $n=1$ to the high-frequency layer $n=3$ on the basis of the vector of membership values of class $o$ in the subbands of the $n$-layer difference images, and the conditional probability function $\alpha^{\,n}(o)$ is calculated in the opposite direction, from the high-frequency layer $n=3$ to the low-frequency layer $n=1$.
According to the probabilities of the hidden states calculated in the E step, the model parameters are updated in the M step, and the updated hidden Markov tree model is denoted as $\theta'$. After the EM iterations converge, the actual category of a certain pixel point $x$ is obtained as:

$$o^{*}(x)=\arg\max_{o} P\!\left(s(x)=o \mid u,\theta'\right)$$
in this embodiment, the updated hidden markov model is used to fuse the membership degree of each pixel point of the layer change region and the membership degree of each pixel point of the layer non-change region corresponding to each difference image map to obtain the actual pixel point category of each pixel point, where the actual pixel point category includes a change category or a non-change category, the change category pixel point is marked as 1, the non-change category pixel point is marked as 1, a binary map about the pixel point is generated, and the change region and the non-change region of the second time-phase remote sensing image relative to the first time-phase remote sensing image are represented by the binary map.
Fig. 12 is a schematic structural diagram of a remote sensing image change detection apparatus according to an embodiment of the present invention, and as shown in fig. 12, the remote sensing image change detection apparatus 200 according to the embodiment includes an acquisition unit 201, an extraction unit 202, a determination unit 203, and a fusion unit 204.
The acquiring unit 201 is configured to acquire a first time-phase remote sensing image and a second time-phase remote sensing image acquired in the same region at different times. The extracting unit 202 is configured to extract frequency domain features of the first time-phase remote sensing image and the second time-phase remote sensing image respectively by using a non-subsampled Contourlet transform algorithm, so as to obtain a plurality of first feature sub-images and a plurality of second feature sub-images respectively. The determining unit 203 is configured to determine a plurality of difference maps according to the plurality of first feature sub-maps and the plurality of second feature sub-maps. The determining unit 203 is further configured to determine, for each difference image map, a layer change region and a layer non-change region corresponding to the difference image map by using the first gaussian mixture model. And the fusion unit 204 is configured to fuse the layer change region and the layer non-change region corresponding to each difference image map by using a hidden markov model, so as to obtain a change region and a non-change region of the second time-phase remote sensing image relative to the first time-phase remote sensing image.
Optionally, the extracting unit is further configured to extract the high-frequency features and the low-frequency features corresponding to the two time-phase remote sensing images by using NSPFB, so as to obtain a first high-frequency sub-band feature sub-image and a first low-frequency sub-band feature sub-image respectively;
respectively extracting high-frequency characteristics of first high-frequency sub-band characteristic subgraphs corresponding to the two time-phase remote sensing images by adopting NSDFB (non-subsampled DFB) so as to respectively obtain first high-frequency direction sub-band characteristic subgraphs;
respectively extracting high-frequency features and low-frequency features of the first low-frequency sub-band feature sub-images corresponding to the two time-phase remote sensing images by adopting NSPFB, so as to respectively obtain a second high-frequency sub-band feature sub-image and a second low-frequency sub-band feature sub-image;
respectively extracting high-frequency characteristics of second high-frequency sub-band characteristic subgraphs corresponding to the two time-phase remote sensing images by adopting NSDFB (non-subsampled DFB) so as to respectively obtain second high-frequency direction sub-band characteristic subgraphs;
determining a first high-frequency direction sub-band characteristic subgraph, a second high-frequency direction sub-band characteristic subgraph and a second low-frequency sub-band characteristic subgraph corresponding to the first time-phase remote sensing image as a plurality of first characteristic subgraphs;
and determining a first high-frequency direction sub-band characteristic subgraph, a second high-frequency direction sub-band characteristic subgraph and a second low-frequency sub-band characteristic subgraph corresponding to the second time-phase remote sensing image as a plurality of second characteristic subgraphs.
Optionally, the determining unit is further configured to divide the plurality of first feature sub-images and the plurality of second feature sub-images into a plurality of groups of feature sub-images according to the correspondence; and aiming at each group of characteristic subgraphs, calculating the pixel difference value between the first characteristic subgraph and the corresponding second characteristic subgraph, and determining the difference image corresponding to the group of characteristic subgraphs according to the pixel difference value.
Optionally, the determining unit is further configured to construct, for each difference image map, a first difference distribution function corresponding to the difference image map, where the first difference distribution function is represented in the form of a first mixture gaussian model corresponding to the two gaussian models; calculating parameters of the first Gaussian mixture model by adopting a maximum expectation EM algorithm; and determining a layer change region and a layer non-change region corresponding to the difference image map according to the parameters of the first Gaussian mixture model.
Optionally, the determining unit is further configured to determine an image rough-dividing threshold in the corresponding difference image map according to a parameter of the first gaussian mixture model; determining a variable threshold and a non-variable threshold in the corresponding difference image map according to the rough image division threshold; determining a significant change area, a significant non-change area and an area to be detected in the corresponding difference image map according to the change threshold and the non-change threshold; determining the types of all pixel points in the areas to be detected in the difference image map so as to determine an analysis change area and an analysis non-change area in the areas to be detected; and determining the significant variation area and the analysis variation area in the difference image map as corresponding layer variation areas, and determining the significant non-variation area and the analysis non-variation area in the difference image map as corresponding layer non-variation areas.
Optionally, the determining unit is further configured to substitute the parameters of the first Gaussian mixture model into the corresponding Gaussian models, and calculate the intersection point of the two Gaussian models; the intersection point is determined as the image rough-dividing threshold in the corresponding difference image map.
Optionally, the determining unit is further configured to determine, as a significant variation region in the corresponding difference image map, a region corresponding to the threshold value being less than or equal to the variation threshold value; determining the area corresponding to the non-change threshold value or more as a significant non-change area in the corresponding difference image map; and determining the area corresponding to the threshold value which is larger than the variation threshold value and smaller than the non-variation threshold value as the area to be detected in the corresponding difference image map.
Optionally, the determining unit is further configured to respectively construct a difference distribution function for a changed region and a non-changed region in each difference image map, where a second difference distribution function of the changed region is represented in a form of a second gaussian mixture model corresponding to the two gaussian models; the third difference distribution function of the unchanged area is expressed in the form of a third mixed Gaussian model corresponding to the two Gaussian models; calculating parameters of the second Gaussian mixture model and parameters of the third Gaussian mixture model by adopting an error Back Propagation (BP) algorithm to determine a second difference distribution function and a third difference distribution function; respectively calculating the class membership degree of a second differential distribution function corresponding to each pixel corresponding to each region to be detected and the class membership degree of a corresponding third differential distribution function aiming at the region to be detected in each differential image map; and determining the category of each pixel point of the area to be detected according to the category membership degree of each pixel in the second differential distribution function and the third differential distribution function.
Optionally, the determining unit is further configured to determine that the category of the pixel point is a variation category in the analysis variation region if it is determined that the category membership degree of the pixel point corresponding to the second differential distribution function is greater than or equal to the category membership degree corresponding to the third differential distribution function; and if the class membership degree of a certain pixel point in the second differential distribution function is smaller than the corresponding class membership degree in the third differential distribution function, determining that the pixel point class is a non-change class in the analysis non-change area.
Optionally, the fusion unit is further configured to fuse the membership degree of each pixel in the layer change region and the membership degree of each pixel in the layer non-change region corresponding to each difference image map by using a hidden markov model, determine an actual pixel category corresponding to each pixel, and obtain a corresponding binary image according to the actual category of each pixel, so as to represent the change region and the non-change region of the second time-phase remote sensing image relative to the first time-phase remote sensing image by using the binary image.
Fig. 13 is a block diagram of an electronic device for implementing the remote sensing image change detection method according to the embodiment of the present invention, and as shown in fig. 13, the electronic device 300 includes: memory 301, processor 302.
The memory 301 stores computer-executable instructions;
the processor 302 executes computer-executable instructions stored by the memory 301 to cause the processor to perform a method provided by any of the embodiments described above.
In an exemplary embodiment, a computer-readable storage medium is also provided, in which computer-executable instructions are stored, the computer-executable instructions being executed by a processor to perform the method in any one of the above-mentioned embodiments.
In an exemplary embodiment, a computer program product is also provided, comprising a computer program for execution by a processor of the method in any of the above embodiments.
Other embodiments of the invention will be apparent to those skilled in the art from consideration of the specification and practice of the invention disclosed herein. This invention is intended to cover any variations, uses, or adaptations of the invention following, in general, the principles of the invention and including such departures from the present disclosure as come within known or customary practice within the art to which the invention pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the invention being indicated by the following claims.
It will be understood that the invention is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the invention is limited only by the appended claims.

Claims (13)

1. A method for detecting changes of remote sensing images is characterized by comprising the following steps:
acquiring a first time-phase remote sensing image and a second time-phase remote sensing image which are acquired in the same region at different times;
respectively extracting frequency domain characteristics of the first time phase remote sensing image and the second time phase remote sensing image by adopting a non-subsampled Contourlet transform algorithm so as to respectively obtain a plurality of first characteristic sub-images and a plurality of second characteristic sub-images;
determining a plurality of difference image maps according to the plurality of first characteristic sub-maps and the plurality of second characteristic sub-maps;
for each difference image map, determining a layer change region and a layer non-change region corresponding to the difference image map by adopting a first Gaussian mixture model;
and fusing the layer change region and the layer non-change region corresponding to each difference image map by adopting a hidden Markov model to obtain the change region and the non-change region of the second time-phase remote sensing image relative to the first time-phase remote sensing image.
2. The method according to claim 1, wherein the non-subsampled Contourlet transform algorithm comprises: a non-downsampling pyramid filter bank NSPFB and a non-downsampling direction filter bank NSDFB;
the method for respectively extracting the frequency domain characteristics of the first time-phase remote sensing image and the second time-phase remote sensing image by adopting the non-subsampled Contourlet transform algorithm to respectively obtain a plurality of first characteristic sub-images and a plurality of second characteristic sub-images comprises the following steps:
respectively extracting high-frequency features and low-frequency features corresponding to the two time-phase remote sensing images by adopting NSPFB, so as to respectively obtain a first high-frequency sub-band feature sub-image and a first low-frequency sub-band feature sub-image;
respectively extracting high-frequency characteristics of first high-frequency sub-band characteristic subgraphs corresponding to the two time-phase remote sensing images by adopting NSDFB (non-subsampled DFB) so as to respectively obtain first high-frequency direction sub-band characteristic subgraphs;
respectively extracting high-frequency features and low-frequency features of the first low-frequency sub-band feature sub-images corresponding to the two time-phase remote sensing images by adopting NSPFB, so as to respectively obtain a second high-frequency sub-band feature sub-image and a second low-frequency sub-band feature sub-image;
respectively extracting high-frequency characteristics of second high-frequency sub-band characteristic subgraphs corresponding to the two time-phase remote sensing images by adopting NSDFB (non-subsampled DFB) so as to respectively obtain second high-frequency direction sub-band characteristic subgraphs;
determining a first high-frequency direction sub-band characteristic subgraph, a second high-frequency direction sub-band characteristic subgraph and a second low-frequency sub-band characteristic subgraph corresponding to the first time-phase remote sensing image as a plurality of first characteristic subgraphs;
and determining a first high-frequency direction sub-band characteristic subgraph, a second high-frequency direction sub-band characteristic subgraph and a second low-frequency sub-band characteristic subgraph corresponding to the second time-phase remote sensing image as a plurality of second characteristic subgraphs.
3. The method of claim 1, wherein determining a plurality of difference image maps from the plurality of first feature sub-maps and the plurality of second feature sub-maps comprises:
dividing the plurality of first characteristic subgraphs and the plurality of second characteristic subgraphs into a plurality of groups of characteristic subgraphs according to the corresponding relation;
and aiming at each group of characteristic subgraphs, calculating the pixel difference value between the first characteristic subgraph and the corresponding second characteristic subgraph, and determining the difference image corresponding to the group of characteristic subgraphs according to the pixel difference value.
4. The method of claim 1, wherein for each difference image map, determining a layer-changed region and a layer-unchanged region corresponding to the difference image map by using a first Gaussian mixture model comprises:
for each difference image map, constructing a first difference distribution function corresponding to the difference image map, wherein the first difference distribution function is represented in the form of a first mixed Gaussian model corresponding to two Gaussian models;
calculating parameters of the first Gaussian mixture model by adopting a maximum expectation EM algorithm;
and determining a layer change region and a layer non-change region corresponding to the difference image map according to the parameters of the first Gaussian mixture model.
5. The method of claim 4, wherein determining the layer-changed region and the layer-unchanged region corresponding to the difference image map according to the parameters of the first Gaussian mixture model comprises:
determining an image rough-dividing threshold value in the corresponding difference image map according to the parameters of the first Gaussian mixture model;
determining a variable threshold value and a non-variable threshold value in the corresponding difference image map according to the image rough-dividing threshold value;
determining a significant variation area, a significant non-variation area and an area to be detected in the corresponding difference image map according to the variation threshold and the non-variation threshold;
determining the types of all pixel points in the areas to be detected in the difference image map so as to determine an analysis change area and an analysis non-change area in the areas to be detected;
and determining the significant variation area and the analysis variation area in the difference image map as corresponding layer variation areas, and determining the significant non-variation area and the analysis non-variation area in the difference image map as corresponding layer non-variation areas.
6. The method of claim 5, wherein determining an image rough-dividing threshold in the corresponding difference image map according to the parameters of the first Gaussian mixture model comprises:
substituting the parameters of the first Gaussian mixture model into the corresponding Gaussian model, and calculating the intersection point of the two Gaussian models;
and determining the intersection point as an image rough-dividing threshold value in the corresponding difference image map.
7. The method of claim 5, wherein determining a variable threshold value and a non-variable threshold value in the corresponding difference image map according to the image rough-dividing threshold value comprises:
acquiring a preset threshold value of a region to be detected;
calculating a difference value between the image rough-dividing threshold value and the threshold value of the area to be detected, and determining the difference value as a change threshold value in a corresponding difference image map;
and summing the image rough-dividing threshold value and the threshold value of the area to be detected, and determining the summation result as a non-change threshold value in the corresponding difference image map.
8. The method according to claim 5, wherein the determining the significant variation region, the significant non-variation region and the region to be detected in the corresponding difference image map according to the variation threshold and the non-variation threshold comprises:
determining the area corresponding to the threshold value smaller than or equal to the variation threshold value as a significant variation area in the corresponding difference image map;
determining the area corresponding to the non-change threshold value or more as a significant non-change area in the corresponding difference image map;
and determining the area corresponding to the threshold value which is larger than the variation threshold value and smaller than the non-variation threshold value as the area to be detected in the corresponding difference image map.
9. The method according to claim 5, wherein the determining the types of the pixels in the regions to be detected in the difference image to determine an analytic change region and an analytic non-change region in the regions to be detected comprises:
respectively constructing a difference distribution function aiming at a change region and a non-change region in each difference image map, wherein a second difference distribution function of the change region is represented in a form of a second mixed Gaussian model corresponding to the two Gaussian models; the third difference distribution function of the unchanged area is expressed in the form of a third mixed Gaussian model corresponding to the two Gaussian models;
calculating parameters of the second Gaussian mixture model and parameters of the third Gaussian mixture model by adopting an error Back Propagation (BP) algorithm to determine a second difference distribution function and a third difference distribution function;
respectively calculating the class membership degree of a second differential distribution function corresponding to each pixel corresponding to each region to be detected and the class membership degree of a corresponding third differential distribution function aiming at the region to be detected in each differential image map;
and determining the category of each pixel point of the area to be detected according to the category membership degree of each pixel in the second differential distribution function and the third differential distribution function.
10. The method according to claim 9, wherein determining the category of each pixel point of the region to be detected according to the category membership degree of each pixel in the second differential distribution function and the third differential distribution function comprises:
if the class membership degree of a certain pixel point in the second differential distribution function is larger than or equal to the class membership degree corresponding to the third differential distribution function, determining that the pixel point class is a change class in the analysis change area;
and if the class membership degree of a certain pixel point in the second differential distribution function is smaller than the corresponding class membership degree in the third differential distribution function, determining that the pixel point class is a non-change class in the analysis non-change area.
11. A remote sensing image change detection apparatus, characterized in that the apparatus comprises:
the acquisition unit is used for acquiring a first time phase remote sensing image and a second time phase remote sensing image which are acquired in the same region at different times;
the extraction unit is used for respectively extracting the frequency domain characteristics of the first time phase remote sensing image and the second time phase remote sensing image by adopting a non-subsampled Contourlet transformation algorithm so as to respectively obtain a plurality of first characteristic subgraphs and a plurality of second characteristic subgraphs;
the determining unit is used for determining a plurality of difference image maps according to the plurality of first characteristic subgraphs and the plurality of second characteristic subgraphs;
the determining unit is further used for determining a layer change area and a layer non-change area corresponding to each difference image map by adopting a first Gaussian mixture model;
and the fusion unit is used for fusing the layer change region and the layer non-change region corresponding to each difference image map by adopting a hidden Markov model so as to obtain the change region and the non-change region of the second time phase remote sensing image relative to the first time phase remote sensing image.
12. An electronic device, comprising: a processor, and a memory communicatively coupled to the processor;
the memory stores computer-executable instructions;
the processor executes computer-executable instructions stored by the memory to implement the method of any of claims 1 to 10.
13. A computer-readable storage medium having computer-executable instructions stored thereon, which when executed by a processor, are configured to implement the method of any one of claims 1 to 10.
CN202210328943.1A 2022-03-31 2022-03-31 Method, device and equipment for detecting change of remote sensing image and storage medium Active CN114419465B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210328943.1A CN114419465B (en) 2022-03-31 2022-03-31 Method, device and equipment for detecting change of remote sensing image and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210328943.1A CN114419465B (en) 2022-03-31 2022-03-31 Method, device and equipment for detecting change of remote sensing image and storage medium

Publications (2)

Publication Number Publication Date
CN114419465A true CN114419465A (en) 2022-04-29
CN114419465B CN114419465B (en) 2022-07-01

Family

ID=81263360

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210328943.1A Active CN114419465B (en) 2022-03-31 2022-03-31 Method, device and equipment for detecting change of remote sensing image and storage medium

Country Status (1)

Country Link
CN (1) CN114419465B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116363761A (en) * 2023-06-01 2023-06-30 深圳海清智元科技股份有限公司 Behavior recognition method and device based on image and electronic equipment

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102831598A (en) * 2012-07-04 2012-12-19 西安电子科技大学 Remote sensing image change detecting method with combination of multi-resolution NMF (non-negative matrix factorization) and Treelet
CN102867187A (en) * 2012-07-04 2013-01-09 西安电子科技大学 NSST (NonsubsampledShearlet Transform) domain MRF (Markov Random Field) and adaptive threshold fused remote sensing image change detection method
US20130336540A1 (en) * 2012-06-14 2013-12-19 Hitachi, Ltd. Decomposition apparatus and method for refining composition of mixed pixels in remote sensing images
CN107248172A (en) * 2016-09-27 2017-10-13 中国交通通信信息中心 A kind of remote sensing image variation detection method based on CVA and samples selection
CN111681197A (en) * 2020-06-12 2020-09-18 陕西科技大学 Remote sensing image unsupervised change detection method based on Siamese network structure

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130336540A1 (en) * 2012-06-14 2013-12-19 Hitachi, Ltd. Decomposition apparatus and method for refining composition of mixed pixels in remote sensing images
CN102831598A (en) * 2012-07-04 2012-12-19 西安电子科技大学 Remote sensing image change detecting method with combination of multi-resolution NMF (non-negative matrix factorization) and Treelet
CN102867187A (en) * 2012-07-04 2013-01-09 西安电子科技大学 NSST (NonsubsampledShearlet Transform) domain MRF (Markov Random Field) and adaptive threshold fused remote sensing image change detection method
CN107248172A (en) * 2016-09-27 2017-10-13 中国交通通信信息中心 A kind of remote sensing image variation detection method based on CVA and samples selection
CN111681197A (en) * 2020-06-12 2020-09-18 陕西科技大学 Remote sensing image unsupervised change detection method based on Siamese network structure

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
张凤玉: "Research on Change Detection Methods for Remote Sensing Images", China Excellent Master's Theses Full-text Database, 15 December 2010 (2010-12-15), pages 31 *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116363761A (en) * 2023-06-01 2023-06-30 深圳海清智元科技股份有限公司 Behavior recognition method and device based on image and electronic equipment
CN116363761B (en) * 2023-06-01 2023-08-18 深圳海清智元科技股份有限公司 Behavior recognition method and device based on image and electronic equipment

Also Published As

Publication number Publication date
CN114419465B (en) 2022-07-01

Similar Documents

Publication Publication Date Title
CN110287932B (en) Road blocking information extraction method based on deep learning image semantic segmentation
Fouedjio A hierarchical clustering method for multivariate geostatistical data
Martinis et al. Unsupervised extraction of flood-induced backscatter changes in SAR data using Markov image modeling on irregular graphs
Kussul et al. Grid system for flood extent extraction from satellite images
CN109871875B (en) Building change detection method based on deep learning
CN111161229B (en) Change detection method based on geometric active contour model and sparse self-coding
CN102629380B (en) Remote sensing image change detection method based on multi-group filtering and dimension reduction
CN109389062A (en) Utilize the method for High Resolution Spaceborne SAR image zooming-out lake land and water cut-off rule
Na et al. Object‐based large‐scale terrain classification combined with segmentation optimization and terrain features: A case study in China
CN114419465B (en) Method, device and equipment for detecting change of remote sensing image and storage medium
CN106960433B (en) It is a kind of that sonar image quality assessment method is referred to based on image entropy and the complete of edge
Aahlaad et al. An object-based image analysis of worldview-3 image for urban flood vulnerability assessment and dissemination through ESRI story maps
Zhao et al. Quantification of extensional uncertainty of segmented image objects by random sets
CN109558801B (en) Road network extraction method, medium, computer equipment and system
CN112348750B (en) SAR image change detection method based on threshold fusion and neighborhood voting
CN108648200A (en) A kind of indirect city high-resolution impervious surface extracting method
Zhang et al. Segmentation Scale Selection in geographic object-based image analysis
CN111210433B (en) Markov field remote sensing image segmentation method based on anisotropic potential function
Raju et al. Object Recognition in Remote Sensing Images Based on Modified Backpropagation Neural Network.
Li et al. Crack Detection and Recognition Model of Parts Based on Machine Vision.
Kim et al. A hybrid dasymetric mapping for population density surface using remote sensing data
CN109409375B (en) SAR image semantic segmentation method based on contour structure learning model
Widyaningrum et al. Tailored features for semantic segmentation with a DGCNN using free training samples of a colored airborne point cloud
Wu et al. Channel head extraction based on fuzzy unsupervised machine learning method
Brochet et al. Multivariate Emulation of Kilometer-Scale Numerical Weather Predictions with Generative Adversarial Networks: A Proof of Concept

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant
CP01 Change in the name or title of a patent holder

Address after: 518100 Guangdong Shenzhen Baoan District Xixiang street, Wutong Development Zone, Taihua Indus Industrial Park 8, 3 floor.

Patentee after: Shenzhen Haiqing Zhiyuan Technology Co.,Ltd.

Address before: 518100 Guangdong Shenzhen Baoan District Xixiang street, Wutong Development Zone, Taihua Indus Industrial Park 8, 3 floor.

Patentee before: SHENZHEN HIVT TECHNOLOGY Co.,Ltd.

CP01 Change in the name or title of a patent holder