CN104637027A - Time and space quantitative fusion method for remote sensing data considering nonlocal characteristics and temporal and spatial variation - Google Patents


Info

Publication number
CN104637027A
CN104637027A (application CN201510087994.XA)
Authority
CN
China
Prior art keywords
time
space
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201510087994.XA
Other languages
Chinese (zh)
Other versions
CN104637027B (en)
Inventor
沈焕锋 (Huanfeng Shen)
刘慧琴 (Huiqin Liu)
吴鹏海 (Penghai Wu)
袁强强 (Qiangqiang Yuan)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Wuhan University WHU
Original Assignee
Wuhan University WHU
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Wuhan University WHU filed Critical Wuhan University WHU
Priority to CN201510087994.XA priority Critical patent/CN104637027B/en
Publication of CN104637027A publication Critical patent/CN104637027A/en
Application granted granted Critical
Publication of CN104637027B publication Critical patent/CN104637027B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 3/00 Geometric image transformations in the plane of the image
    • G06T 3/40 Scaling of whole images or parts thereof, e.g. expanding or contracting
    • G06T 3/4038 Image mosaicing, e.g. composing plane images from plane sub-images

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)

Abstract

The invention discloses a space-time quantitative fusion method for remote sensing data that takes non-local characteristics and temporal and spatial variation into account. The method is based on non-local filtering and uses a moving-window technique, and comprises the following steps: first, similar pixels are selected with an empirical formula operating on small windows, and a secondary screening is then applied to them; the weight of each similar pixel is measured from the similarity of the neighborhood matrix centered on it and from its relative distance to the central pixel; with temporal and spatial variation taken into account, the weighting mode is chosen according to the dominant characteristics of the region; finally, the reflectance value of the central pixel is obtained by fusion. By exploiting the complementary information of high-spatial-resolution and high-temporal-resolution data and considering non-local characteristics and spatiotemporal variation, the method fuses data with both high spatial and high temporal resolution, and therefore offers higher accuracy and greater potential for practical application.

Description

Remote sensing data space-time quantitative fusion method considering non-local characteristics and space-time changes
Technical Field
The invention belongs to the field of remote sensing image fusion, and relates to a remote sensing data space-time quantitative fusion method considering non-local characteristics and space-time changes.
Background
Remote sensing monitoring with high resolution, long time series and high accuracy is of great significance to global change research, resource survey and management, environmental monitoring and related fields. However, because the design indices of a sensor constrain one another, spatial resolution and temporal resolution must be traded off against each other. To address this problem, space-time fusion techniques were developed: methods that use the complementary information of high-spatial-resolution and high-temporal-resolution satellite data to generate, by fusion, data with both high spatial and high temporal resolution. The fusion methods in common use are the classical spatiotemporal adaptive reflectance fusion method (STARFM) and an enhanced variant based on spectral unmixing theory. Each has its advantages and limitations: the classical method performs well in regions where change is mainly temporal but has difficulty predicting changes of spatial detail in heterogeneous regions, while the enhanced algorithm is better at predicting changes of spatial detail but less effective in temporally changing regions.
Disclosure of Invention
In view of the shortcomings of the prior art, the object of the invention is to provide a space-time quantitative fusion method that is based on the non-local idea and performs the fusion while taking both temporal and spatial change into account.
To capture regional characteristics better, the method predicts from two input image pairs. It combines non-local filtering with an empirical-formula similar-pixel screening based on small-window operations, performs a secondary screening of the similar pixels, adopts first-order non-local-mean similarity weights and, operating with a moving window, predicts the reflectance value of the window's central pixel as a weighted average of the similar pixels.
The technical scheme adopted by the invention is as follows: a remote sensing data space-time quantitative fusion method considering non-local characteristics and space-time changes is characterized by comprising the following steps:
step 1: inputting a pair of high and low spatial resolution images at time Tm and a pair of high and low spatial resolution images at time Tn, and preprocessing the input images, wherein the preprocessing comprises reprojection, resampling or clipping;
step 2: for the time-Tm and the time-Tn high/low spatial resolution image pairs respectively, screening similar pixels using an empirical formula in combination with the temporal difference $T_{ijk}$ and the spectral difference $S_{ijk}$;
step 3: based on non-local filtering, calculating the weight of each similar pixel from the similarity of the neighborhood matrix centered on that similar pixel and from its relative distance to the central pixel;
step 4: taking temporal change and spatial change into account, fusing regions dominated by temporal change in the classical-algorithm mode, and fusing regions dominated by spatial change in a time-weighted mode.
Preferably, the empirical formula described in step 2 is:
$$\left|\bar{F}(x_i, y_j, B) - \bar{F}(x_{w/2}, y_{w/2}, B)\right| \le d \times 2^{\bar{F}(x_{w/2},\, y_{w/2},\, B)} \qquad (1)$$
where $\bar{F}(x_i, y_j, B)$ is the reflectance of a neighboring pixel in the base-time high-spatial-resolution data, $\bar{F}(x_{w/2}, y_{w/2}, B)$ is the reflectance of the central pixel, $B$ is the band number, and $d$ is a free parameter.
Preferably, the screening process is performed over a small window, i.e. the comparison uses local mean values.
Preferably, step 3 is implemented as follows: based on non-local filtering, the weight of each similar pixel is calculated by a matrix-based method; the spatial filtering uses a first-order form, to which a relative-distance term is added:
$$W = \exp\left(-\frac{\bar{S} + \bar{T}}{h}\right) \times \frac{1}{D} \qquad (2)$$
where $\bar{S}$ and $\bar{T}$ are the means of the spectral difference and the temporal difference over the neighborhood matrix, respectively; the filtering parameter $h$ and the matrix window are tuned experimentally, with the optimal range of $h$ being [0.001, 0.1]; $D$ is the relative distance between the neighboring pixel $(x_i, y_j)$ and the central pixel.
Preferably, the regions of temporal change in step 4 are fused with the classical algorithm, using the formula:
$$F(x_i, y_j, t_k) = \sum_{i=1}^{w} \sum_{j=1}^{w} \sum_{k=1}^{n} W_{ijk} \times \left( F(x_i, y_j, t_0) + C(x_i, y_j, t_k) - C(x_i, y_j, t_0) \right) \qquad (3)$$
where $F$ and $C$ denote the reflectance of the high-spatial-resolution data and the high-temporal-resolution data respectively, $(x_i, y_j)$ is a given pixel position in both datasets, $t_0$ is the acquisition time of the base data, $t_k$ is the prediction time, $w$ is the search-window size, and $n$ is the number of input image pairs;
the regions of spatial change in step 4 are fused in a time-weighted mode, using the formula:
$$F(x_{w/2}, y_{w/2}, t_p) = F(x, y, t_0) + \sum w(x, y) \times a_{w/2} \times \left( C(x, y, t_p) - C(x, y, t_0) \right) \qquad (4)$$

where the linear regression coefficient $a_{w/2}$ of the similar pixels in the moving window is calculated first, the temporal-phase weight of each base time is then derived from it, and the final fusion result is obtained by weighting.
In this method, similar-pixel screening and weight calculation are based on non-local filtering; space-time change is taken into account by selecting the weighting mode that matches the dominant characteristics of the region; and the reflectance value of the central pixel is finally obtained by fusion. In short, the method of the invention can be used effectively for space-time quantitative fusion of remote sensing images and yields more accurate prediction results.
Drawings
FIG. 1 is a flow chart of an embodiment of the present invention.
Detailed Description
In order to help those of ordinary skill in the art understand and implement the present invention, the invention is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the embodiments described here are merely illustrative and explanatory and do not limit the invention.
Referring to Fig. 1, the space-time quantitative fusion method for remote sensing data considering non-local characteristics and space-time changes provided in this embodiment obtains the reflectance value of the central pixel by fusion, and comprises the following steps:
Step 1: input a pair of high and low spatial resolution images at time Tm and a pair of high and low spatial resolution images at time Tn, and preprocess the input images; the preprocessing comprises reprojection, resampling or clipping.
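Step 1 is standard georeferencing work. As a minimal sketch (assuming Python with the rasterio library; the file names and the bilinear resampling choice are illustrative, not prescribed by the patent), the coarse image can be reprojected and resampled onto the fine image's grid so that the two rasters align pixel for pixel:

```python
# Hedged sketch of step 1 (reprojection/resampling); file names are illustrative.
import numpy as np
import rasterio
from rasterio.warp import reproject, Resampling

with rasterio.open("fine_Tm.tif") as fine, rasterio.open("coarse_Tm.tif") as coarse:
    # Destination array on the fine grid: one plane per coarse band.
    aligned = np.zeros((coarse.count, fine.height, fine.width), dtype=np.float32)
    reproject(
        source=rasterio.band(coarse, list(range(1, coarse.count + 1))),
        destination=aligned,
        src_transform=coarse.transform,
        src_crs=coarse.crs,
        dst_transform=fine.transform,    # the fine grid defines CRS and cell size
        dst_crs=fine.crs,
        resampling=Resampling.bilinear,  # resampling method is an assumption
    )
    # Clipping both rasters to a common extent can follow via windowed reads.
```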
Step 2: for the time-Tm and the time-Tn high/low spatial resolution image pairs respectively, screen similar pixels using an empirical formula in combination with the temporal difference $T_{ijk}$ and the spectral difference $S_{ijk}$.
the empirical formula is:
$$\left|\bar{F}(x_i, y_j, B) - \bar{F}(x_{w/2}, y_{w/2}, B)\right| \le d \times 2^{\bar{F}(x_{w/2},\, y_{w/2},\, B)} \qquad (1)$$

where $\bar{F}(x_i, y_j, B)$ is the reflectance of a neighboring pixel in the base-time high-spatial-resolution data (for other products the parameter $d$ changes accordingly), $\bar{F}(x_{w/2}, y_{w/2}, B)$ is the reflectance of the central pixel, $B$ is the band number, and $d$ is a free parameter whose value varies within a small range for a given sensor. This preliminary screening inevitably admits some low-quality or unreliable pixels, so a further elimination is performed on the preliminarily screened pixels by introducing the temporal difference $T_{ijk}$ and the spectral difference $S_{ijk}$. The whole similar-pixel screening procedure also takes the non-local characteristics into account and is carried out within a small window, i.e. on mean values.
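A minimal sketch of this two-stage screening is given below (assumptions: a single band, numpy arrays for one $w \times w$ moving window already extracted, and a quantile rule for the secondary screening; the thresholds and helper names are illustrative, not fixed by the patent text):

```python
# Hedged sketch of the two-stage similar-pixel screening of step 2.
import numpy as np
from scipy.ndimage import uniform_filter

def screen_similar_pixels(fine, coarse_t0, coarse_tp, d=0.05, small=3, keep=0.5):
    """Boolean mask of similar pixels in one w x w window of band B."""
    w = fine.shape[0]
    c = w // 2
    # Small-window operation: Eq. (1) is applied to local means, not raw pixels.
    mean = uniform_filter(fine, size=small, mode="reflect")
    center = mean[c, c]
    primary = np.abs(mean - center) <= d * 2.0 ** center  # Eq. (1) as transcribed
    # Secondary screening via the temporal difference T_ijk and the spectral
    # difference S_ijk; keeping the best 'keep' fraction is an assumed rule.
    T = np.abs(coarse_tp - coarse_t0)   # temporal difference per pixel
    S = np.abs(fine - coarse_t0)        # spectral difference per pixel
    ok = (primary
          & (T <= np.quantile(T[primary], keep))
          & (S <= np.quantile(S[primary], keep)))
    ok[c, c] = True                     # the central pixel is always retained
    return ok
```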
Step 3: based on non-local filtering, calculate the weight of each similar pixel from the similarity of the neighborhood matrix centered on that similar pixel and from its relative distance to the central pixel.
Based on non-local filtering (non-local means), the weight of each similar pixel is calculated by a matrix-based method. Because computation with a second-order norm may over-smooth the result, so that excessive detail is lost in the fused image, the invention performs the spatial filtering in a first-order form and adds a relative-distance term, implemented as follows:
$$W = \exp\left(-\frac{\bar{S} + \bar{T}}{h}\right) \times \frac{1}{D} \qquad (2)$$
where $\bar{S}$ and $\bar{T}$ are the means of the spectral difference and the temporal difference over the neighborhood matrix, respectively; the filtering parameter $h$ and the matrix window are tuned experimentally, with the optimal range of $h$ being [0.001, 0.1]; $D$ is the relative distance between the neighboring pixel $(x_i, y_j)$ and the central pixel.
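A sketch of this weight computation follows (assumptions: `S` and `T` are per-pixel spectral and temporal difference arrays on the window grid, and the offset that keeps $D$ finite at the window center is an assumption, since the text only specifies the $1/D$ factor):

```python
# Hedged sketch of Eq. (2); the exact form of the distance term D is an assumption.
import numpy as np

def nonlocal_weight(S, T, i, j, c, h=0.01, r=1):
    """Unnormalized weight of similar pixel (i, j); c is the window-center index.

    Assumes (i, j) lies at least r pixels inside the window (pad otherwise).
    """
    # First-order similarity: mean absolute spectral/temporal difference over
    # the (2r+1) x (2r+1) neighborhood matrix centered on the similar pixel.
    S_bar = np.abs(S[i - r:i + r + 1, j - r:j + r + 1]).mean()
    T_bar = np.abs(T[i - r:i + r + 1, j - r:j + r + 1]).mean()
    # Relative distance to the center; the 1 + offset keeps D >= 1 so the center
    # pixel itself gets a finite weight.
    D = 1.0 + np.hypot(i - c, j - c) / c
    return np.exp(-(S_bar + T_bar) / h) / D

# The weights of all similar pixels are then normalized to sum to 1.
```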
Step 4: taking temporal change and spatial change into account, fuse regions dominated by temporal change in the classical-algorithm mode and regions dominated by spatial change in a time-weighted mode; the reflectance value of the central pixel is finally obtained by fusion.
Regions of temporal change, i.e. regions in which the land cover (phenology) changes substantially within a given period, are fused with the classical algorithm, using the formula:
$$F(x_i, y_j, t_k) = \sum_{i=1}^{w} \sum_{j=1}^{w} \sum_{k=1}^{n} W_{ijk} \times \left( F(x_i, y_j, t_0) + C(x_i, y_j, t_k) - C(x_i, y_j, t_0) \right) \qquad (3)$$
where $F$ and $C$ denote the reflectance of the high-spatial-resolution data and the high-temporal-resolution data respectively, $(x_i, y_j)$ is a given pixel position in both datasets, $t_0$ is the acquisition time of the base data, $t_k$ is the prediction time, $w$ is the search-window size, and $n$ is the number of input image pairs.
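A sketch of Eq. (3) for one window center is shown below (assumptions: the weights have already been normalized to sum to 1 over all similar pixels and image pairs, and the windows are numpy arrays; names are illustrative):

```python
# Hedged sketch of Eq. (3); W is assumed pre-normalized (total sum equals 1).
import numpy as np

def fuse_time_dominated(F0, C0, Ck, W):
    """Center-pixel prediction F(t_k) from n base image pairs.

    F0[k], C0[k] : w x w fine / coarse windows of pair k at base time t0
    Ck[k]        : w x w coarse window of pair k at prediction time t_k
    W[k]         : weights, non-zero only on similar pixels
    """
    pred = 0.0
    for k in range(len(F0)):
        # Base fine reflectance plus the temporal change seen by the coarse sensor.
        pred += np.sum(W[k] * (F0[k] + Ck[k] - C0[k]))
    return pred
```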
the fusion of the spatial variation area (the surface coverage (phenology) basically does not change with time or changes little, generally occurs in the spatial heterogeneous area) by adopting a time weighting mode, and the formula is as follows:
$$F(x_{w/2}, y_{w/2}, t_p) = F(x, y, t_0) + \sum w(x, y) \times a_{w/2} \times \left( C(x, y, t_p) - C(x, y, t_0) \right) \qquad (4)$$

The linear regression coefficient $a_{w/2}$ of the similar pixels in the moving window is calculated first; the temporal-phase weight of each base time is then derived from it, and the final fusion result is obtained by weighting.
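A sketch of Eq. (4) for one window center (assumptions: a single base time is shown for brevity, an ordinary least-squares fit is used for $a_{w/2}$, and the mask contains at least two similar pixels; the multi-base-time phase weighting is indicated only in the closing comment):

```python
# Hedged sketch of Eq. (4); the regression and phase-weighting details that the
# text leaves open are filled in illustratively.
import numpy as np

def fuse_space_dominated(F0, C0, Cp, mask, wgt):
    """Center-pixel prediction F(t_p) for a spatially varying window.

    F0, C0 : w x w fine / coarse windows at base time t0
    Cp     : w x w coarse window at prediction time t_p
    mask   : boolean similar-pixel mask; wgt : their normalized weights
    """
    c = F0.shape[0] // 2
    # Linear regression of fine on coarse over the similar pixels gives the
    # window's conversion coefficient a_{w/2} (needs >= 2 similar pixels).
    a_w2 = np.polyfit(C0[mask], F0[mask], 1)[0]
    # Base fine value plus the weighted, regression-scaled coarse change.
    return F0[c, c] + np.sum(wgt * a_w2 * (Cp - C0))

# With two base times Tm and Tn, each base time yields such a prediction; the
# method then combines them with temporal-phase weights into the final result.
```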
It should be understood that parts of the specification not set forth in detail are well within the prior art.
It should be understood that the above description of the preferred embodiments is given for clarity and not for any purpose of limitation, and that various changes, substitutions and alterations can be made herein without departing from the spirit and scope of the invention as defined by the appended claims.

Claims (5)

1. A remote sensing data space-time quantitative fusion method considering non-local characteristics and space-time changes is characterized by comprising the following steps:
step 1: inputting a pair of high and low spatial resolution images at time Tm and a pair of high and low spatial resolution images at time Tn, and preprocessing the input images, wherein the preprocessing comprises reprojection, resampling or clipping;
step 2: for the time-Tm and the time-Tn high/low spatial resolution image pairs respectively, screening similar pixels using an empirical formula in combination with the temporal difference $T_{ijk}$ and the spectral difference $S_{ijk}$;
step 3: based on non-local filtering, calculating the weight of each similar pixel from the similarity of the neighborhood matrix centered on that similar pixel and from its relative distance to the central pixel;
step 4: taking temporal change and spatial change into account, fusing regions dominated by temporal change in the classical-algorithm mode, and fusing regions dominated by spatial change in a time-weighted mode.
2. The method for spatiotemporal quantitative fusion of remote sensing data considering non-local characteristics and spatiotemporal changes according to claim 1, characterized in that the empirical formula in step 2 is:
$$\left|\bar{F}(x_i, y_j, B) - \bar{F}(x_{w/2}, y_{w/2}, B)\right| \le d \times 2^{\bar{F}(x_{w/2},\, y_{w/2},\, B)} \qquad (1)$$
where $\bar{F}(x_i, y_j, B)$ is the reflectance of a neighboring pixel in the base-time high-spatial-resolution data, $\bar{F}(x_{w/2}, y_{w/2}, B)$ is the reflectance of the central pixel, $B$ is the band number, and $d$ is a free parameter.
3. The method for spatiotemporal quantitative fusion of remote sensing data considering non-local characteristics and spatiotemporal changes according to claim 2, characterized in that: the screening process is performed over a small window, i.e. the comparison uses local mean values.
4. The method for spatiotemporal quantitative fusion of remote sensing data considering non-local characteristics and spatiotemporal changes according to claim 2, characterized in that: step 3 is implemented by calculating, based on non-local filtering, the weight of each similar pixel with a matrix-based method, performing the spatial filtering in a first-order form, and adding a relative-distance term, as follows:
$$W = \exp\left(-\frac{\bar{S} + \bar{T}}{h}\right) \times \frac{1}{D} \qquad (2)$$
where $\bar{S}$ and $\bar{T}$ are the means of the spectral difference and the temporal difference over the neighborhood matrix, respectively; the filtering parameter $h$ and the matrix window are tuned experimentally, with the optimal range of $h$ being [0.001, 0.1]; $D$ is the relative distance between the neighboring pixel $(x_i, y_j)$ and the central pixel.
5. The method for spatiotemporal quantitative fusion of remote sensing data considering non-local characteristics and spatiotemporal variations according to claim 2, 3 or 4, characterized in that:
in step 4, the regions of temporal change are fused with the classical algorithm, using the formula:
$$F(x_i, y_j, t_k) = \sum_{i=1}^{w} \sum_{j=1}^{w} \sum_{k=1}^{n} W_{ijk} \times \left( F(x_i, y_j, t_0) + C(x_i, y_j, t_k) - C(x_i, y_j, t_0) \right) \qquad (3)$$
where $F$ and $C$ denote the reflectance of the high-spatial-resolution data and the high-temporal-resolution data respectively, $(x_i, y_j)$ is a given pixel position in both datasets, $t_0$ is the acquisition time of the base data, $t_k$ is the prediction time, $w$ is the search-window size, and $n$ is the number of input image pairs;
the regions of spatial change in step 4 are fused in a time-weighted mode, using the formula:
$$F(x_{w/2}, y_{w/2}, t_p) = F(x, y, t_0) + \sum w(x, y) \times a_{w/2} \times \left( C(x, y, t_p) - C(x, y, t_0) \right) \qquad (4)$$

wherein the linear regression coefficient $a_{w/2}$ of the similar pixels in the moving window is calculated, the temporal-phase weight of each base time is then derived, and the final fusion result is obtained by weighting.
CN201510087994.XA 2015-02-26 2015-02-26 Remote sensing data space-time quantitative fusion method considering non-local characteristics and space-time changes Active CN104637027B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510087994.XA CN104637027B (en) 2015-02-26 2015-02-26 Remote sensing data space-time quantitative fusion method considering non-local characteristics and space-time changes

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201510087994.XA CN104637027B (en) 2015-02-26 2015-02-26 Remote sensing data space-time quantitative fusion method considering non-local characteristics and space-time changes

Publications (2)

Publication Number Publication Date
CN104637027A 2015-05-20
CN104637027B CN104637027B (en) 2017-07-11

Family

ID=53215736

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510087994.XA Active CN104637027B (en) 2015-02-26 2015-02-26 Remote sensing data space-time quantitative fusion method considering non-local characteristics and space-time changes

Country Status (1)

Country Link
CN (1) CN104637027B (en)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105184076A (en) * 2015-09-02 2015-12-23 安徽大学 Space-time integrated fusion method for remote sensing earth surface temperature data
CN110503137A (en) * 2019-07-29 2019-11-26 电子科技大学 Based on the determination method of the remote sensing image temporal-spatial fusion base image pair of mixing together
CN112419198A (en) * 2020-11-27 2021-02-26 中国矿业大学 Non-local mean weighting method for SAR interferogram filtering
CN112819697A (en) * 2021-02-04 2021-05-18 北京师范大学 Remote sensing image space-time fusion method and system
CN113702305A (en) * 2021-08-17 2021-11-26 燕山大学 Gas concentration linear measurement method based on self-adaptive differential absorption spectrum technology
CN114301905A (en) * 2020-09-23 2022-04-08 华为技术有限公司 Resolution conversion method and device

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103177431A (en) * 2012-12-26 2013-06-26 中国科学院遥感与数字地球研究所 Method of spatial-temporal fusion for multi-source remote sensing data
CN103983360A (en) * 2014-05-30 2014-08-13 中国科学院遥感与数字地球研究所 Land surface temperature (LST) inversion method based on HJ-1B IRS satellite data
CN104360040A (en) * 2014-11-07 2015-02-18 河海大学 Remote sensing soil moisture content monitoring method based on STARFM fusion technology

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103177431A (en) * 2012-12-26 2013-06-26 中国科学院遥感与数字地球研究所 Method of spatial-temporal fusion for multi-source remote sensing data
CN103983360A (en) * 2014-05-30 2014-08-13 中国科学院遥感与数字地球研究所 Land surface temperature (LST) inversion method based on HJ-1B IRS satellite data
CN104360040A (en) * 2014-11-07 2015-02-18 河海大学 Remote sensing soil moisture content monitoring method based on STARFM fusion technology

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
沈焕锋等 (SHEN Huanfeng et al.): "一种顾及影像纹理特性的自适应分辨率增强算法" ("An adaptive resolution enhancement algorithm accounting for image texture characteristics"), 《遥感学报》 (Journal of Remote Sensing) *

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105184076A (en) * 2015-09-02 2015-12-23 安徽大学 Space-time integrated fusion method for remote sensing earth surface temperature data
CN105184076B (en) * 2015-09-02 2017-11-17 安徽大学 Space-time integrated fusion method for remote sensing earth surface temperature data
CN110503137A (en) * 2019-07-29 2019-11-26 电子科技大学 Based on the determination method of the remote sensing image temporal-spatial fusion base image pair of mixing together
CN110503137B (en) * 2019-07-29 2022-03-15 电子科技大学 Determination method of remote sensing image space-time fusion basic image pair based on cross fusion
CN114301905A (en) * 2020-09-23 2022-04-08 华为技术有限公司 Resolution conversion method and device
CN114301905B (en) * 2020-09-23 2023-04-04 华为技术有限公司 Resolution conversion method and device
CN112419198A (en) * 2020-11-27 2021-02-26 中国矿业大学 Non-local mean weighting method for SAR interferogram filtering
CN112419198B (en) * 2020-11-27 2024-02-02 中国矿业大学 Non-local mean weighting method for SAR interferogram filtering
CN112819697A (en) * 2021-02-04 2021-05-18 北京师范大学 Remote sensing image space-time fusion method and system
CN113702305A (en) * 2021-08-17 2021-11-26 燕山大学 Gas concentration linear measurement method based on self-adaptive differential absorption spectrum technology
CN113702305B (en) * 2021-08-17 2022-07-15 燕山大学 Gas concentration linear measurement method based on self-adaptive differential absorption spectrum technology

Also Published As

Publication number Publication date
CN104637027B (en) 2017-07-11

Similar Documents

Publication Publication Date Title
CN104637027B (en) Remote sensing data space-time quantitative fusion method considering non-local characteristics and space-time changes
Wang et al. Fusion of Landsat 8 OLI and Sentinel-2 MSI data
Pardo-Igúzquiza et al. Downscaling cokriging for image sharpening
Wang et al. Enhancing spatio-temporal fusion of MODIS and Landsat data by incorporating 250 m MODIS data
CN110969577A (en) Video super-resolution reconstruction method based on deep double attention network
CN102763134B (en) For the parameter interpolate of high dynamic range video tone mapping
Pardo-Iguzquiza et al. Image fusion by spatially adaptive filtering using downscaling cokriging
EP3598387B1 (en) Learning method and program
CN112819697B (en) Remote sensing image space-time fusion method and system
Zhang et al. Multi-focus image fusion based on robust principal component analysis and pulse-coupled neural network
CN102016915A (en) Method and apparatus for super-resolution of images
CN111612489A (en) Order quantity prediction method and device and electronic equipment
Al‐Naji et al. Quality index evaluation of videos based on fuzzy interface system
CN111369483B (en) Method for generating high-spatial-temporal-resolution remote sensing data by fusing multi-source remote sensing data
Zhang et al. An object-based spatiotemporal fusion model for remote sensing images
Sun et al. Unsupervised 3D tensor subspace decomposition network for spatial-temporal-spectral fusion of hyperspectral and multispectral images
Peng et al. Geographically weighted spatial unmixing for spatiotemporal fusion
Qu et al. Fusion of hyperspectral and panchromatic images using an average filter and a guided filter
CN109840539B (en) Remote sensing space-time data fusion method based on ground block pattern spots
Kwan Image resolution enhancement for remote sensing applications
CN105160630A (en) Optical super-resolution image reconstruction method
CN111767679A (en) Method and device for processing time-varying vector field data
CN114758282B (en) Video prediction method based on time sequence correction convolution
CN111179171A (en) Image super-resolution reconstruction method based on residual module and attention mechanism
Luo et al. Hierarchical Neural Operator Transformer with Learnable Frequency-aware Loss Prior for Arbitrary-scale Super-resolution

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant