CN110503137A - Determination method of remote sensing image space-time fusion basic image pair based on cross fusion - Google Patents

Determination method of remote sensing image space-time fusion basic image pair based on cross fusion

Info

Publication number
CN110503137A
Authority
CN
China
Prior art keywords
image
value
cross fusion
candidate
pair
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201910710592.9A
Other languages
Chinese (zh)
Other versions
CN110503137B (en)
Inventor
曹入尹
陈洋
周纪
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
University of Electronic Science and Technology of China
Original Assignee
University of Electronic Science and Technology of China
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by University of Electronic Science and Technology of China filed Critical University of Electronic Science and Technology of China
Priority to CN201910710592.9A priority Critical patent/CN110503137B/en
Publication of CN110503137A publication Critical patent/CN110503137A/en
Application granted Critical
Publication of CN110503137B publication Critical patent/CN110503137B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/25 Fusion techniques
    • G06F18/251 Fusion techniques of input or preprocessed data

Landscapes

  • Engineering & Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Image Processing (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a method for determining the base image pair for remote sensing image spatiotemporal fusion based on cross fusion, belonging to the field of digital image processing. Considering as many potential multi-year input image pairs as possible, the method applies the similarity criterion and the consistency criterion simultaneously when selecting the base image pair, verifies the consistency between MODIS and Landsat images by means of cross fusion, determines the optimal input image pair, and obtains more accurate and stable fusion results. The method overcomes the limitations of the prior art, which uses limited input data and considers only the similarity criterion, and provides a highly operable, highly automated method for spatiotemporal data fusion in practical applications.

Description

Determination method of remote sensing image space-time fusion basic image pair based on cross fusion
Technical field
The invention belongs to the field of digital image processing, and in particular relates to the processing and fusion of remote sensing image data.
Background art
The normalized difference vegetation index (NDVI) is one of the important data sources for many ecological and environmental applications of remote sensing. Long-term NDVI time series describe multi-year variations in vegetation greenness and growth characteristics, and have been widely used for detecting vegetation phenological change, estimating productivity, and monitoring land cover and land use change. At present, several satellite sensors, such as AVHRR, MODIS and SPOT VGT, provide long-term NDVI products. However, the spatial resolution of these NDVI time series products is low, with grid cell sizes ranging from several hundred metres for MODIS to several kilometres for AVHRR. Such coarse spatial resolution greatly reduces the accuracy of applications such as detecting phenological change from NDVI time series and limits their use in highly heterogeneous regions. It is therefore important to improve the spatial resolution of the existing high-temporal-resolution NDVI time series in order to obtain NDVI time series with both high spatial and high temporal resolution.
Owing to sensor design constraints and physical laws, spatial and temporal resolution cannot both be arbitrarily high. At present no single satellite sensor can provide data with both high temporal and high spatial resolution; each sensor satisfies only one of the two requirements. To overcome this limitation, multiple data sources can be combined to improve spatiotemporal resolution. Researchers have therefore proposed spatiotemporal fusion techniques, which reconstruct high-spatiotemporal-resolution NDVI time series by fusing, through technical means, images of high temporal but low spatial resolution (for example, MODIS products) with images of high spatial but low temporal resolution (for example, Landsat products). More than 50 spatiotemporal fusion algorithms have been proposed so far. They can be broadly divided into four classes: methods based on mixed-pixel unmixing, methods based on weighted combination functions, methods based on machine learning, and hybrid methods.
Spatiotemporal data fusion algorithms usually rely on the spatial information provided by a fine image (i.e., a high-spatial-resolution image) to help generate the fused image on the prediction date. As input, spatiotemporal fusion therefore requires at least one pair of fine and coarse images (a coarse image being a low-spatial-resolution image) acquired on a base date, so the choice of the base image pair determines, to a large extent, the accuracy of spatiotemporal data fusion. Two criteria currently exist for selecting the base image pair: the similarity criterion and the consistency criterion. Taking MODIS-Landsat fusion as an example, the similarity criterion requires that the images on the base date be similar to those on the prediction date. To satisfy this criterion, two strategies have been used in previous studies: the "nearest date" (ND) strategy and the "highest correlation" (HC) strategy. The ND strategy selects the base pair whose date is closest to the prediction date; the HC strategy selects the base pair whose MODIS image has the highest correlation with the MODIS image on the prediction date. The consistency criterion is mainly concerned with whether the MODIS and Landsat images on the base date are consistent in their radiometric and geometric characteristics; inconsistency between MODIS and Landsat images degrades the accuracy of spatiotemporal data fusion. At present, however, no method automatically determines the input base image pair while considering the similarity and consistency criteria simultaneously. Because MODIS images are available on both the base date and the prediction date, which makes similarity easy to estimate, existing data fusion studies mostly use only the similarity criterion. Consistency, by contrast, is difficult to quantify, so existing methods rarely apply the consistency criterion in practice, and some studies only provide guidance for this criterion. For example, because the observation-angle differences between MODIS and Landsat are largely corrected in the bidirectional-reflectance-adjusted reflectance product MCD43A4, MCD43A4 is more suitable for MODIS-Landsat data fusion than the directional reflectance product MOD09GA. Even corrected data, however, can still be inconsistent for various other reasons, and current methods cannot assess these problems effectively. A method that automatically determines the optimal base image pair is therefore a crucial step towards the practical application of spatiotemporal data fusion technology.
Summary of the invention
The invention proposes a cross-fusion method to solve the problem of determining the base image pair for remote sensing image spatiotemporal fusion. Among as many potential multi-year input image pairs as possible, the method considers the similarity criterion and the consistency criterion simultaneously when selecting the base image pair, verifies the consistency between MODIS and Landsat images by means of cross fusion, determines the optimal input image pair, and thereby obtains more accurate and stable fusion results. The method overcomes the limitations of the prior art, which uses limited input data and considers only the similarity criterion, and provides a highly operable, highly automated approach for spatiotemporal data fusion in practical applications.
The technical solution adopted by the present invention includes the following steps:
(1) Select several base image pairs as candidate image pairs according to the similarity criterion between MODIS images.
To quantify the similarity between the MODIS image at the prediction time point R_p and the MODIS image at a given time point R_i within the multi-year archive, the reflectance difference diff(R_{p_i}) and the linear correlation coefficient cor(R_{p_i}) between the two MODIS images are calculated.
Here R_{p_i} denotes the correspondence between the prediction time point R_p and the i-th given time point R_i; n is the number of MODIS pixels in the image and j indexes the j-th pixel, so that R_p(j) is the value of the j-th pixel of the MODIS image at the prediction time point R_p and R_i(j) is the value of the j-th pixel of the MODIS image at the given time point R_i; Var(·) and Cov(·) denote the variance and covariance estimators. Assuming that the total number of given-time image pairs over the years is m, for the i-th image pair (i = 1, ..., m), diff(R_{p_i}) and cor(R_{p_i}) are combined to obtain the similarity index SI_i of the i-th image pair with respect to the prediction time point R_p.
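A plausible written-out form of these quantities, consistent with the definitions above, is given below; the exact expression combining the two measures into SI_i is an assumption here, and any combination that grows with cor and shrinks with diff fits the description that follows:

$$
\mathrm{diff}(R_{p\_i}) = \frac{1}{n}\sum_{j=1}^{n}\bigl|R_p(j)-R_i(j)\bigr|,\qquad
\mathrm{cor}(R_{p\_i}) = \frac{\mathrm{Cov}(R_p,R_i)}{\sqrt{\mathrm{Var}(R_p)\,\mathrm{Var}(R_i)}},\qquad
SI_i = \frac{\mathrm{cor}(R_{p\_i})}{\mathrm{diff}(R_{p\_i})}.
$$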
It follows that the smaller the reflectance difference and the larger the correlation coefficient, the larger the value of SI_i. In the invention, the 5 image pairs with the largest SI_i values are selected as candidate image pairs.
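A minimal sketch of this candidate-selection step, assuming the MODIS images are co-registered NumPy arrays and using the ratio form of SI_i sketched above (an assumption, not the patented formula):

```python
import numpy as np

def similarity_index(modis_pred, modis_cand):
    """SI between the MODIS image on the prediction date and one archived date.

    Both inputs are co-registered 2-D arrays of equal shape. diff is the mean
    absolute reflectance difference, cor the Pearson correlation; their ratio
    is one plausible way to combine them into SI.
    """
    p, c = modis_pred.ravel(), modis_cand.ravel()
    diff = np.mean(np.abs(p - c))               # reflectance difference diff(R_p_i)
    cor = np.corrcoef(p, c)[0, 1]               # linear correlation cor(R_p_i)
    return cor / max(diff, 1e-12)               # larger cor, smaller diff -> larger SI

def select_candidates(modis_pred, modis_archive, k=5):
    """Rank the m archived MODIS images against the prediction date and keep the top k."""
    scores = [similarity_index(modis_pred, img) for img in modis_archive]
    return list(np.argsort(scores)[::-1][:k])   # indices of the k candidate base pairs
```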
(2) Perform cross-fusion processing on the candidate image pairs.
The five candidate image pairs are written in a common form (C_M, F_M), where C_M is the M-th coarse image and F_M is the M-th fine image, M = 1, 2, ..., 5. Cross fusion is carried out among these 5 candidate pairs. A pair (C_t, F_t) is chosen as the prediction-validation image pair, and each of the 4 image pairs other than (C_t, F_t) is used to obtain a fused image at time point t, where t = 1, 2, ..., 5. For a pixel (x, y) of the validation image F_t, the true reflectance value is F_t(x, y), and the predictions obtained by applying the FSDAF method with each of the remaining image pairs are FSDAF(F_N(x, y) → F_t(x, y)), where {N | N ∈ {1, 2, ..., 5} and N ≠ t}. The final prediction of F_t(x, y), denoted F̂_t(x, y), is computed as a weighted combination of these four predictions.
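Written out under the notation above, this weighted combination can be expressed as follows (a reconstruction consistent with the coefficient constraint (3) that follows):

$$
\hat{F}_t(x,y)=\sum_{N\in\{1,2,\dots,5\},\,N\neq t} a_{Nt}\,\mathrm{FSDAF}\bigl(F_N(x,y)\rightarrow F_t(x,y)\bigr),
$$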
where ∑_{N∈{1,2,…,5}, N≠t} a_Nt = 1.0 and 0.0 ≤ a_Nt ≤ 1.0    (3)
In the above formula, a_Nt denotes the contribution coefficient, at time point t, of the fusion result obtained with the N-th image pair; for each target t there are therefore 4 regression coefficients to determine. To estimate their values, the invention takes all pixel values in a 5 × 5 local window centred on pixel (x, y) as the input of the equations and solves them by least squares over that window.
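A plausible written-out form of this constrained least-squares problem, with W(x, y) denoting the 5 × 5 window (the notation is an assumption based on the surrounding definitions):

$$
\min_{\{a_{Nt}\}}\;\sum_{(u,v)\in W(x,y)}\Bigl[F_t(u,v)-\sum_{N\neq t} a_{Nt}\,\mathrm{FSDAF}\bigl(F_N(u,v)\rightarrow F_t(u,v)\bigr)\Bigr]^2
\quad\text{subject to}\quad \sum_{N\neq t} a_{Nt}=1,\;\;0\le a_{Nt}\le 1.
$$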
The four regression parameters are constrained to lie between 0 and 1 and to sum to 1. Consequently, if a prediction FSDAF(F_N(x, y) → F_t(x, y)) deviates strongly from the true value, its regression parameter a_Nt tends to be small. Since all candidate image pairs are selected according to the similarity criterion, an inconsistency between the MODIS and Landsat images of a pair is the likely cause of such a large prediction error at pixel (x, y). As t takes the values 1, 2, 3, 4, 5 in turn, the admissible values of N change accordingly, so all regression parameters a_Nt (t = 1, ..., 5; N ∈ {1, ..., 5} and N ≠ t) obtained by cross fusion with each of the five candidate image pairs as validation target can be estimated. Arranging the a_Nt values in a matrix gives:
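A reconstruction of this matrix arrangement, assuming rows are indexed by the validation target t and columns by the candidate pair N used for prediction (the diagonal entries do not exist):

$$
\begin{pmatrix}
  -      & a_{21} & a_{31} & a_{41} & a_{51}\\
  a_{12} & -      & a_{32} & a_{42} & a_{52}\\
  a_{13} & a_{23} & -      & a_{43} & a_{53}\\
  a_{14} & a_{24} & a_{34} & -      & a_{54}\\
  a_{15} & a_{25} & a_{35} & a_{45} & -
\end{pmatrix}
\qquad (5)
$$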
According to formula (3), each row of the above matrix sums to 1.0. The values in each column of formula (5) are the regression parameters obtained when one candidate image pair is used to generate the other candidate pairs; for example, a_12, a_13, a_14 and a_15 are the coefficients at pixel (x, y) when (C_1, F_1) is used to generate (C_2, F_2), (C_3, F_3), (C_4, F_4) and (C_5, F_5), respectively. As noted above, the smaller the parameters of a given image pair, the larger its prediction errors, which is most likely caused by inconsistency between the MODIS and Landsat images of that pair. The mean value ā_M of each column is therefore used to represent the contribution of candidate image pair (C_M, F_M) to the final fused prediction F̂_p at the prediction time R_p.
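A reconstruction of this final combination, assuming the column means ā_M are used directly as weights (a normalisation of the weights by their sum may equally be intended):

$$
\hat{F}_p(x,y)=\sum_{M=1}^{5}\bar{a}_M\,\mathrm{FSDAF}\bigl(F_M(x,y)\rightarrow F_p(x,y)\bigr),
\qquad
\bar{a}_M=\frac{1}{4}\sum_{t\neq M} a_{Mt}.
$$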
Here FSDAF(F_M(x, y) → F_p(x, y)) is the predicted value of pixel (x, y) of the image at the prediction time R_p obtained by FSDAF fusion based on the M-th candidate image pair.
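A compact sketch of the per-window coefficient estimation and the final weighting, assuming the FSDAF predictions have already been computed as arrays; the constrained least squares is solved here with SciPy's SLSQP optimiser, which is an implementation choice rather than something specified in the patent:

```python
import numpy as np
from scipy.optimize import minimize

def window_coefficients(target_patch, pred_patches):
    """Estimate the four a_Nt for one 5x5 window of validation pair t.

    target_patch : (5, 5) array of true fine-image values F_t in the window
    pred_patches : list of four (5, 5) FSDAF predictions of F_t from the other pairs
    Returns four coefficients in [0, 1] that sum to 1.
    """
    y = target_patch.ravel()
    X = np.stack([p.ravel() for p in pred_patches], axis=1)   # (25, 4) design matrix

    def sse(a):                                               # squared prediction error
        return float(np.sum((y - X @ a) ** 2))

    res = minimize(sse, np.full(4, 0.25), method="SLSQP",
                   bounds=[(0.0, 1.0)] * 4,
                   constraints=[{"type": "eq", "fun": lambda a: np.sum(a) - 1.0}])
    return res.x

def final_prediction(column_means, fsdaf_to_pred):
    """Weight the five FSDAF predictions for the prediction date R_p by the column means.

    column_means  : five mean coefficients a_bar_M (one per candidate pair)
    fsdaf_to_pred : list of five arrays FSDAF(F_M -> F_p)
    Normalising the weights by their sum is an assumption.
    """
    w = np.asarray(column_means, dtype=float)
    w = w / w.sum()
    return sum(wi * img for wi, img in zip(w, fsdaf_to_pred))
```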
The cross-fusion method of the invention thus takes both the similarity and the consistency criteria into account. The similarity-based selection considers the annual growth cycle of vegetation and chooses candidate image pairs from multi-year imagery. All candidate image pairs form the basis of the spatiotemporal cross fusion. Since true Landsat images are available on all candidate dates, these images can serve as validation images for quantitatively evaluating the performance of the cross fusion. During cross fusion, using an "inconsistent" candidate image pair to perform data fusion on the other candidate dates produces large fusion errors, so the coefficients obtained by cross fusion quantitatively describe the consistency criterion. Not one but five base image pairs are selected in the cross-fusion process; when predicting different pixels, the fusion results of the different candidate pairs may receive different coefficients, representing different contributions. Compared with other existing selection strategies for determining the optimal input, the NDVI images fused by the cross-fusion method are more accurate and more stable.
Brief description of the drawings
Fig. 1 shows the quantitative evaluation (AAD and correlation coefficient) of data fusion with different base-image-pair selection criteria in different vegetation types. The four selection criteria are the similarity criterion SI, the highest-correlation criterion HC, the minimum-difference criterion Diff and the nearest-date criterion ND.
Fig. 2 shows the quantitative comparison (AAD and correlation coefficient) between the cross-fusion method and the FSDAF fusion results of the five individual candidate image pairs (SI_1, SI_2, SI_3, SI_4, SI_5) in different vegetation types.
Fig. 3 shows the quantitative comparison (AAD and correlation coefficient) between the cross-fusion method and the FSDAF fusion results of the five candidate image pairs (denoted SI_1, SI_2, SI_3, SI_4 and SI_5) in a simulated experiment with inconsistency between MODIS and Landsat. The consistency between MODIS and Landsat is modified by multiplying the original values by random variation coefficients, simulating image pairs with different degrees of inconsistency; the simulated image pairs are indicated in bold.
Fig. 4(A) shows the true NDVI image of the deciduous forest site on 2007/158 together with the fused images obtained by the cross-fusion method and by FSDAF under the SI criterion (i.e., SI_1); pixels affected by cloud noise in SI_1 are highlighted with black line segments. Fig. 4(B) shows the estimated parameters of the five candidate image pairs in the cross-fusion method; the mean value of each image is given in parentheses above it, and the pixels affected by cloud noise are highlighted with black line segments. The last panel shows the quantitative evaluation of the fusion results of all candidate image pairs.
Detailed description of the embodiments
(1) Sample data and preprocessing
The data sources used in this embodiment are the MODIS NDVI product (temporal resolution 8 days, spatial resolution 500 m) and Landsat NDVI data (temporal resolution 16 days, spatial resolution 30 m). The fusion model used for cross fusion in this embodiment is the Flexible Spatiotemporal Data Fusion (FSDAF) method. According to the requirements of FSDAF, the data sources were preprocessed as follows: radiometric calibration and atmospheric correction of the Landsat images with the Landsat Ecosystem Disturbance Adaptive Processing System (LEDAPS); registration of the MODIS images to the Landsat images and resampling of the MODIS NDVI images with the nearest-neighbour algorithm to match the spatial resolution of Landsat; automatic detection of cloud contamination in each Landsat image with the Fmask cloud-masking algorithm, using only images with cloud cover below 1%; and finally reconstruction of the MODIS NDVI time series with an iterative Savitzky-Golay (SG) filter to remove noise.
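A minimal sketch of the resampling and Savitzky-Golay smoothing steps of this preprocessing chain, assuming the MODIS NDVI time series is stored as a (time, rows, cols) array; LEDAPS correction and Fmask cloud masking are external tools and are not reproduced here, and the window length and iteration count are illustrative choices:

```python
import numpy as np
from scipy.ndimage import zoom
from scipy.signal import savgol_filter

def resample_nearest(modis_ndvi, factor):
    """Nearest-neighbour resampling of one MODIS NDVI image to the Landsat grid.

    factor is the ratio of MODIS to Landsat cell size (roughly 500/30);
    order=0 selects nearest-neighbour interpolation.
    """
    return zoom(modis_ndvi, factor, order=0)

def smooth_series(ndvi_series, window=7, polyorder=3, n_iter=3):
    """Iterative Savitzky-Golay reconstruction of a MODIS NDVI time series.

    A simple upper-envelope iteration: after each filtering pass, values below
    the fitted curve are replaced by the fit, pushing the series towards the
    less noisy upper envelope.
    """
    series = ndvi_series.astype(float).copy()
    for _ in range(n_iter):
        fitted = savgol_filter(series, window_length=window, polyorder=polyorder, axis=0)
        series = np.maximum(series, fitted)
    return savgol_filter(series, window_length=window, polyorder=polyorder, axis=0)
```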
(2) Experiments and example results
The cross-fusion method of the invention was tested in regions with different vegetation cover and vegetation types, namely deciduous forest, evergreen forest, double-cropping farmland and alpine grassland. A subset of each region (12 km × 12 km) was used as the standard test area. After data preprocessing, for evaluation, NDVI fused images were generated with both the cross-fusion method and the existing FSDAF method on prediction dates for which true Landsat NDVI images exist, and the true images and the fused images were analysed statistically. The statistical analysis followed this scheme: (1) comparison of the quantitative evaluation of data fusion with different base-image-pair selection criteria against the cross-fusion results in different vegetation types; (2) comparison of the cross-fusion method with the FSDAF fusion results of the 5 individual candidate image pairs in different vegetation types; (3) comparison of the cross-fusion method with the FSDAF fusion results of the five candidate image pairs in a simulated experiment with inconsistency between MODIS and Landsat; (4) comparison of the sensitivity of cross fusion and of single base image pairs to noise. The implementation and illustrative results are described below.
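The comparisons below report AAD (read here as the average absolute difference, an expansion assumed from context) and the correlation coefficient between fused and true NDVI images; assuming both images are arrays of equal shape, the two statistics can be computed as:

```python
import numpy as np

def aad(fused, truth):
    """Average absolute difference between a fused NDVI image and the true Landsat NDVI."""
    return float(np.mean(np.abs(fused - truth)))

def correlation(fused, truth):
    """Pearson correlation coefficient between the fused and the true image."""
    return float(np.corrcoef(fused.ravel(), truth.ravel())[0, 1])
```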
Fig. 1 shows the mean AAD and correlation coefficient of the quantitative evaluation of data fusion with the cross-fusion method and with the different base-image-pair selection criteria in different vegetation types. In the quantitative evaluation the cross-fusion method has the lowest AAD and the highest correlation coefficient of all selection criteria. The SI selection criterion, which combines the highest-correlation and nearest-date strategies, gives the second-best results. These two results show that combining the highest-correlation and nearest-date strategies is effective, and that the cross-fusion method yields a more stable and better fusion result.
The improvement achieved by the cross-fusion method of the invention is due not only to the new similarity criterion but also to the use of five candidate image pairs for cross fusion. To verify the effectiveness of cross fusion, the invention compares the accuracy of spatiotemporal fusion when cross fusion is used with the accuracy obtained when only one of the five candidate base image pairs is used as the base pair with the FSDAF method. The results in Fig. 2 show that the cross-fusion method has the lowest AAD and the highest correlation coefficient, demonstrating that cross fusion, by taking consistency into account, is an effective means of further improving fusion accuracy and quality.
To better illustrate the ability of the cross-fusion method of the invention to handle inconsistency, inconsistency between MODIS and Landsat images was simulated by multiplying each pixel of the Landsat image corresponding to a MODIS time point by a random number in (0.8, 1.2), and different numbers of candidate base image pairs were made inconsistent in this way. As shown in the table of Fig. 3, the statistics in bold indicate that the inconsistency between MODIS and Landsat greatly reduces the accuracy of spatiotemporal data fusion. The statistics in the table also show that the cross-fusion method of the invention is hardly affected by the simulated inconsistency and maintains high statistical accuracy. More importantly, even when all five candidate base image pairs are inconsistent, the fusion result of the cross-fusion method is still better than that of any single candidate image pair. The results in Fig. 3 show that the cross-fusion method provides an automatic way of determining the base image pair that considers the similarity and consistency criteria simultaneously.
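A sketch of this inconsistency simulation as described above, multiplying each pixel of the Landsat image of a candidate pair by an independent random factor drawn from (0.8, 1.2):

```python
import numpy as np

def simulate_inconsistency(landsat_img, low=0.8, high=1.2, seed=None):
    """Perturb a Landsat image so that it becomes inconsistent with its MODIS counterpart.

    Each pixel is multiplied by a random factor in (low, high), as in the
    simulation experiment of Fig. 3.
    """
    rng = np.random.default_rng(seed)
    factors = rng.uniform(low, high, size=landsat_img.shape)
    return landsat_img * factors
```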
Fig. 4 shows the effect of cloud noise on the cross-fusion method when one of the five candidate base image pairs contains a Landsat image contaminated by cloud. The results show that, compared with the FSDAF image SI_1 fused from the cloud-contaminated base image pair, the fused image obtained by the cross-fusion method is much less affected by cloud contamination. The cross-fusion method can therefore avoid the influence of noise such as cloud.

Claims (1)

1. A method for determining the base image pair for remote sensing image spatiotemporal fusion based on cross fusion, characterized by comprising the following steps:
S1. Select several base image pairs as candidate image pairs according to the similarity criterion between MODIS images:
quantify the similarity between the MODIS image at the prediction time point R_p and the MODIS image at the i-th given time point R_i within the multi-year archive by calculating the reflectance difference diff(R_{p_i}) and the linear correlation coefficient cor(R_{p_i}) between the two MODIS images,
wherein R_{p_i} denotes the correspondence between the prediction time point R_p and the i-th given time point R_i, n is the number of MODIS pixels in the image, j denotes the j-th pixel in the image, R_p(j) is the value of the j-th pixel of the MODIS image at the prediction time point R_p, R_i(j) is the value of the j-th pixel of the MODIS image at the given time point R_i, and Var(·) and Cov(·) denote the variance and covariance estimators; the total number of given-time image pairs over the years is m, and for the i-th image pair (i = 1, ..., m), diff(R_{p_i}) and cor(R_{p_i}) are combined to obtain the similarity index SI_i of the i-th image pair with respect to the prediction time point R_p;
when the reflectance difference is smaller and the correlation coefficient is larger, the value of SI_i is larger; the 5 image pairs with the largest SI_i values are selected as candidate image pairs;
S2. Perform cross-fusion processing on the candidate image pairs:
the five candidate image pairs are written in a common form (C_M, F_M), where C_M is the M-th coarse image and F_M is the M-th fine image, M = 1, 2, ..., 5; cross fusion is carried out among the 5 candidate image pairs; (C_t, F_t) is chosen as the prediction-validation image pair, and each of the 4 image pairs other than (C_t, F_t) is used to obtain a fused image at time point t, where t = 1, 2, ..., 5; for a pixel (x, y) of the validation image F_t, the true reflectance value is F_t(x, y), and the predictions obtained by applying the FSDAF method with each of the remaining image pairs are FSDAF(F_N(x, y) → F_t(x, y)), where {N | N ∈ {1, 2, ..., 5} and N ≠ t}; the final prediction F̂_t(x, y) of F_t(x, y) is calculated as a weighted combination of these predictions,
where ∑_{N∈{1,2,…,5}, N≠t} a_Nt = 1.0 and 0.0 ≤ a_Nt ≤ 1.0    (3);
in the above formula, a_Nt denotes the contribution coefficient, at time point t, of the fusion result obtained with the N-th image pair, so that in total 4 regression coefficients have to be determined for each target t; to estimate their values, all pixel values within a 5 × 5 local window centred on pixel (x, y) are used as the input of the equations, which are solved by least squares;
the four regression parameter values are constrained to lie between 0 and 1 and to sum to 1; therefore, if the predicted value FSDAF(F_N(x, y) → F_t(x, y)) deviates strongly from the true value, the corresponding regression parameter a_Nt is small; as t takes the values 1, 2, 3, 4, 5 in turn, the admissible values of N change accordingly, and all regression parameters a_Nt (t = 1, ..., 5; N ∈ {1, ..., 5} and N ≠ t) obtained by cross fusion with the t-th of the five candidate image pairs as validation target are estimated; the a_Nt values are arranged in a matrix;
according to formula (3), each row of the matrix sums to 1.0; the values in each column of formula (5) are the regression parameters obtained when one of the candidate image pairs is used to generate the other candidate pairs, i.e. a_12, a_13, a_14 and a_15 are the coefficients at pixel (x, y) when (C_1, F_1) is used to generate (C_2, F_2), (C_3, F_3), (C_4, F_4) and (C_5, F_5), respectively; the mean value ā_M of each column is used to represent the contribution of candidate image pair (C_M, F_M) to the final fused prediction F̂_p at the prediction time R_p,
wherein FSDAF(F_M(x, y) → F_p(x, y)) is the predicted value of pixel (x, y) of the image at the prediction time R_p obtained by FSDAF fusion based on the M-th candidate image pair.
CN201910710592.9A 2019-07-29 2019-07-29 Determination method of remote sensing image space-time fusion basic image pair based on cross fusion Active CN110503137B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910710592.9A CN110503137B (en) 2019-07-29 2019-07-29 Determination method of remote sensing image space-time fusion basic image pair based on cross fusion

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910710592.9A CN110503137B (en) 2019-07-29 2019-07-29 Determination method of remote sensing image space-time fusion basic image pair based on cross fusion

Publications (2)

Publication Number Publication Date
CN110503137A true CN110503137A (en) 2019-11-26
CN110503137B CN110503137B (en) 2022-03-15

Family

ID=68586774

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910710592.9A Active CN110503137B (en) 2019-07-29 2019-07-29 Determination method of remote sensing image space-time fusion basic image pair based on cross fusion

Country Status (1)

Country Link
CN (1) CN110503137B (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104637027A (en) * 2015-02-26 2015-05-20 武汉大学 Time and space quantitative fusion method for remote sensing data considering nonlocal characteristics and temporal and spatial variation
US20190057245A1 (en) * 2017-08-15 2019-02-21 Regents Of The University Of Minnesota Satellite image classification across multiple resolutions and time using ordering constraint among instances
CN109685108A (en) * 2018-11-23 2019-04-26 电子科技大学 A method of generating high-spatial and temporal resolution NDVI long-term sequence
CN109612588A (en) * 2019-01-02 2019-04-12 中国科学院新疆生态与地理研究所 LST image data prediction technique, device and electronic equipment

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
CAO RY et al.: "A simple method to improve the quality of NDVI time-series data by integrating spatiotemporal information with the Savitzky-Golay filter", ScienceDirect *
TAN ZQ et al.: "Mapping inundation dynamics in a heterogeneous floodplain: Insights from integrating observation and modeling approach", ScienceDirect *
袁周米琪 et al.: "Research on spatio-temporal remote sensing data fusion methods for regions with changing land-surface features", Journal of Beijing Normal University (Natural Science) *

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110909821A (en) * 2019-12-03 2020-03-24 中国农业科学院农业资源与农业区划研究所 Method for carrying out high-space-time resolution vegetation index data fusion based on crop reference curve
CN110909821B (en) * 2019-12-03 2020-07-28 中国农业科学院农业资源与农业区划研究所 Method for carrying out high-space-time resolution vegetation index data fusion based on crop reference curve
CN111798394A (en) * 2020-06-30 2020-10-20 电子科技大学 Remote sensing image cloud pollution removing method based on multi-year time sequence data
CN111798394B (en) * 2020-06-30 2022-10-14 电子科技大学 Remote sensing image cloud pollution removing method based on multi-year time sequence data
CN112102180A (en) * 2020-08-21 2020-12-18 电子科技大学 Cloud identification method based on Landsat optical remote sensing image
CN112102180B (en) * 2020-08-21 2022-10-11 电子科技大学 Cloud identification method based on Landsat optical remote sensing image
CN117197625A (en) * 2023-08-29 2023-12-08 珠江水利委员会珠江水利科学研究院 Remote sensing image space-spectrum fusion method, system, equipment and medium based on correlation analysis
CN117197625B (en) * 2023-08-29 2024-04-05 珠江水利委员会珠江水利科学研究院 Remote sensing image space-spectrum fusion method, system, equipment and medium based on correlation analysis

Also Published As

Publication number Publication date
CN110503137B (en) 2022-03-15

Similar Documents

Publication Publication Date Title
Wu et al. Examining the sensitivity of spatial scale in cellular automata Markov chain simulation of land use change
CN110738252B (en) Space autocorrelation machine learning satellite precipitation data downscaling method and system
CN110503137A (en) Based on the determination method of the remote sensing image temporal-spatial fusion base image pair of mixing together
Haining et al. Spatial data analysis in the social and environmental sciences
Jin et al. Downscaling AMSR-2 soil moisture data with geographically weighted area-to-area regression kriging
CN110186820A (en) Multisource data fusion and environomental pollution source and pollutant distribution analysis method
CN109840553B (en) Extraction method and system of cultivated land crop type, storage medium and electronic equipment
Ling et al. Sub-pixel mapping of remotely sensed imagery with hybrid intra-and inter-pixel dependence
CN107423537B (en) Surface temperature downscaling method based on self-adaptive threshold
Ge Sub-pixel land-cover mapping with improved fraction images upon multiple-point simulation
CN114547017B (en) Meteorological big data fusion method based on deep learning
Li et al. A superresolution land-cover change detection method using remotely sensed images with different spatial resolutions
CN109186774B (en) Surface temperature information acquisition method, device, computer equipment and storage medium
Tang et al. Quantifying the effect of registration error on spatio-temporal fusion
Wang et al. Spatial–spectral radial basis function-based interpolation for Landsat ETM+ SLC-off image gap filling
Abate Built-heritage multi-temporal monitoring through photogrammetry and 2D/3D change detection algorithms
Zhang et al. Spatial domain bridge transfer: An automated paddy rice mapping method with no training data required and decreased image inputs for the large cloudy area
CN109685108A (en) A method of generating high-spatial and temporal resolution NDVI long-term sequence
Peng et al. Geographically weighted spatial unmixing for spatiotemporal fusion
González‐Abad et al. Using explainability to inform statistical downscaling based on deep learning beyond standard validation approaches
Steger et al. Statistical modeling of landslides: landslide susceptibility and beyond
Li et al. Spatial-temporal super-resolution land cover mapping with a local spatial-temporal dependence model
Sihvonen et al. Spectral profile partial least-squares (SP-PLS): Local multivariate pansharpening on spectral profiles
Cao et al. Universal high spatial resolution hyperspectral imaging using hybrid-resolution image fusion
CN113901348A (en) Oncomelania snail distribution influence factor identification and prediction method based on mathematical model

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant