CN111159310B - Extensible image generation space-time fusion method with information gain strategy - Google Patents

Extensible image generation space-time fusion method with information gain strategy

Info

Publication number
CN111159310B
CN111159310B
Authority
CN
China
Prior art keywords
image
information
time
resolution image
predicting
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201911280551.7A
Other languages
Chinese (zh)
Other versions
CN111159310A (en)
Inventor
王力哲
冯如意
陈佳
韩伟
陈小岛
阎继宁
宋维静
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
China University of Geosciences
Original Assignee
China University of Geosciences
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by China University of Geosciences
Priority to CN201911280551.7A
Publication of CN111159310A
Application granted
Publication of CN111159310B
Active legal status
Anticipated expiration legal status

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/20 Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F16/28 Databases characterised by their database models, e.g. relational or object models
    • G06F16/283 Multi-dimensional databases or data warehouses, e.g. MOLAP or ROLAP
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/20 Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F16/26 Visual data mining; Browsing structured data
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/90 Details of database functions independent of the retrieved data types
    • G06F16/95 Retrieval from the web
    • G06F16/951 Indexing; Web crawling techniques
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/90 Details of database functions independent of the retrieved data types
    • G06F16/95 Retrieval from the web
    • G06F16/955 Retrieval from the web using information identifiers, e.g. uniform resource locators [URL]

Landscapes

  • Engineering & Computer Science (AREA)
  • Databases & Information Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Image Processing (AREA)

Abstract

The invention discloses an extensible image generation space-time fusion method with an information gain strategy, which comprises the following steps: simulating the time sequence of images with Cycle-GAN to obtain a plurality of usable data sets and generate multi-stage iterative images; selecting, according to reference information, the multi-stage iterative image that is similar to the image at the prediction time; acquiring the gain information of the multi-stage iterative images and supplying it to the low-resolution image at the prediction time k as spatial information; and acquiring the image information selected by wavelet transformation to predict the spatial information in the space-time fusion. The idea of simulating the time-series process with Cycle-GAN and the adversarial learning method help to reasonably predict time-ordered high-resolution images and generated images, help to generate images containing gain information that the low-resolution image does not contain, and assist in introducing new gain information into the space-time fusion.

Description

Extensible image generation space-time fusion method with information gain strategy
Technical Field
The invention relates to the technical field of image generation space-time fusion algorithms, in particular to an extensible image generation space-time fusion method with an information gain strategy.
Background
Due to cost and technical limitations and differing mission requirements, current satellite sensors mostly acquire remote sensing images in which only a single resolution (spatial or temporal) is high, which cannot meet the requirements of practical applications. Space-time fusion exploits, to the greatest extent, the complementarity of information among different remote sensing data, so that remote sensing data with different spatial and temporal resolutions can be combined by fusion techniques into images with both higher temporal and higher spatial resolution. In practical applications, however, the spatial resolutions of the high- and low-resolution images differ greatly; for example, one MODIS pixel corresponds to hundreds of Landsat pixels, which means that a MODIS image contains less than 1% of the spatial information of the Landsat image of the same scene. Conventional interpolation methods therefore cannot recover enough spatial information, and such a small amount of known information cannot express the correspondence between the high-resolution and low-resolution images, which results in low fusion accuracy.
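The "less than 1%" figure can be checked with a quick back-of-the-envelope calculation; the 500 m and 30 m ground sampling distances assumed below are typical MODIS and Landsat values rather than figures taken from this patent.

```python
# Rough estimate of the MODIS/Landsat resolution gap discussed above.
# 500 m and 30 m are typical ground sampling distances, assumed here for illustration.
modis_gsd = 500.0   # metres per MODIS pixel (500 m bands)
landsat_gsd = 30.0  # metres per Landsat pixel

pixels_per_modis = (modis_gsd / landsat_gsd) ** 2
print(f"Landsat pixels covered by one MODIS pixel: {pixels_per_modis:.0f}")      # ~278
print(f"Fraction of spatial samples kept by MODIS: {1 / pixels_per_modis:.2%}")  # ~0.36%, i.e. < 1%
```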
Disclosure of Invention
Aiming at the technical problems in the related art, the invention provides an extensible image generation space-time fusion method with an information gain strategy, which overcomes the above deficiencies in the prior art.
In order to achieve the technical purpose, the technical scheme of the invention is realized as follows:
a scalable, image-generating spatio-temporal fusion method with information gain strategy, the method comprising the steps of:
s1: adopting the time sequence of the Cycle-GAN analog image to obtain a plurality of available data sets and generating a multi-stage iterative image;
s2: selecting the multi-stage iterative image with the image similar to the image at the time of prediction according to the reference information;
s3: gain information of the multistage iterative image is obtained, and the gain information is sent to a low-resolution image with the prediction time k of the spatial information;
s4: and acquiring the image information selected by wavelet transformation, and predicting the spatial information in space-time fusion.
Further, the step S1 includes the following steps:
S11: setting the number of iterations and acquiring the image information at time k-1 (before) and time k+1 (after) the prediction time k;
S12: taking the images at times k-1 and k+1 as training samples for Cycle-GAN, and selecting the image of one of these times as the generation sample to generate an image;
S13: iteratively obtaining a plurality of remote sensing images.
Further, the step S2 includes the following steps:
S21: calculating the low-resolution image at the prediction time k;
S22: calculating the mutual information between the low-resolution image at the prediction time k and the low-resolution image at time k-1;
S23: calculating the mutual information between the low-resolution image at the prediction time k and the low-resolution image at time k+1;
S24: calculating the high-resolution image at the prediction time k;
S25: calculating the mutual information between the high-resolution image at the prediction time k and the high-resolution image at time k-1;
S26: calculating the mutual information between the high-resolution image at the prediction time k and the high-resolution image at time k+1;
S27: acquiring the images whose mutual information lies within the range defined by these mutual-information values and the factor λ, where λ represents the correspondence of the mutual information between the high and low resolutions and may take an empirical value;
S28: acquiring the image with the maximum information entropy among the selected images and setting it as the selected generated image.
Further, the step S3 includes the following steps:
S31: acquiring the information of the selected generated image and of the low-resolution image, performing a two-level wavelet decomposition on the selected generated image and a two-level wavelet decomposition on the low-resolution image;
S32: acquiring and recombining the second-level decomposition information of the selected generated image and the second-level decomposition information of the low-resolution image.
The invention has the following beneficial effects: on the one hand, the method provides a new idea for predicting spatial information in space-time fusion and a new solution to the difficulty of handling the correspondence caused by the excessively large information gap between the high and low spatial resolutions;
on the other hand, the idea of simulating the time-series process with Cycle-GAN and the adversarial learning method help to reasonably predict time-ordered high-resolution images and generated images, help to generate images containing gain information that the low-resolution image does not contain, and assist in introducing new gain information into the space-time fusion.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings needed in the embodiments are briefly described below. It is apparent that the drawings in the following description show only some embodiments of the present invention, and that other drawings can be obtained from them by a person skilled in the art without inventive effort.
FIG. 1 is a flow chart of the steps of an extensible image generation space-time fusion method with an information gain strategy according to an embodiment of the invention.
Detailed Description
The technical solutions in the embodiments of the present invention are described below clearly and completely with reference to the accompanying drawings. It is apparent that the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by a person skilled in the art on the basis of the embodiments of the invention without inventive effort fall within the scope of protection of the invention.
As shown in FIG. 1, the extensible image generation space-time fusion method with an information gain strategy according to the embodiment of the invention comprises the following steps:
Step S1: simulating the time sequence of images with Cycle-GAN to obtain a plurality of usable data sets and generate multi-stage iterative images;
Step S2: selecting, according to reference information, the multi-stage iterative image that is similar to the image at the prediction time;
Step S3: acquiring the gain information of the multi-stage iterative images and supplying it to the low-resolution image at the prediction time k as spatial information;
Step S4: acquiring the image information selected by wavelet transformation and predicting the spatial information in the space-time fusion.
Step S1 comprises the following steps:
Step S11: setting the number of iterations and acquiring the image information at time k-1 (before) and time k+1 (after) the prediction time k;
Step S12: taking the images at times k-1 and k+1 as training samples for Cycle-GAN, and selecting the image of one of these times as the generation sample to generate an image;
Step S13: iteratively obtaining a plurality of remote sensing images.
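A minimal sketch of the iterative generation loop of step S1 is given below. The Cycle-GAN generator is represented by a placeholder (cyclegan_generator is a hypothetical stand-in; the blur it applies is not the patent's network); the point is only how each generated sample is fed back to produce the multi-stage iterative images.

```python
import numpy as np

def cyclegan_generator(image: np.ndarray) -> np.ndarray:
    """Placeholder for the trained Cycle-GAN generator.

    A real implementation would run a trained network; a mild blur plus noise
    stands in here only so the sketch runs end to end.
    """
    rng = np.random.default_rng(42)
    blurred = 0.25 * (np.roll(image, 1, axis=0) + np.roll(image, -1, axis=0)
                      + np.roll(image, 1, axis=1) + np.roll(image, -1, axis=1))
    return 0.9 * blurred + 0.1 * rng.random(image.shape)

def generate_multistage_images(generation_sample: np.ndarray, n_iterations: int) -> list:
    """Step S1: repeatedly apply the generator, feeding each output back in,
    to build the multi-stage iterative images."""
    images, current = [], generation_sample
    for _ in range(n_iterations):
        current = cyclegan_generator(current)
        images.append(current)
    return images

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # The image at time k-1 (or k+1) chosen as the generation sample; random data here.
    generation_sample = rng.random((64, 64))
    candidates = generate_multistage_images(generation_sample, n_iterations=5)
    print(f"generated {len(candidates)} candidate images")
```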
Step S2 comprises the following steps:
Step S21: calculating the low-resolution image at the prediction time k;
Step S22: calculating the mutual information between the low-resolution image at the prediction time k and the low-resolution image at time k-1;
Step S23: calculating the mutual information between the low-resolution image at the prediction time k and the low-resolution image at time k+1;
Step S24: calculating the high-resolution image at the prediction time k;
Step S25: calculating the mutual information between the high-resolution image at the prediction time k and the high-resolution image at time k-1;
Step S26: calculating the mutual information between the high-resolution image at the prediction time k and the high-resolution image at time k+1;
Step S27: acquiring the images whose mutual information lies within the range defined by these mutual-information values and the factor λ, where λ represents the correspondence of the mutual information between the high and low resolutions and may take an empirical value;
Step S28: acquiring the image with the maximum information entropy among the selected images and setting it as the selected generated image.
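The patent does not give formulas for the mutual information or the information entropy used in steps S22-S28; the sketch below uses standard histogram-based estimates as one plausible reading, and the reference image and the bounds lo_bound/hi_bound (standing in for the λ-derived range of step S27) are assumed inputs rather than quantities defined in the text.

```python
import numpy as np

def mutual_information(img_a: np.ndarray, img_b: np.ndarray, bins: int = 64) -> float:
    """Histogram estimate of the mutual information I(A; B) between two images."""
    joint, _, _ = np.histogram2d(img_a.ravel(), img_b.ravel(), bins=bins)
    p_joint = joint / joint.sum()
    p_a = p_joint.sum(axis=1, keepdims=True)   # marginal of A (column vector)
    p_b = p_joint.sum(axis=0, keepdims=True)   # marginal of B (row vector)
    nz = p_joint > 0
    return float(np.sum(p_joint[nz] * np.log(p_joint[nz] / (p_a @ p_b)[nz])))

def entropy(img: np.ndarray, bins: int = 64) -> float:
    """Shannon entropy of an image's intensity histogram."""
    hist, _ = np.histogram(img.ravel(), bins=bins)
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-np.sum(p * np.log(p)))

def select_generated_image(candidates, reference, lo_bound, hi_bound):
    """Steps S27-S28 (one reading): keep candidates whose mutual information with the
    reference image falls inside [lo_bound, hi_bound], then return the candidate with
    maximum information entropy."""
    kept = [c for c in candidates
            if lo_bound <= mutual_information(c, reference) <= hi_bound]
    return max(kept, key=entropy) if kept else None
```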
Step S3 comprises the following steps:
Step S31: acquiring the information of the selected generated image and of the low-resolution image, performing a two-level wavelet decomposition on the selected generated image and a two-level wavelet decomposition on the low-resolution image;
Step S32: acquiring and recombining the second-level decomposition information of the selected generated image and the second-level decomposition information of the low-resolution image.
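Step S32 does not spell out how the second-level decomposition information is recombined; one plausible reading, sketched below with the PyWavelets library, keeps the low-frequency approximation of the low-resolution image and takes the detail (gain) coefficients from the selected generated image before reconstructing.

```python
import numpy as np
import pywt  # PyWavelets

def inject_gain_information(selected_generated: np.ndarray,
                            low_resolution: np.ndarray,
                            wavelet: str = "haar") -> np.ndarray:
    """Steps S31-S32 (one possible reading): two-level wavelet decomposition of both
    images, then recombination of the low-resolution approximation with the detail
    coefficients of the selected generated image."""
    coeffs_gen = pywt.wavedec2(selected_generated, wavelet, level=2)
    coeffs_low = pywt.wavedec2(low_resolution, wavelet, level=2)

    # coeffs[0] is the level-2 approximation; coeffs[1:] are the detail coefficient tuples.
    recombined = [coeffs_low[0]] + coeffs_gen[1:]
    return pywt.waverec2(recombined, wavelet)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    generated = rng.random((128, 128))
    low_res = rng.random((128, 128))  # assumed resampled to the same grid as the generated image
    fused = inject_gain_information(generated, low_res)
    print(fused.shape)
```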
To facilitate understanding of the above technical solutions of the present invention, they are described in detail below by way of a specific use case.
This embodiment of the invention also discloses the overall architecture and workflow of a cloud-storage-based multi-dimensional organization and management system for multi-source remote sensing image data, which comprises the following modules and flows:
1) The multi-center remote sensing data retrieval module consists of several parts, such as the FTP services of the distributed data centers, the branch-center crawlers, metadata mapping, and index-creation software. Metadata is aggregated at the main center through the branch-center crawlers, and metadata indexes are created with Solr to provide users with query and retrieval services over the remote sensing metadata of the multiple centers.
2) After the user subscribes to the required data through the multi-center remote sensing data retrieval module, the data is transmitted to the shared cloud storage through the FTP service provided by the sub-center, which ensures reliable data storage; the user then defines a view by selecting a view generation rule, thereby realizing the multidimensional organization of the multi-center remote sensing data.
3) While the user generates a view, the data view sharing module stores the user's view structure in the user virtual directory database and then shares the user's virtual directory structure with other users by querying that database, which avoids moving the real data and facilitates the multidimensional organization and management of the remote sensing data by other users.
When a specific user organizes and manages data in the cloud storage, the user builds a sharing code for the view data on the basis of OpenStack Swift object storage and shares the virtual directory structure with other users. A user may then access the remote sensing data resources over HTTP through the Swift RESTful API.
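For illustration, downloading a shared object through the Swift RESTful API could look like the sketch below; the storage URL, container, object name, and token are placeholder values, not values from this embodiment.

```python
import requests

# Placeholder values: a real deployment supplies its own storage URL, account,
# container, object name, and a token obtained from the identity service.
storage_url = "https://swift.example.com/v1/AUTH_demo"
container = "shared-views"
object_name = "landsat/scene_20191213.tif"
token = "gAAAAAB...example-token"

resp = requests.get(
    f"{storage_url}/{container}/{object_name}",
    headers={"X-Auth-Token": token},  # Swift authenticates each request with a token
    stream=True,
)
resp.raise_for_status()
with open("scene_20191213.tif", "wb") as f:
    for chunk in resp.iter_content(chunk_size=1 << 20):
        f.write(chunk)
```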
The cloud storage multidimensional data organization module is divided into a front-end visualization layer, which provides the data view generation, view sharing, and view data downloading services to users; a middle business-logic layer, which is mainly responsible for the basic functions of importing data into and exporting data from the cloud storage and for generating the logical structure of the data views; and a bottom layer, which is supported mainly by cluster-deployed OpenStack object storage and MySQL.
In concrete use, a user first searches the archived data of each sub-center and subscribes to data, while the sub-center imports the subscribed data into the main-center cloud storage; the user can then select view construction rules to generate a custom view, persist the view to a MySQL database, and share the data view and download links with other users through the cloud storage's view sharing and data downloading functions, completing the data preparation.
As described above, a cloud-storage-based multi-dimensional organization method for multi-source remote sensing image data is designed and implemented; it is suitable for remote sensing applications that require multi-dimensional characterization and analysis of large-scale remote sensing data. The method uses a unified management flow and cloud-platform-based user management, constructs automatic multidimensional data views, and shares and downloads virtual data views based on the cloud storage, which greatly improves data storage efficiency and simplifies the data preparation flow required by subsequent data analysis.
In summary, by means of the above technical solution, the method on the one hand provides a new idea for predicting spatial information in space-time fusion and a new solution to the difficulty of handling the correspondence caused by the excessively large information gap between the high and low spatial resolutions; on the other hand, the idea of simulating the time-series process with Cycle-GAN and the adversarial learning method help to reasonably predict time-ordered high-resolution images and generated images, help to generate images containing gain information that the low-resolution image does not contain, and assist in introducing new gain information into the space-time fusion.
The foregoing description of the preferred embodiments of the invention is not intended to be limiting, but rather is intended to cover all modifications, equivalents, alternatives, and improvements that fall within the spirit and scope of the invention.

Claims (2)

1. An extensible image generation space-time fusion method with an information gain strategy, which is characterized by comprising the following steps:
S1: simulating the time sequence of images with Cycle-GAN to obtain a plurality of usable data sets and generate multi-stage iterative images;
S2: selecting images by the value range of the mutual information and acquiring the image with the maximum information entropy;
the step S2 comprises the following steps:
S21: calculating the low-resolution image at the prediction time k;
S22: calculating the mutual information between the low-resolution image at the prediction time k and the low-resolution image at time k-1;
S23: calculating the mutual information between the low-resolution image at the prediction time k and the low-resolution image at time k+1;
S24: calculating the high-resolution image at the prediction time k;
S25: calculating the mutual information between the high-resolution image at the prediction time k and the high-resolution image at time k-1;
S26: calculating the mutual information between the high-resolution image at the prediction time k and the high-resolution image at time k+1;
S27: acquiring the images whose mutual information lies within the range defined by these mutual-information values and the factor λ, where λ represents the correspondence of the mutual information between the high resolution and the low resolution;
S28: acquiring the image with the maximum information entropy among the selected images and setting it as the selected generated image;
S3: acquiring the gain information of the multi-stage iterative images and supplying it to the low-resolution image at the prediction time k as spatial information;
S4: acquiring the image information selected by wavelet transformation and predicting the spatial information in the space-time fusion;
the step S4 comprises the following steps:
S41: acquiring the information of the selected generated image and of the low-resolution image, performing a two-level wavelet decomposition on the selected generated image and a two-level wavelet decomposition on the low-resolution image;
S42: acquiring and recombining the second-level decomposition information of the selected generated image and the second-level decomposition information of the low-resolution image.
2. The extensible image generation space-time fusion method with an information gain strategy according to claim 1, characterized in that the step S1 comprises the following steps:
S11: setting the number of iterations and acquiring the image information at time k-1 (before) and time k+1 (after) the prediction time k;
S12: taking the images at times k-1 and k+1 as training samples for Cycle-GAN, and selecting the image of one of these times as the generation sample to generate an image;
S13: iteratively obtaining a plurality of remote sensing images.
CN201911280551.7A 2019-12-13 2019-12-13 Extensible image generation space-time fusion method with information gain strategy Active CN111159310B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911280551.7A CN111159310B (en) 2019-12-13 2019-12-13 Extensible image generation space-time fusion method with information gain strategy

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911280551.7A CN111159310B (en) 2019-12-13 2019-12-13 Extensible image generation space-time fusion method with information gain strategy

Publications (2)

Publication Number Publication Date
CN111159310A CN111159310A (en) 2020-05-15
CN111159310B (en) 2023-09-29

Family

ID=70557063

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911280551.7A Active CN111159310B (en) 2019-12-13 2019-12-13 Extensible image generation space-time fusion method with information gain strategy

Country Status (1)

Country Link
CN (1) CN111159310B (en)


Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100944462B1 (en) * 2008-03-07 2010-03-03 한국항공우주연구원 Satellite image fusion method and system
TWI624804B (en) * 2016-11-07 2018-05-21 盾心科技股份有限公司 A method and system for providing high resolution image through super-resolution reconstrucion
US10474160B2 (en) * 2017-07-03 2019-11-12 Baidu Usa Llc High resolution 3D point clouds generation from downsampled low resolution LIDAR 3D point clouds and camera images

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102915529A (en) * 2012-10-15 2013-02-06 黄波 Integrated fusion technique and system based on remote sensing of time, space, spectrum and angle
CN105184076A (en) * 2015-09-02 2015-12-23 安徽大学 Space-time integrated fusion method for remote sensing earth surface temperature data
WO2017219263A1 (en) * 2016-06-22 2017-12-28 中国科学院自动化研究所 Image super-resolution enhancement method based on bidirectional recursion convolution neural network
EP3564903A1 (en) * 2018-05-01 2019-11-06 Koninklijke Philips N.V. Lower to higher resolution image fusion
CN110532897A (en) * 2019-08-07 2019-12-03 北京科技大学 The method and apparatus of components image recognition

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Chen B et al. Comparison of Spatiotemporal Fusion Models: A Review. Remote Sensing. 2015, Vol. 7, No. 2, 1798-1835. *
Huang Bo; Zhao Yongquan. Current status and prospects of research on spatiotemporal fusion of multi-source satellite remote sensing images. Acta Geodaetica et Cartographica Sinica. 2017, (10), full text. *

Also Published As

Publication number Publication date
CN111159310A (en) 2020-05-15

Similar Documents

Publication Publication Date Title
Sudmanns et al. Big Earth data: disruptive changes in Earth observation data management and analysis?
Baumann et al. Big data analytics for earth sciences: the EarthServer approach
Planthaber et al. EarthDB: scalable analysis of MODIS data using SciDB
Ahmad et al. An efficient divide-and-conquer approach for big data analytics in machine-to-machine communication
CN109918478A (en) The method and apparatus of knowledge based map acquisition geographic products data
US20160299910A1 (en) Method and system for querying and visualizing satellite data
CN109657081B (en) Distributed processing method, system and medium for hyperspectral satellite remote sensing data
Karantzalos et al. A scalable geospatial web service for near real-time, high-resolution land cover mapping
EP2539833A2 (en) Portable globe creation for a geographical information system
AU2021217210A1 (en) Estimation of crop type and/or sowing date
US20060276968A1 (en) Temporal mapping and analysis
US11360970B2 (en) Efficient querying using overview layers of geospatial-temporal data in a data analytics platform
CN110110107A (en) A kind of Methods on Multi-Sensors RS Image various dimensions method for organizing based on cloud storage
Pokorný et al. Big data movement: a challenge in data processing
WO2022216521A1 (en) Dual-flattening transformer through decomposed row and column queries for semantic segmentation
WO2019148104A1 (en) Cloud computing flexible large area mosaic engine
CN109472343A (en) A kind of improvement sample data missing values based on GKNN fill up algorithm
CN111159310B (en) Extensible image generation space-time fusion method with information gain strategy
Fan et al. Capability representation model for heterogeneous remote sensing sensors: Case study on soil moisture monitoring
Dissauer et al. Properties of Flare-imminent versus Flare-quiet Active Regions from the Chromosphere through the Corona. I. Introduction of the AIA Active Region Patches (AARPs)
CN110022541A (en) A kind of sparse acquisition of WSN crop growth environment information and transmission method based on NB-IoT and FPGA
US20190172259A1 (en) Methods and Systems for Reconstructing GIS Scenes
Happ et al. Towards distributed region growing image segmentation based on MapReduce
Wyborn et al. Integrating 'Big' geoscience data into the petascale national environmental research interoperability platform (NERDIP): Successes and unforeseen challenges
Solomon et al. Sparsity based super-resolution optical imaging using correlation information

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant