AU2015258202B2 - Image generation system and image generation method - Google Patents


Publication number
AU2015258202B2
Authority
AU
Australia
Prior art keywords
image
processor
correction parameters
resolution
missing
Prior art date
Legal status
Active
Application number
AU2015258202A
Other versions
AU2015258202A1 (en)
Inventor
Tao Guo
Osamu Nishiguchi
Current Assignee
Hitachi Solutions Ltd
Original Assignee
Hitachi Solutions Ltd
Priority date
Filing date
Publication date
Application filed by Hitachi Solutions Ltd filed Critical Hitachi Solutions Ltd
Publication of AU2015258202A1 publication Critical patent/AU2015258202A1/en
Application granted granted Critical
Publication of AU2015258202B2 publication Critical patent/AU2015258202B2/en

Landscapes

  • Image Processing (AREA)
  • Image Analysis (AREA)

Abstract

An image generation system which generates an image including no temporal and spatial missing images, includes: a processor that executes a program; and a memory that stores the program executed by the processor. The processor calculates spectral correction parameters for transformation between a first image and a second image using the first image and the second image of different resolutions having captured the same area on the ground; calculates temporal change correction parameters for correction of temporal changes using two second images captured at different times; and generates a missing image of the first image from the second image captured at a time different from the first image using the spectral correction parameters and the temporal change correction parameters.

(Abstract figure, FIG. 1: a timeline of satellite image sequences 1-3 showing a spectral change d1 between the sequences, a temporal change d2 within sequence 2, and a simulated cloud area B1(t) = B2(t1) + d2 + d1; legend: 0: non-cloud area, 1: cloud area.)

Description

TECHNICAL FIELD
[0001] The present invention generally relates to an image generation system that generates an image.
BACKGROUND
[0001a] In images taken from the air by satellites, aircraft or the like, objects on the ground are not captured in areas covered with obstacles, such as cloud-covered regions. In particular, some applications require information observed for a specified period of time. For example, in the field of agriculture, information for the crop-growing period is essential for growth management or yield prediction.
[0002] Medium- to high-resolution images provide detailed information on crops in sufficient image quality and are therefore widely used in the field of agriculture. However, with medium- to high-resolution images, the same region is observed only at a low frequency (in many cases, once every two weeks or more), and, further, such images are affected by the weather (for example, clouds). This observing interval is referred to as the observation time window (acquisition window period). It is therefore difficult to obtain useful satellite images.
[0003] Furthermore, the recurrence period of a satellite depends on the orbit in which the satellite is operated. Satellites capturing medium- to high-resolution images are operated in medium- to low-altitude orbits, and, due to technical difficulties, it is difficult to increase their recurrence frequency. For example, the recurrence frequency can be increased by increasing the number of satellites, but this is costly and hardly practical.
[0004] On the other hand, the recurrence frequency of an observation satellite operated in a high-altitude orbit is high (for example, every day). Such a high-altitude satellite provides, at a high temporal frequency, low-resolution images containing no detailed spatial information. In most cases, the low-resolution images are low cost and freely available.
[0005] While a satellite image free of influence from (or less influenced by) clouds is very useful, there are few opportunities to obtain one. Technology for removing clouds from images has therefore been required. The related art includes the following two approaches. One is a method of replacing a cloud-covered area with another cloud-free image of the same area; however, the temporal information of the cloud-covered area and that of the cloud-free image do not match. The other is a method of complementing a cloud-covered area using the information of its surrounding area in the same image, that is, replacement by a cloud-free image of another area; however, the spatial information of the cloud-covered area and that of the cloud-free image do not match.
[0006] The background art includes: Nonpatent Literature 1, Yanfeng Gu and five others, "Multiple-kernel learning-based unmixing algorithm for estimation of cloud fractions with MODIS and CloudSat data," Geoscience and Remote Sensing Symposium (IGARSS), 2012 IEEE International, pp. 1785-1788, July 2012; and Nonpatent Literature 2, Xiaolin Zhu and three others, "A Modified Neighborhood Similar Pixel Interpolator Approach for Removing Thick Clouds in Landsat Images," Geoscience and Remote Sensing Letters, IEEE (Volume: 9, Issue: 3), pp. 521-525, May 2012. Nonpatent Literature 1 discloses a method for separating the influence of clouds from sub-pixels of a cloud-covered area by applying spectral unmixing. Nonpatent Literature 2 discloses a method for removing clouds by interpolation using adjacent similar pixels, because the nonavailability of images due to thick clouds is a problem common to ground observation satellites.
[0007] Finding a reliable clue in a wealth of information is the key to recovering information that is not available due to clouds. In the above-described first method, images taken at different times are used, and temporal changes in the target region are not considered. In the second method, the information of the target region is assumed to be similar to that of its surrounding region, but no actual information of the target region is contained.
[0008] It is desired to address or ameliorate one or more disadvantages or limitations associated with the prior art, or to at least provide a useful alternative.
SUMMARY
[0009] An embodiment of the present invention provides an image generation system which generates an image including no temporal and spatial missing images, comprising: a processor that executes a program; and a memory that stores the program executed by the processor, wherein the processor: calculates spectral correction parameters for transformation between a first image and a second image using the first image and the second image of different resolutions having captured the same area on the ground; calculates temporal change correction parameters for correction of temporal changes using two second images captured at different times; and generates a missing image of the first image from the second image captured at a time different from the first image using the spectral correction parameters and the temporal change correction parameters.
[0010] An embodiment of the present invention provides an image generation method for generating an image including no temporal and spatial missing images using a computer, the computer having a processor that executes a program and a memory that stores the program executed by the processor, the method comprising the steps of: calculating, in the processor, spectral correction parameters for transformation between a first image and a second image using the first image and the second image of different resolutions having captured the same area on the ground; calculating, in the processor, temporal change correction parameters for correction of temporal changes using two second images captured at different times; and generating, in the processor, a missing image of the first image from the second image captured at a time different from the first image using the spectral correction parameters and the temporal change correction parameters.
BRIEF DESCRIPTION OF DRAWINGS
[0011] Some embodiments of the present invention are hereinafter described, by way of non-limiting example only, with reference to the accompanying drawings, in which:
[0012] FIG. 1 illustrates the outline of an image generation method according to an embodiment of the present invention;
FIG. 2A shows formulas for transformation between images according to this embodiment;
FIG. 2B shows formulas for transformation between images according to this embodiment;
FIG. 3 is a flowchart of a method for estimating an existing cloud-covered area B1°(t);
FIG. 4 is a detailed flowchart of the calculation process (step S1) for the spectral correction parameters d1;
FIG. 5 is a detailed flowchart of the calculation process (step S2) for the temporal change correction parameters d2;
FIG. 6 illustrates the process of step S23;
FIG. 7 is a flowchart of a method for estimating a satellite image A1°(tj) at an arbitrary time;
FIG. 8 is a detailed flowchart of the creation process (step S6) for the image A1°(tj);
FIG. 9 is a block diagram showing a logical configuration of an image generation system according to this embodiment; and
FIG. 10 is a block diagram showing a physical configuration of the image generation system according to this embodiment.
DETAILED DESCRIPTION
[0012a] Described herein is a method for deriving missing information in consideration of temporal and spatial clues.
[0012b] In a representative aspect, described herein is an image generation system which generates an image including no temporal and spatial missing images, the image generation system including: a processor that executes a program; and a memory that stores the program executed by the processor. The processor calculates spectral correction parameters for transformation between a first image and a second image using the first image and the second image of different resolutions having captured the same area on the ground; calculates temporal change correction parameters for correction of temporal changes using two second images captured at different times; and generates a missing image of the first image from the second image captured at a time different from the first image using the spectral correction parameters and the temporal change correction parameters.
[0012c] According to the representative aspect, it may be possible to precisely derive the information of a cloud region. Problems, configurations, and advantages other than the foregoing will be disclosed by the embodiment described below.
[0013] Firstly, the outline of an embodiment of the present invention will be described.
[0014] This embodiment relates to a computer system which generates an image containing no areas covered with obstacles (for example, a satellite image without clouds), wherein the missing information of reflected light from an object on the ground in a cloud area of a satellite image is recovered from other time-series satellite images. In particular, it is possible to eliminate the influence of clouds at important observation times. The usefulness of satellite images is therefore improved.
[0015] The main point of this embodiment is that a temporal change is derived using a non-cloud area and applied to a cloud area according to spatial similarity. More specifically, in order to determine a temporal change of an image, time-series satellite images are used, and the determined temporal change is applied to the area to be processed to form a composite low-resolution image without clouds (at the same time as the high-resolution image with clouds). Then the low-resolution image is projected onto the high-resolution image to create the information required in the cloud area.
[0016] FIG. 1 illustrates the outline of an image generation method according to an embodiment of the present invention.
[0017] In this embodiment, a high-resolution satellite image 1 and a low-resolution satellite image 2 are used. As described above, the high-resolution image is an image that is high in photography costs and captured less frequently, for example, about once a month by a low-altitude satellite or aircraft. On the other hand, the low-resolution satellite image is an image that is low in photography costs and captured frequently, for example, several times a day by a high-altitude satellite or aircraft.
[0018] This embodiment will be described using a satellite image taken by a satellite. However, the present invention can also be applied to images taken by aircraft. This embodiment can also be used for recovering the image of an area that could not be photographed due to a temporary breakdown of the satellite.
[0019] In the satellite images 1 and 2 taken at a certain time t, an object (such as a building or vegetation) on the ground could not be captured because there were clouds in an area B. In the satellite image 2 taken at a time t1, on the other hand, the object on the ground could be captured because there were no clouds in the area B. Further, since the time t1 does not correspond to the photography period of the satellite image 1, no high-resolution image was captured at t1.
[0020] Under such circumstances, according to the present invention, an image B1°(t) with the clouds removed at the time t can be generated by simulation using spectral changes d1 from a satellite image A1°(t) of the image sequence 1 to a satellite image A2°(t) of the image sequence 2, and temporal changes d2 from a satellite image A2°(t1) to the satellite image A2°(t) of the image sequence 2.
[0021] Further, with the method of the present invention, from a satellite image A2°(tj) of the image sequence 2 at an arbitrary time tj as obtained by simulation, a high-resolution image A1°(tj) at the time tj can be generated by simulation using the spectral changes d1 from the satellite image A1°(t) of the image sequence 1 to the satellite image A2°(t), and a temporal change d3 from the satellite image A2°(t) to A2°(tj) of the image sequence 2.
[0022] That is, by using the method of the present invention, it is possible to determine, with high accuracy, a satellite image at an arbitrary time, including future times.
[0023] FIG. 2A shows formulas for transformation between images according to this embodiment.
[0024] Formula (1) shows that the transformation between a satellite image A1 of the image sequence 1 and a satellite image A2 of the image sequence 2 can be performed with spectral correction parameters d1. The satellite image sequences 1 and 2 taken at the same time t have captured the same object on the ground with different resolutions using different sensors. Therefore, the captured images vary according to the difference in the frequency characteristics and photographic sensitivities of the sensors. For this reason, the difference between images caused by the sensors is corrected by the spectral correction parameters d1.
[0025] Formula (2) shows that the transformation between satellite images A2(t) and A2(t1) of the image sequence 2 can be performed with temporal change correction parameters d2; that is, the transformation between the satellite images 2 taken at the different times t and t1 is performed by the temporal changes d2. The temporal change correction parameters d2 may instead be calculated using two satellite images of the image sequence 1. It should be noted that the temporal changes d2 are a function of the difference t1 - t between the times when the images are taken.
[0026] Formula (3) is a formula for transformation between the satellite image sequences 1 and 2 taken at different times, obtained by combining the formulas (1) and (2). That is, the transformation between the satellite images A2(t) and A2(t1) of the image sequence 2 is performed by the temporal changes d2, and the transformation between the satellite image A1 of the image sequence 1 and the satellite image A2 of the image sequence 2 is performed by the spectral differences d1. Consequently, the transformation between the satellite image A2(t1) of the image sequence 2 and a satellite image A1(t) of the image sequence 1 can be performed by the spectral differences d1 and the temporal change correction parameters d2.
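The composition just described can be sketched with simple per-pixel array arithmetic. The following is a minimal illustration, not the patent's implementation: it assumes both images are already co-registered and resampled to a common grid, and reads d1 and d2 as additive per-pixel offsets (the embodiment allows more general correction functions). The function names are illustrative.

```python
import numpy as np

def spectral_offset(a1_t, a2_t):
    """Formula (1), read additively: per-pixel spectral offset d1 between
    the two co-registered images taken at the same time t."""
    return a1_t - a2_t

def temporal_offset(a2_t1, a2_t):
    """Formula (2), read additively: per-pixel temporal change d2 of image
    sequence 2 between times t1 and t."""
    return a2_t - a2_t1

def estimate_a1_t(a2_t1, d1, d2):
    """Formula (3): transform A2(t1) into an estimate of A1(t) by applying
    the temporal change d2 and then the spectral offset d1."""
    return a2_t1 + d2 + d1
```

Under this additive reading, applying `estimate_a1_t` to a cloud-free A2(t1) reconstructs the pixels that are missing in A1(t).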
[0027] FIG. 2B shows a method for image transformation between images with different resolutions according to this embodiment.
[0028] If the resolution R1 of the satellite image 1 is lower than the resolution R2 of the satellite image 2, or if the resolution R1 of the satellite image 1 is the same as the resolution R2 of the satellite image 2, the pixels of the satellite image 1 can be determined from the pixels of the satellite image 2.
[0029] On the other hand, if the resolution R1 of the satellite image 1 is higher than the resolution R2 of the satellite image 2, each pixel of the image 2 is a mixture of several pixels of the image 1. Therefore, the pixels of the satellite image 1 are estimated from the pixels of the satellite image 2. For example, data in which the characteristic values of a plurality of pixels of the high-resolution satellite image 1 are associated with the characteristic value of one pixel of the low-resolution satellite image 2 may be created in order to estimate the pixels of the high-resolution satellite image 1 from the pixels of the low-resolution satellite image 2.
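The association between one low-resolution pixel and the block of high-resolution pixels it covers can be sketched under a simple linear mixture assumption, where each low-resolution pixel is modelled as the mean of the high-resolution pixels inside its footprint. This is an illustrative simplification assuming an integer resolution ratio; the embodiment itself constructs a learned spectral mixture model in step S13.

```python
import numpy as np

def block_means(high_res, factor):
    """Simulate a low-resolution image from a high-resolution one under a
    linear mixture assumption: each low-res pixel is the mean of the
    factor x factor block of high-res pixels it covers."""
    h, w = high_res.shape
    assert h % factor == 0 and w % factor == 0, "integer ratio assumed"
    return high_res.reshape(h // factor, factor,
                            w // factor, factor).mean(axis=(1, 3))
```

The inverse direction (estimating the high-resolution pixels from one mixed low-resolution value) is underdetermined, which is why the embodiment needs the learned characteristic relationship model rather than simple arithmetic.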
[0030] FIG. 3 is a flowchart of a method for estimating an existing cloud-covered area B1°(t).
[0031] Firstly, the spectral correction parameters d1 are calculated using non-cloud areas of the satellite image A1 of the image sequence 1 and the satellite image A2 of the image sequence 2 at the time t (step S1). The details of the process of the step S1 will be described later using FIG. 4.
[0032] Next, the temporal change correction parameters d2 are calculated using two satellite images A2 taken at different times (step S2). In the step S2, a function f2 is a temporal change model for each kind of object on the ground. The details of the process of the step S2 will be described later using FIG. 5. It should be noted that the temporal change correction parameters d2 may instead be calculated using two satellite images A1 of the image sequence 1.
[0033] Finally, the pixels in a cloud area are calculated from a non-cloud image B2°(t1) with a different resolution at a different time using the correction parameters d1 and d2, and the non-cloud image B1°(t) is created (step S3).
[0034] FIG. 4 is a detailed flowchart of the calculation process (step S1) for the spectral correction parameters d1.
[0035] Firstly, in the step S1, the spatial positions of the image A1(t) and the image A2(t) at the time t are aligned (step S11). This positional alignment between the two images is performed by specifying, on each of the two images, a GCP (ground control point, such as a cross point or edge point) provided on the ground, and aligning the positions of the specified ground control points with each other.
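The GCP-based alignment of step S11 can be sketched as a least-squares fit of an affine transform that maps the GCP coordinates specified on one image onto the matching coordinates on the other. The patent does not specify the transform model, so the affine choice and the function names are assumptions for illustration.

```python
import numpy as np

def fit_affine(src_pts, dst_pts):
    """Fit a 2-D affine transform mapping GCPs in one image (src) onto the
    matching GCPs in the other (dst), by least squares.
    Returns a 2x3 matrix M such that [x', y']^T = M @ [x, y, 1]^T."""
    src = np.asarray(src_pts, dtype=float)
    dst = np.asarray(dst_pts, dtype=float)
    A = np.hstack([src, np.ones((len(src), 1))])      # n x 3 design matrix
    M, *_ = np.linalg.lstsq(A, dst, rcond=None)        # 3 x 2 solution
    return M.T                                         # 2 x 3

def apply_affine(M, pts):
    """Apply the fitted 2x3 affine transform to an array of (x, y) points."""
    pts = np.asarray(pts, dtype=float)
    return (M @ np.hstack([pts, np.ones((len(pts), 1))]).T).T
```

With three or more non-collinear GCP pairs the fit is exact for a true affine distortion; additional GCPs are averaged in the least-squares sense.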
[0036] Then atmospheric correction of the image A1(t) and the image A2(t) is performed (step S12). In the atmospheric correction, radiative transfer is calculated using atmospheric information obtained from ground observation equipment, and atmospheric effects, such as scattering and absorption, are removed.
[0037] Thereafter, an image A2(t)' with the same resolution as the image A2(t) is created from the pixels of the image A1(t) using a spectral mixture model (step S13). More specifically, a distribution model for the high-resolution image and a distribution model for the low-resolution image are generated using LDA, and a characteristic relationship model in which the characteristic values of a plurality of pixels of the high-resolution image are associated with the characteristic value of one pixel of the low-resolution image is constructed. Then the relationship with the pixels of the high-resolution image is estimated from the pixels of the low-resolution image using the constructed characteristic relationship model.
[0038] Then the spectral correction parameters d1 are calculated on the basis of the difference between the spectrum of each pixel of the image A2(t) and the spectrum of each pixel of the generated image A2(t)' (step S14). The spectral correction parameters d1 are calculated for each spectrum of the low-resolution image A2.
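Step S14 can be sketched as follows, assuming d1 is summarised as one additive offset per spectral band, computed over non-cloud pixels only. This illustrates the difference computation, not the patent's exact parameterisation; the function name is an assumption.

```python
import numpy as np

def band_offsets(a2, a2_prime, clear_mask):
    """Per-band spectral correction d1: for each band, the mean difference
    between the observed image A2(t) and the image A2(t)' generated from
    A1(t), restricted to non-cloud pixels.
    a2, a2_prime: arrays of shape (bands, h, w); clear_mask: (h, w) bool."""
    diff = a2 - a2_prime
    return np.array([band[clear_mask].mean() for band in diff])
```

Restricting the mean to `clear_mask` matters: cloud pixels carry no ground signal, so including them would bias the correction.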
[0039] FIG. 5 is a detailed flowchart of the calculation process (step S2) for the temporal change correction parameters d2.
[0040] Firstly, in the step S2, the non-cloud image A2°(t1) of the satellite image sequence 2 at the time t1 is obtained (step S21).
[0041] Next, the temporal change correction parameters are calculated from each pixel at the time t1 and the time t in a non-cloud area (step S22). More specifically, the difference between the spectrum of each pixel of the image A2°(t) and the spectrum of each pixel of the image A2°(t1) is calculated. Thus, a temporal change model can be calculated for the spectrum of each pixel. This is possible because pixels with the same spectrum are estimated to capture the same object on the ground, and the same object on the ground changes similarly in spectrum with the passage of time. In the formula, f2 is the function for determining the temporal change correction parameters d2 from the spectrum of each pixel. It should be noted that d2 denotes the parameters determined by the spectrum of each pixel of the image A2°(t1).
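One way to sketch this per-spectrum temporal change model is to quantise the spectrum at time t1 into bins and take the mean change within each bin, so that pixels with (nearly) the same spectrum share the same modelled change. The binning scheme and function name are assumptions for illustration; the embodiment only requires that f2 map a spectrum to its change.

```python
import numpy as np

def temporal_change_by_spectrum(img_t1, img_t, n_bins=8):
    """Approximate the temporal change model f2: pixels with (nearly) the
    same spectral value at t1 are assumed to capture the same kind of
    object, so the model is the mean change within each quantised bin.
    Returns (bin index per pixel, {bin: mean change})."""
    lo, hi = img_t1.min(), img_t1.max()
    bins = np.clip(((img_t1 - lo) / (hi - lo + 1e-9) * n_bins).astype(int),
                   0, n_bins - 1)
    change = img_t - img_t1
    model = {b: change[bins == b].mean() for b in np.unique(bins)}
    return bins, model
```

A cloud pixel whose t1 spectrum falls in a given bin then inherits that bin's mean change when its value at time t is simulated.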
[0042] A pixel Pi in a cloud area is estimated using pixels adjacent to the cloud area (step S23).
[0043] FIG. 6 illustrates the process of step S23.
[0044] A circle of a predetermined size is created about the pixel Pi to be estimated; the created circle includes n pixels. The reciprocal of the distance from the pixel Pi to the j-th pixel A2°(j) of the image A2°(t) is used as a weighting factor kj, where j indexes the pixels included in the created circle. The spectrum of each pixel A2°(j) included in the circle is multiplied by its weighting factor, and the sum of the products is calculated. The spectrum of the pixel Pi to be estimated can then be calculated by dividing the sum by the number n of pixels.
[0045] In this manner, the pixels of the cloud area can be estimated using the spatial similarity by using the fact that the objects on the ground expressed by adjacent pixels are highly likely to be similar, that is, by performing weighting according to the reciprocal kj of the distance between the pixels.
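The neighbourhood estimation of step S23 is essentially inverse-distance weighting. The sketch below follows the description, with one caveat: the text divides the weighted sum by the number of pixels n, whereas the conventional inverse-distance-weighting normalisation, used here, divides by the sum of the weights so that a uniform neighbourhood yields its own value. Function and parameter names are assumptions.

```python
import numpy as np

def idw_estimate(image, clear_mask, i, j, radius):
    """Estimate the value of cloud pixel (i, j) from cloud-free pixels
    within a circle of the given radius, weighting each neighbour by the
    reciprocal of its distance (the factor k_j).
    Note: normalised by the sum of the weights, the usual IDW convention."""
    h, w = image.shape
    num, den = 0.0, 0.0
    for y in range(max(0, i - radius), min(h, i + radius + 1)):
        for x in range(max(0, j - radius), min(w, j + radius + 1)):
            d = np.hypot(y - i, x - j)
            if d == 0 or d > radius or not clear_mask[y, x]:
                continue  # skip the target itself, out-of-circle, cloud
            k = 1.0 / d
            num += k * image[y, x]
            den += k
    return num / den if den > 0 else np.nan
```

Because nearer pixels get larger weights, the estimate leans on the neighbours most likely to show the same ground object, which is exactly the spatial-similarity assumption stated above.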
[0046] It should be noted that, if the objects on the ground in the cloud area have already been identified by observations on the ground or the like, a change in each pixel may be calculated using the temporal change model f2 determined for each kind of object. For example, if the objects are identified as wheat through the spectrum of the pixels, the pixels of the cloud area are estimated using a growing model showing the relationship between the growing condition of wheat and changes in spectrum.
[0047] Then parameters dP for correcting temporal changes in the cloud area are calculated using the temporal change correction parameters d2 and the estimated pixel Pi (step S24). Specifically, over all pixels, the temporal change correction parameters d2 and the spectrum of the pixel Pi are multiplied, and the sum of the products is calculated. That is, the function fd is a function for calculating the sum of Pi × d2 over all i.
[0048] Thus, the temporal change correction parameters dP(d2), with the two satellite images A2°(t) and A2°(t1) taken at different times as parameters, can be calculated.
[0049] While the image generation method according to the present invention has been described above using the estimation in an area covered with clouds as an example of obstacles, the present invention can also be applied to images including areas that could not be photographed under the influence of obstacles other than clouds.
[0050] Next, a method for estimating the satellite image at an optional time will be described.
[0051] FIG. 7 is a flowchart of a method for estimating the satellite image A1°(tj) at an arbitrary time.
[0052] Firstly, the spectral correction parameters d1 are calculated using non-cloud areas of the satellite image A1 at the time t of the image sequence 1 and the satellite image A2 at the time t of the image sequence 2 (step S1). The details of the process of the step S1 have been described using FIG. 4.
[0053] Then the image A2°(tj) of the image sequence 2 at the desired time tj is estimated (step S4). An existing method can be used for this estimation. For example, the image A2°(tj) can be calculated from the image A2°(t) by applying the temporal change correction parameters d2 calculated in the step S2 of FIG. 3 to the time difference tj - t.
[0054] Thereafter the temporal change correction parameters d3 are calculated using two satellite images A2 taken at different times (step S5). It should be noted that the process of the step S5 is the same as that of the above-described step S2 (FIGS. 3 and 5), and the function f2 is a temporal change model for each kind of object on the ground.
[0055] It should be noted that the temporal change correction parameters d3 may be determined from two existing images A2 of the image sequence 2 and further scaled according to the time difference. For example, the temporal change correction parameters d3 may be calculated by applying the temporal change correction parameters d2 calculated in the step S2 of FIG. 3 to the time difference tj - t.
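A minimal reading of this scaling is a linear temporal-change model: the change d2 observed between t and t1 is rescaled to the desired time difference tj - t to give d3. This is one plausible choice of the model f2, not the only one the embodiment allows (a crop-growth model, for instance, would be nonlinear). The function name is an assumption.

```python
import numpy as np

def scale_temporal_change(d2, t, t1, tj):
    """Linearly rescale the observed change d2 (between times t and t1)
    to the desired time difference tj - t, yielding d3."""
    return d2 * (tj - t) / (t1 - t)
```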
[0056] Finally, the pixels of the image at the time tj are calculated from the image A1°(t) at the time t using the temporal change correction parameters d3 to create the image A1°(tj). Furthermore, the pixels of a cloud area are calculated from the non-cloud image B2°(t1) using the correction parameters d1 and d3 to create a non-cloud image B1°(tj) (step S6).
[0057] FIG. 8 is a detailed flowchart of the creation process (step S6) for the image A1°(tj).
[0058] Firstly, in the image sequence 1, the image A1°(tj) at the time tj is created from the image A1°(t) using the temporal change correction parameters d3 (step S61). However, the image A1°(tj) created at this stage includes a cloud area in the same manner as the original image A1°(t).
[0059] Then the pixels of the cloud area are calculated from the non-cloud image B2°(t1) using the correction parameters d1 and d3 to create the non-cloud image B1°(tj) with a different resolution (step S62).
[0060] Then the created non-cloud image B1°(tj) is incorporated into the image A1°(tj), thereby creating the non-cloud image A1°(tj) (step S63).
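The incorporation of step S63 can be sketched as a masked paste: pixels flagged as cloud in A1°(tj) are replaced by the corresponding pixels of the simulated non-cloud image B1°(tj), and all other pixels are left untouched. Array shapes and the boolean-mask convention are assumptions for illustration.

```python
import numpy as np

def composite(a1_tj, b1_tj, cloud_mask):
    """Paste the simulated patch B1(tj) into the cloud area of A1(tj).
    cloud_mask is a boolean array, True where clouds hide the ground."""
    out = a1_tj.copy()           # leave the input untouched
    out[cloud_mask] = b1_tj[cloud_mask]
    return out
```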
[0061] FIG. 9 is a block diagram showing a logical configuration of an image generation system according to this embodiment.
[0062] The image generation system according to this embodiment is a computer, which has an input unit 110, a display unit 120, a data recording unit 130, a calculation unit 140, a data output unit 150, and a storage unit 160. The input unit 110, the display unit 120, the data recording unit 130, the data output unit 150, and the storage unit 160 are connected via the calculation unit 140 (or mutually via a bus).
[0063] The input unit 110 has an image input unit 111 and a map information input unit 112. The image input unit 111 is a device into which high-resolution images and low-resolution images are input, and the map information input unit 112 is a device into which map information is input. The image input unit 111 and the map information input unit 112 are configured from devices for receiving data input, such as an optical disk drive and a USB interface, and human interfaces, such as a keyboard, a touch panel and a mouse. It should be noted that the image input unit 111 and the map information input unit 112 may be configured from the same input device or different input devices.
[0064] The display unit 120 has an image display unit 121 and a map information display unit 122. The image display unit 121 is a display device which displays an image to be processed. The map information display unit 122 is a display device that displays input map information and is used for specifying a ground control point (GCP), such as a cross point or an edge point, in a map. It should be noted that the image display unit 121 and the map information display unit 122 may be configured from the same display device or different display devices.
[0065] The data recording unit 130 is a nonvolatile storage device which stores the image data to be processed by this image generation system, and is configured, for example, from a hard disk device or a nonvolatile memory. The calculation unit 140 includes a processor and executes the processing of the image generation system by executing a program.
[0066] The data output unit 150 is a device which outputs the results processed by the image generation system, and is configured, for example, from a printer or a plotter. The storage unit 160 is a storage device which stores the program to be executed by the calculation unit 140, and is configured, for example, from a hard disk device or a nonvolatile memory.
[0067] FIG. 10 is a block diagram showing a physical configuration of the image generation system according to this embodiment.
[0068] The image generation system according to this embodiment has a calculation unit 10, a storage device 20, a communication interface 30, and a medium driver 40.
[0069] The calculation unit 10 has a processor (CPU) 101 that executes a program, a ROM 102 that is a nonvolatile storage element, and a RAM 103 that is a volatile storage element. The ROM 102 stores non-changing programs (for example, BIOS). The RAM 103 temporarily stores the program stored in the storage device 20 and the data to be used at the time of executing the program.
[0070] The program executed by the calculation unit 10 is provided to the computer via removable media (such as CD-ROMs or flash memories) or a network, and is stored in the storage device, which is a non-transitory storage medium. Therefore, the computer constituting the image generation system preferably has an interface for reading data from removable media.
[0071] The storage device 20 is a large-capacity and nonvolatile storage device, such as a magnetic-storage device or a flash-memory device, and stores the program to be executed by the processor 101 and the data to be used at the time of executing the program. That is, the program executed by the processor 101 is read from the storage device 20, loaded in the RAM 103, and executed by the processor 101.
[0072] The communication interface 30 is a network interface device that controls the communication with other devices according to a predetermined protocol. The medium driver 40 is an interface (such as an optical disk drive or a USB port) for reading a storage medium 50 that stores the program or data introduced in the image generation system.
[0073] The image generation system according to this embodiment is a computing system which is configured on a single physical computer or on a plurality of logically or physically configured computers, and may operate in separate threads on the same computer or operate on virtual computers built on a plurality of physical computer resources. For example, a sensing provider supplying aerial photographs may provide the image generation system in a cloud environment.
[0074] The image generation system according to this embodiment may be implemented on a stand-alone computer or on a client-server computing system. When the image generation system is implemented on the client-server computing system, a server executes the calculation processing, and a client receives input data and outputs calculation results.
[0075] As described above, according to the embodiment of the present invention, an image covered with obstacles, such as clouds, can be accurately recovered with high reliability. In particular, by incorporating a large amount of information on spatial and temporal changes, the image of a cloud-covered area can be recovered with high accuracy. Furthermore, because the influence of clouds in the high-resolution image is reduced and the high-resolution image can be obtained at short time intervals, the high-resolution image can be improved in usefulness and reduced in acquisition cost.
[0076] Also, in addition to the reduction in the influence of clouds, the satellite image at an optional time, including a future time, can be determined with high accuracy.
[0077] In this manner, it is possible to construct an image generation system which generates high-resolution images efficiently and at low cost.
[0078] It should be understood that the present invention is not limited to the above-described embodiment, but can include various modifications and equivalent configurations within the scope of the appended claims. For example, the above-described embodiment has been described in detail in order to explain the present invention clearly, and the present invention is not necessarily limited to an embodiment including all the elements described above. Furthermore, part of the configuration of one embodiment may be replaced by the configuration of another embodiment. Moreover, the configuration of another embodiment may be added to the configuration of one embodiment. Additionally, the addition, deletion, or replacement of another configuration may be made to part of the configuration of each embodiment.
[0079] Furthermore, the above-described elements, functions, processing units, processing means, or the like may be implemented in whole or in part by hardware, for example by designing an integrated circuit, or may be implemented by software, wherein the processor interprets and executes a program for performing each function.
[0080] The information, such as programs, tables and files, for implementing each function can be stored in a storage device, such as a memory, hard disk or SSD (Solid State Drive), or a recording medium such as an IC card, SD card or DVD.
[0081] Further, only the control lines and information lines considered necessary for explanation are shown; not all the control lines and information lines required for implementation are shown. In practice, almost all elements may be considered to be connected to one another.
[0082] The reference in this specification to any prior publication (or information derived from it), or to any matter which is known, is not, and should not be taken as an acknowledgment or admission or any form of suggestion that that prior publication (or information derived from it) or known matter forms part of the common general knowledge in the field of endeavour to which this specification relates.
[0083] Throughout this specification and the claims which follow, unless the context requires otherwise, the word "comprise", and variations such as "comprises" and "comprising", will be understood to imply the inclusion of a stated integer or step or group of integers or steps but not the exclusion of any other integer or step or group of integers or steps.
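As a minimal illustration of the correction flow summarized in the embodiment above — a spectral offset for transforming between the low- and high-resolution sensors, a temporal-change offset between two acquisition times, and their sum applied to a low-resolution image to simulate missing high-resolution pixels — the following sketch may be considered. The function names are hypothetical, and single-band float arrays already co-registered and resampled to a common grid are assumed; the embodiment performs the positional alignment and spectral-mixture resampling separately.

```python
import numpy as np

def spectral_offset(high_res_degraded, low_res):
    # Per-pixel offset that maps the low-resolution sensor's spectrum
    # onto the high-resolution sensor's spectrum; computed from a
    # high-resolution image degraded to the low resolution and a
    # co-registered low-resolution image of the same area.
    return high_res_degraded - low_res

def temporal_offset(low_res_from, low_res_to):
    # Per-pixel offset that maps a low-resolution image at one time
    # onto the target time, computed from two low-resolution images
    # captured at different times.
    return low_res_to - low_res_from

def fill_missing(high_res, cloud_mask, low_res_other_time,
                 d_temporal, d_spectral):
    # Replace cloud-masked high-resolution pixels with values simulated
    # from a low-resolution image captured at another time, corrected
    # for temporal change and spectral difference (the
    # "B(t) + d_temporal + d_spectral" form of the abstract).
    simulated = low_res_other_time + d_temporal + d_spectral
    out = high_res.copy()
    out[cloud_mask] = simulated[cloud_mask]
    return out
```

The additive form is a simplification: the embodiment applies correction functions to the differences, whereas this sketch treats both corrections as plain per-pixel offsets.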

Claims (12)

  THE CLAIMS DEFINING THE INVENTION ARE AS FOLLOWS:
    1. An image generation system which generates an image including no temporal and spatial missing images, comprising: a processor that executes a program; and a memory that stores the program executed by the processor, wherein the processor: calculates spectral correction parameters for transformation between a first image and a second image using the first image and the second image of different resolutions having captured the same area on the ground; calculates temporal change correction parameters for correction of temporal changes using two second images captured at different times; and generates a missing image of the first image from the second image captured at a time different from the first image using the spectral correction parameters and the temporal change correction parameters.
  2. The image generation system according to Claim 1, wherein the first image is a high-resolution image; the second image is a low-resolution image with a resolution lower than a resolution of the first image; and the processor: calculates spectral correction parameters for transformation between the high-resolution image and the low-resolution image using the high-resolution image and the low-resolution image having captured the same area on the ground; calculates temporal change correction parameters for correction of temporal changes using two low-resolution images captured at different times; and generates a missing image of the high-resolution image from the low-resolution image captured at a time different from a time of the high-resolution image using the spectral correction parameters and the temporal change correction parameters.
  3. The image generation system according to Claim 1, wherein the processor: generates a second image at an optional time from the existing second image including a missing image, using the temporal change correction parameters; generates a first image at the optional time, which is an image including a missing image, from the second image at the optional time using the spectral correction parameters; generates an image of the missing area of the first image at the optional time from the second image captured at a time different from the optional time using the spectral correction parameters and the temporal change correction parameters; and generates the first image at the optional time without the missing image by compositing the generated first image at the optional time and the generated image of the missing area.
  4. The image generation system according to Claim 1, wherein the processor: performs positional alignment between the first image and the second image using a ground control point; generates an image with the same resolution as the second image from the first image using a spectral mixture model; and calculates the spectral correction parameters on the basis of a difference between a spectrum of the second image used for the positional alignment and a spectrum of the second image generated using the spectral mixture model.
  5. The image generation system according to Claim 1, wherein the processor: calculates temporal change parameters by comparing pixels of the second images captured at different times; estimates pixels of a missing area using the pixels adjacent to a cloud area in the second image; and generates a missing image of the first image from the second image incorporating the estimated pixels using the spectral correction parameters and the temporal change correction parameters.
  6. The image generation system according to Claim 5, wherein the processor: determines a predetermined area in the vicinity of a pixel to be estimated; determines a weighting factor on the basis of the reciprocal of the distance between each pixel included in the predetermined area and the pixel to be estimated; and estimates the pixels of the cloud area by adding the value obtained by multiplying the spectrum of each pixel included in the predetermined area by the determined weighting factor.
  7. An image generation method for generating an image including no temporal and spatial missing images using a computer, the computer having a processor that executes a program and a memory that stores the program executed by the processor, the method comprising the steps of: calculating, in the processor, spectral correction parameters for transformation between a first image and a second image using the first image and the second image of different resolutions having captured the same area on the ground; calculating, in the processor, temporal change correction parameters for correction of temporal changes using two second images captured at different times; and generating, in the processor, a missing image of the first image from the second image captured at a time different from the first image using the spectral correction parameters and the temporal change correction parameters.
  8. The image generation method according to Claim 7, wherein the first image is a high-resolution image; wherein the second image is a low-resolution image with a resolution lower than a resolution of the first image; wherein the spectral correction parameter calculation step comprises calculating, in the processor, spectral correction parameters for transformation between the high-resolution image and the low-resolution image using the high-resolution image and the low-resolution image having captured the same area on the ground; wherein the temporal change correction parameter calculation step comprises calculating, in the processor, temporal change correction parameters for correction of temporal changes using two low-resolution images captured at different times; and wherein the image generation step comprises generating, in the processor, a missing image of the high-resolution image from the low-resolution image captured at a time different from a time of the high-resolution image using the spectral correction parameters and the temporal change correction parameters.
  9. The image generation method according to Claim 7, further comprising the steps of: generating, in the processor, a second image at an optional time from the existing second image including a missing image, using the temporal change correction parameters; generating, in the processor, a first image at the optional time, which is an image including a missing image, from the second image at the optional time using the spectral correction parameters; generating, in the processor, an image of the missing area of the first image at the optional time from the second image captured at a time different from the optional time using the spectral correction parameters and the temporal change correction parameters; and generating, in the processor, the first image at the optional time without the missing image by compositing the generated first image at the optional time and the generated image of the missing area.
  10. The image generation method according to Claim 7, wherein the spectral correction parameter calculation step comprises: performing, in the processor, positional alignment between the first image and the second image using a ground control point; generating, in the processor, an image with the same resolution as the second image from the first image using a spectral mixture model; and calculating, in the processor, the spectral correction parameters on the basis of a difference between a spectrum of the second image used for the positional alignment and a spectrum of the second image generated using the spectral mixture model.
  11. The image generation method according to Claim 7, wherein the image generation step comprises: calculating, in the processor, temporal change parameters by comparing pixels of the second images captured at different times; estimating, in the processor, pixels of a missing area using the pixels adjacent to a cloud area in the second image; and generating, in the processor, a missing image of the first image from the second image incorporating the estimated pixels using the spectral correction parameters and the temporal change correction parameters.
  12. The image generation method according to Claim 11, further comprising the steps of: determining, in the processor, a predetermined area in the vicinity of a pixel to be estimated; determining, in the processor, a weighting factor on the basis of the reciprocal of the distance between each pixel included in the predetermined area and the pixel to be estimated; and estimating, in the processor, the pixels of the cloud area by adding the value obtained by multiplying the spectrum of each pixel included in the predetermined area by the determined weighting factor.
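The inverse-distance weighting recited in claims 6 and 12 — a window around the pixel to be estimated, weighting factors from the reciprocals of the distances, and a weighted sum of the neighboring spectra — might be sketched as follows. The window radius, the single-band float image, and the normalization of the weights (which the claims leave implicit) are assumptions of this illustration, not limitations of the claims.

```python
import numpy as np

def estimate_cloud_pixel(image, cloud_mask, row, col, radius=3):
    # Estimate one cloud-covered pixel from clear pixels inside a
    # (2*radius+1)-square window, weighting each neighbor by the
    # reciprocal of its distance to the target pixel.
    h, w = image.shape
    values, weights = [], []
    for r in range(max(0, row - radius), min(h, row + radius + 1)):
        for c in range(max(0, col - radius), min(w, col + radius + 1)):
            if cloud_mask[r, c] or (r == row and c == col):
                continue  # use only clear neighboring pixels
            dist = np.hypot(r - row, c - col)
            values.append(image[r, c])
            weights.append(1.0 / dist)
    if not weights:
        return np.nan  # no clear neighbor inside the window
    weights = np.asarray(weights)
    return float(np.dot(weights / weights.sum(), np.asarray(values)))
```

For a multispectral image the same loop would accumulate per-band vectors instead of scalars; the weighting factors are unchanged because they depend only on the pixel geometry.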
AU2015258202A 2014-12-16 2015-11-18 Image generation system and image generation method Active AU2015258202B2 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2014254246A JP6351496B2 (en) 2014-12-16 2014-12-16 Image generation system and image generation method
JP2014-254246 2014-12-16

Publications (2)

Publication Number Publication Date
AU2015258202A1 AU2015258202A1 (en) 2016-06-30
AU2015258202B2 true AU2015258202B2 (en) 2016-09-29

Family

ID=56140193

Family Applications (1)

Application Number Title Priority Date Filing Date
AU2015258202A Active AU2015258202B2 (en) 2014-12-16 2015-11-18 Image generation system and image generation method

Country Status (2)

Country Link
JP (1) JP6351496B2 (en)
AU (1) AU2015258202B2 (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6781014B2 (en) * 2016-11-09 2020-11-04 日本電信電話株式会社 Image generation method, image difference detection method, image generation device and image generation program
KR101985241B1 (en) * 2017-11-09 2019-06-03 한국항공우주연구원 Image Processing System and Method for Camera of Satellite Payload
JP6658795B2 (en) * 2018-05-11 2020-03-04 セイコーエプソン株式会社 Machine learning device, photographing time estimation device, machine learning program, and method for producing photograph data
US10823881B2 (en) * 2018-11-05 2020-11-03 Tianjin Kantian Technology Co., Ltd. Cloud forecast using sequential images
JP7316004B1 (en) * 2022-10-24 2023-07-27 国立研究開発法人農業・食品産業技術総合研究機構 Information processing device, information processing method, and program

Citations (3)

Publication number Priority date Publication date Assignee Title
US20110064280A1 (en) * 2009-03-18 2011-03-17 Pasco Corporation Method and apparatus for producing land-surface image data
US7957608B2 (en) * 2005-07-01 2011-06-07 Flir Systems, Inc. Image correction across multiple spectral regimes
US20140029844A1 (en) * 2010-05-20 2014-01-30 Digitalglobe, Inc. Advanced cloud cover assessment for panchromatic images

Family Cites Families (2)

Publication number Priority date Publication date Assignee Title
JP2000284064A (en) * 1999-03-31 2000-10-13 Mitsubishi Electric Corp Multi-satellite complementary observation system
JP4339289B2 (en) * 2005-07-28 2009-10-07 Necシステムテクノロジー株式会社 Change determination device, change determination method, and change determination program

Also Published As

Publication number Publication date
JP2016115190A (en) 2016-06-23
JP6351496B2 (en) 2018-07-04
AU2015258202A1 (en) 2016-06-30

Similar Documents

Publication Publication Date Title
AU2015258202B2 (en) Image generation system and image generation method
US11501443B2 (en) Generation of synthetic high-elevation digital images from temporal sequences of high-elevation digital images
US11715181B2 (en) System and method to fuse multiple sources of optical data to generate a high-resolution, frequent and cloud-/gap-free surface reflectance product
Zhu et al. An enhanced spatial and temporal adaptive reflectance fusion model for complex heterogeneous regions
US11710219B2 (en) Detection and replacement of transient obstructions from high elevation digital images
EP3621034A1 (en) Method and apparatus for calibrating relative parameters of collector, and storage medium
EP3605063B1 (en) Vegetation index calculation device and method, and computer program
JP5898004B2 (en) Power generation amount prediction device, power generation amount prediction method, program, and power control system
Kluger et al. Two shifts for crop mapping: Leveraging aggregate crop statistics to improve satellite-based maps in new regions
JP7042146B2 (en) Front-end part in satellite image change extraction system, satellite image change extraction method, and satellite image change extraction system
US20220215659A1 (en) Imputation of remote sensing time series for low-latency agricultural applications
Jing et al. Efficient point cloud corrections for mobile monitoring applications using road/rail-side infrastructure
Babapour et al. A novel post-calibration method for digital cameras using image linear features
JP6082162B2 (en) Image generation system and image generation method
Tyagi et al. Elevation Data Acquisition Accuracy Assessment for ESRI Drone2Map, Agisoft Metashape, and Pix4Dmapper UAV Photogrammetry Software
JP7388595B2 (en) Image expansion device, control method, and program
EP4276745A1 (en) Method and system for geo referencing stabilization
US11321821B2 (en) Method and system for generating composite geospatial images
CA3117084C (en) Generation of synthetic high-elevation digital images from temporal sequences of high-elevation digital images
HESSE Possibilities and challenges in the application of multi-temporal airborne lidar data sets for the monitoring of archaeological landscapes
McNeill et al. Assessment of digital elevation model accuracy using ALOS-PRISM stereo imagery
Erickson A summary of Michigan program for earth resource information systems

Legal Events

Date Code Title Description
FGA Letters patent sealed or granted (standard patent)