WO2017098313A1 - Method of processing satellite images for simulating night time - Google Patents

Method of processing satellite images for simulating night time

Info

Publication number
WO2017098313A1
WO2017098313A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
path
rgb
paths
channel
Prior art date
Application number
PCT/IB2015/059718
Other languages
French (fr)
Inventor
Jarosław WOŹNIAK
Anna FRYŚKOWSKA
Michał KĘDZIERSKI
Wojciech KRZYWDA
Original Assignee
Vfrpoland Sp. Z O.O.
Priority date
Filing date
Publication date
Application filed by Vfrpoland Sp. Z O.O. filed Critical Vfrpoland Sp. Z O.O.
Publication of WO2017098313A1 publication Critical patent/WO2017098313A1/en

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T11/00 - 2D [Two Dimensional] image generation
    • G06T11/001 - Texturing; Colouring; Generation of texture or colour


Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)
  • Color Image Communication Systems (AREA)

Abstract

The essence of the invention is a method of producing the photo scenery simulating night time based on utilization of a homogeneous, 8-bit satellite image saved as RGB, characterized in that digital processing of the RGB image (original image Z(j,k)) is performed in parallel in two paths, wherein path A consists in transformation of color arrangement from RGB to IHS (Intensity Hue Saturation), to give the image in IHS encoding in the form of I1, H2 and S3, and path B consists in that the original image Z(j,k) in the form of RGB is at first subjected to a contextual operation - convolution of the original image with the Prewitt mask of gradient filter - and then is subjected to morphological filtration through erosion operations, to obtain the ZPME image, wherein in the next step, images obtained from both filtrations within path B are summed up as follows: ZS = ZPME + ZP, and the resulting image ZS has components R', G' and B', whereupon the last step is a selective combination of results from the two paths A and B consisting in performance of image composition through encoding of components with individual channels of the images obtained from both paths in such a way that the first and second channels are constituted by the Intensity channel from the IHS image (path A), and the third channel is encoded with the red channel R' from the ZS image (path B).

Description

METHOD OF PROCESSING SATELLITE IMAGES FOR SIMULATING NIGHT TIME
The object of the invention is a method of producing photo sceneries simulating night time. The method allows - based on homogeneous satellite RGB images obtained in summer time - generation of new images that simulate the night. By using color combinations and sequentially performing operations of digital image processing, such as PCA, IHS transformations or filtration of images, it is possible to reproducibly obtain images varied in terms of the simulated time of their acquisition. For the night time, an original image-processing algorithm has been developed.
The flights performed (real and simulated) are essentially divided into two groups: those performed according to IFR (Instrumental Flight Rules - based on indications of flight instruments) and those performed according to VFR (Visual Flight Rules - with visibility of the ground and with the use of visual navigation methods consisting in identification and comparison of characteristic places and objects on the ground with the information described on the map). The vast majority of flights are performed according to VFR.
An essential element of any VFR flight simulation is the environment in which the flight is performed. The factor responsible for imaging the view of land in a flight simulator is scenery. The standard scenery generated by the flight simulator platform, based on algorithms stored therein, constitutes a considerably simplified image of reality, lacking its individual and characteristic features. An alternative to the standard scenery is a photographic scenery, based on satellite or aerial images of the land surface. The photographic scenery is a faithful copy of the view of land, mapped in a flight simulator, i.e. comprising all the individual and characteristic features of the land surface. Due to this feature, photographic sceneries are desirable among users of flight simulators around the world.
The problem of carrying out changes in image radiometry which simulate changes in the time of day, in multispectral single-season images, in a planned, automated and fully professional way, is an issue which is currently practically not raised at all by the authors of software dedicated to photogrammetry and satellite remote sensing. Available systems allowing changes in the time of day do not take into account, in the radiometry of images, many factors influencing these changes. Commercial software used in digital processing of images does not allow a fully automatic or semi-automatic simulation of "changes in time of day". This is mainly due to the fact that such processing of satellite imaging is a complex and problematic process which requires taking into account practically all the factors influencing the image radiometry, among others the radiometric characteristics of the sensor and of objects located on the imaged land surface, atmospheric conditions, and land topography. Development and implementation of a technology enabling a planned change of image radiometry in order to simulate night time based on multispectral satellite images is a technologically innovative issue, and a solution to this problem is needed and could certainly be used in military and civilian flight simulators.
To perform imaging which simulates night time according to the invention, methods of digital processing of images have been used, in particular: transformation of the color arrangement to IHS, filtration, and algebra of images.
With the use of the said methods of digital image processing, an algorithm which allows generation of images simulating night time has been developed, as indicated in Fig. 1.
The essence of the invention is a method of generating the photo scenery simulating night time based on utilization of a homogeneous, 8-bit satellite image saved as RGB, characterized in that digital processing of the RGB image (original image Z(j,k)) is performed in parallel in two paths, wherein path A consists in transformation of color arrangement from RGB to IHS (Intensity Hue Saturation), to give the image in IHS encoding in the form of I1, H2 and S3, and path B consists in that the original image Z(j,k) in the form of RGB is at first subjected to a contextual operation - convolution of the original image with the Prewitt mask of gradient filter - and then is subjected to morphological filtration through erosion operations, to obtain the ZPME image, wherein in the next step, images obtained from both filtrations within path B are summed up as follows: ZS = ZPME + ZP, and the resulting image ZS has components R', G' and B', whereupon the last step is a selective combination of results from the two paths A and B consisting in performance of image composition through encoding of components with individual channels of the images obtained from both paths in such a way that the first and second channels are constituted by the Intensity channel from the IHS image (path A), and the third channel is encoded with the red channel R' from the ZS image (path B). The method of producing the photo scenery simulating night time is based on utilization of a homogeneous, 8-bit satellite image saved as RGB. Then, the image is digitally processed along two parallel paths.
Transformation of color arrangement is performed, from RGB to IHS (Intensity Hue Saturation), to give the image in IHS encoding in the form of I1, H2 and S3.
In parallel, the original image Z(j,k) in the form of RGB is subjected to a contextual operation - in this case, filtration in the form of convolution of the original image with the Prewitt edge filtering mask. After summing up the primary image Z(j,k) and the image after filtration, the resulting image ZP is subjected to morphological filtration - erosion - to give the ZPME image. All the obtained images are summed up as follows: ZS = ZPME + ZP. The resulting image ZS will have components R', G' and B'.
Then, a composition encoded with individual IHS channels is performed as follows: the first and second channels constitute the Intensity channel from the IHS image, and the third channel is encoded with the red channel R' from image ZS.
Description of one method of transformation of the color arrangement RGB-IHS (Intensity-Hue-Saturation) is shown below.
Transformation Intensity-Hue-Saturation (RGB - IHS) is a widely used technique for integrating images. In this method, hue transformation from the RGB to the IHS system is used, and then a high-resolution image is substituted in the place of the color intensity component with inverse transformation. The process is carried out under the assumption that this component represents the diversity of the resultant spectral brightness of the imaged scene and is highly correlated with the level of luminance in the channel with a higher resolution [Mróz M., Podwyższenie rozdzielczości przestrzennej obrazów wielospektralnych Landsat 7 ETM+ przy wykorzystaniu właściwości tekstualnych i radiometrycznych kanału panchromatycznego. Archiwum Fotogrametrii, Kartografii i Teledetekcji, Vol. 11, Kraków, 2001].
The transition from the RGB to the IHS model is associated with the execution of the following algorithm:

    [ I  ]   [  1/3    1/3    1/3 ] [ R ]
    [ v1 ] = [ -1/√6  -1/√6  2/√6 ] [ G ]
    [ v2 ]   [  1/√2  -1/√2    0  ] [ B ]

and the formula for the inverse transformation:

    [ R ]   [ 1  -1/√6   1/√2 ] [ I  ]
    [ G ] = [ 1  -1/√6  -1/√2 ] [ v1 ]
    [ B ]   [ 1   2/√6    0   ] [ v2 ]

where:
I - pixel intensity value,
v1 - value of the first variable,
v2 - value of the second variable.
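The linear variant of the transform above (one common formulation in the remote-sensing literature; the function names and the choice of this particular variant are assumptions, since the publication gives the formulas only as figures) can be sketched in Python:

```python
import numpy as np

def rgb_to_ihs(rgb):
    """Forward linear IHS transform. `rgb` is a float array (..., 3).
    Returns intensity I, hue H (an angle) and saturation S (a radius)."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    i = (r + g + b) / 3.0                    # pixel intensity value
    v1 = (-r - g + 2.0 * b) / np.sqrt(6.0)   # first intermediate variable
    v2 = (r - g) / np.sqrt(2.0)              # second intermediate variable
    return i, np.arctan2(v2, v1), np.hypot(v1, v2)

def ihs_to_rgb(i, h, s):
    """Inverse transform: recover v1, v2 from (H, S), then invert the
    linear system back to RGB."""
    v1, v2 = s * np.cos(h), s * np.sin(h)
    r = i - v1 / np.sqrt(6.0) + v2 / np.sqrt(2.0)
    g = i - v1 / np.sqrt(6.0) - v2 / np.sqrt(2.0)
    b = i + 2.0 * v1 / np.sqrt(6.0)
    return np.stack([r, g, b], axis=-1)
```

Round-tripping an image through the two functions reproduces the original, which is what fusion schemes rely on when they substitute a new intensity component before the inverse transformation.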
As a further processing, filtration of the image with the use of the Prewitt mask is used. This filtration allows detection of edges of the objects mapped in the image.
Filtration of the image can be represented in the easiest way as a convolution of the image function with the function of a given filter. In fact, these operations are described using a filter mask (convolution kernel). A new pixel value is calculated in this manner, based on the values of adjacent pixels. Typically, filter masks take the form of a matrix of size 3x3:

        [ K(-1,-1)  K(-1,0)  K(-1,1) ]
    K = [ K( 0,-1)  K( 0,0)  K( 0,1) ]
        [ K( 1,-1)  K( 1,0)  K( 1,1) ]

Additionally, if the processed pixel (x,y) of the image together with its surroundings is recorded as follows:

    [ L(x-1,y-1)  L(x,y-1)  L(x+1,y-1) ]
    [ L(x-1,y)    L(x,y)    L(x+1,y)   ]
    [ L(x-1,y+1)  L(x,y+1)  L(x+1,y+1) ]

the pixel value after filtration is obtained according to the following formula:

    L'(x,y) = (1/r) · Σ(m=-1..1) Σ(n=-1..1) K(m,n) · L(x+m, y+n)
where:
L'(x,y) - value of pixel luminance after filtration;
r - normalization coefficient, used due to the fact that the sum of the products of the luminance values and the filter mask weights cannot be lower than zero;
K(m,n) - weights of pixels surrounding the pixel (x,y);
L(m,n) - discrete function of brightness of the source image.¹
The above formula proves contextuality of the operations performed, i.e. if one wants to determine the value of a single pixel, the values of adjacent pixels should be known. As a result, in principle, it is not possible to calculate the values for edge pixels.
Despite many limitations, contextual filtrations are the most commonly used algorithms that process images.
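As a sketch of the contextual filtration just described (a hypothetical direct-loop helper; leaving the one-pixel border unchanged is only one possible policy for the edge pixels whose neighbourhood is incomplete):

```python
import numpy as np

def filter_image(L, K, r=1.0):
    """Apply a 3x3 filter mask K to image L:
    L'(x, y) = (1/r) * sum over m, n of K(m, n) * L(x+m, y+n).
    Border pixels are left unchanged, since their full neighbourhood
    is not available."""
    out = L.astype(float).copy()
    H, W = L.shape
    for y in range(1, H - 1):
        for x in range(1, W - 1):
            acc = 0.0
            for m in (-1, 0, 1):        # row offset
                for n in (-1, 0, 1):    # column offset
                    acc += K[m + 1, n + 1] * L[y + m, x + n]
            out[y, x] = acc / r
    return out
```

For an averaging mask of ones with r = 9, a constant image passes through unchanged, which is a quick sanity check of the normalization coefficient.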
Considering the filtration in the context of the invention, it can be seen that there is a very wide range of methods that attempt to extract objects based on edge detection. They can be divided based on several criteria. In order to identify edges, most of the algorithms use, respectively:
• first order approximation of image intensity gradient;
• Laplace approximation (2nd order);
• or morphological operations.
In turn, the most frequently used methods for edge detection are:
1. gradient methods - i.e. Prewitt, Sobel, Roberts filters;
2. methods of second order: LoG;
3. and the Canny algorithm.

¹ Karbowski K., Podstawy rekonstrukcji elementów maszyn i innych obiektów w procesach wytwarzania, Wydawnictwo Politechniki Krakowskiej, Kraków 2008.
Wherein the PREWITT MASK, known as operator or kernel, functions as follows:
Prewitt operator
Prewitt filter is another representative from the gradient group. It allows differentiation of the signal in eight different directions. For edge detection, it uses the first derivative. The prototype of the Prewitt mask is a tripled Roberts mask. Below, only two examples of the Prewitt mask are presented as the other can be constructed through their rotations:
0°:                  90°:
[ -1   0   1 ]       [ -1  -1  -1 ]
[ -1   0   1 ]       [  0   0   0 ]
[ -1   0   1 ]       [  1   1   1 ]
Considering the mask (the pixel (x,y) with its eight neighbours, numbered clockwise from the top-left) in the form of:

    [ a0    a1    a2 ]
    [ a7  (x,y)   a3 ]
    [ a6    a5    a4 ]

The Prewitt operator can be defined as:

    G(x,y) = √( G_R(x,y)² + G_C(x,y)² )

Wherein:

    G_R = (1/(K+2)) · [ (a2 + K·a3 + a4) - (a0 + K·a7 + a6) ]
    G_C = (1/(K+2)) · [ (a0 + K·a1 + a2) - (a6 + K·a5 + a4) ]

Where K = 1. In the above formulation, gradients of rows and columns are normalized and provide information about the acquired positive and negative weighted averages of isolated edge positions.
This gradient for extracting edges has a definitely directional nature. Most often, it is used to detect edges in images which have large high-frequency noise. It is a combination of a gradient filter with averaging, rectangular and Gaussian filters. The sum of weights in the mask is always equal to 0 so that in the areas having a fixed value of the function, the operator generates a value of 0.
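A minimal sketch of edge extraction with the two base Prewitt masks, combined as a gradient magnitude (the function name is hypothetical, and setting the one-pixel border of the output to zero is an added assumption):

```python
import numpy as np

# Horizontal and vertical base Prewitt masks; the remaining directional
# masks can be obtained by rotating these two.
PREWITT_X = np.array([[-1, 0, 1],
                      [-1, 0, 1],
                      [-1, 0, 1]], dtype=float)
PREWITT_Y = np.array([[-1, -1, -1],
                      [ 0,  0,  0],
                      [ 1,  1,  1]], dtype=float)

def prewitt_edges(img):
    """Gradient magnitude sqrt(Gx^2 + Gy^2) over the interior pixels
    of a 2-D grey-scale image; the border is set to 0."""
    img = img.astype(float)
    H, W = img.shape
    g = np.zeros_like(img)
    for y in range(1, H - 1):
        for x in range(1, W - 1):
            patch = img[y - 1:y + 2, x - 1:x + 2]
            gx = np.sum(PREWITT_X * patch)   # horizontal derivative
            gy = np.sum(PREWITT_Y * patch)   # vertical derivative
            g[y, x] = np.hypot(gx, gy)
    return g
```

Because the mask weights sum to 0, flat regions produce a response of exactly 0, while a step edge produces a strong response on both sides of the discontinuity.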
Implementation of the morphological transformation - erosion - consists in:
♦ applying the central point sequentially to all the image points;
♦ checking whether the local configuration of point corresponds to the one stored in the mask;
♦ performing, in case of compliance of point configuration, an operation determined for a given transformation. Usually, it is just a change of color or hue of a given point.
Morphological transformations are usually iterative transformations, i.e. consisting in a multiple repetition of a certain elementary sequence of operations with respect to the image obtained as a result of previous operation.
Simple morphological filters, such as erosion and dilation, smooth the edges of figures. Their disadvantage is that they change the surface area of the figure: it is decreased by erosion and increased by dilation.
Implementation of erosion:
The image is the first argument, and the structural element is the second one. The principle of operation is based on applying, to each pixel of the image, a structural element (SE) at its central point. If at least one adjacent pixel covered by the SE has a value of "0", the current pixel assumes a value of "0" (background).
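The erosion rule just described - output "0" whenever any pixel covered by the structural element is "0" - can be sketched as follows (a hypothetical helper; treating pixels outside the image as background is an added assumption):

```python
import numpy as np

def erode(img, se):
    """Binary erosion of `img` (0/1 array) by structural element `se`.
    The output pixel is 1 only if every pixel covered by `se`, centred
    on the current pixel, is 1; out-of-image pixels count as 0."""
    H, W = img.shape
    h, w = se.shape
    cy, cx = h // 2, w // 2          # central point of the SE
    out = np.zeros_like(img)
    for y in range(H):
        for x in range(W):
            keep = True
            for m in range(h):
                for n in range(w):
                    if se[m, n]:
                        yy, xx = y + m - cy, x + n - cx
                        if not (0 <= yy < H and 0 <= xx < W) or img[yy, xx] == 0:
                            keep = False
            out[y, x] = 1 if keep else 0
    return out
```

Eroding a 3x3 block of ones with a 3x3 structural element leaves only the block's central pixel, which illustrates how erosion shrinks the surface area of a figure.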
An exemplary diagram is shown in Fig. 9. In Fig. 10, a flow diagram of erosion in the image composed of pixels is shown.
The processing according to the above algorithm yields an image simulating one obtained at night. Objects such as fields, unlit roads and buildings are shown without the color qualities typical for them - that is, just as these objects are visible at night - while lit roads and objects that emit light, e.g. buildings, are shown in a manner imitating the generated illumination against the background of other unlit objects. Stagnant and flowing watercourses are shown as dark areas tending towards black, giving a sense of depth. The whole image is shown in the convention of an image obtained at night, under a clear sky, where the light from elements producing it radiates onto unlit surroundings. Resulting scenes can be loaded into the flight simulator and enriched with effects such as falling rain or illumination glow. When loaded into the flight simulator, the predominance of blue color is reduced.
The invention is further presented in an embodiment and in the drawing, in which Fig. 1 shows a flowchart of the algorithm for the simulation of night time, Fig. 2 shows an original RGB image, Fig. 3 shows an image after IHS transformation, Fig. 4 shows an image after using a gradient filter with Prewitt mask, Fig. 5 shows an image after using morphological filtration - erosion, Fig. 6 shows an image after summation of the resulting images, thereby providing a night visualization, Fig. 7 and Fig. 8 show a view of implemented images with the addition of special effects in the flight simulator, Fig. 9 shows an exemplary scheme, and Fig. 10 shows a flow diagram of erosion in the image composed of pixels.
The method of producing the photo scenery simulating night time, based on a homogeneous RGB image, is based on utilization of a homogeneous, 8-bit satellite image saved as RGB, as shown in Fig. 2. In the first step, digital processing of the RGB imaging (original image Z(j,k)) is performed in parallel in two paths. Path A consists in transformation of color arrangement from RGB to IHS (Intensity Hue Saturation), to give the image in IHS encoding in the form of I1, H2 and S3, as shown in Fig. 3. The aim of this transformation is to extract the intensity component from the image. On the other hand, path B consists in that the original image Z(j,k) in the form of RGB is at first subjected to a contextual operation - convolution of the original image with the Prewitt mask of gradient filter, as shown in Fig. 4. Then, the so-obtained image is subjected to morphological filtration through erosion operations, to obtain the ZPME image (Fig. 5). In the next step, images obtained from both filtrations within path B are summed up as follows: ZS = ZPME + ZP, and the resulting image ZS has components R', G' and B'. The aim of these operations is to extract objects which form the basis for indicating objects that do and do not emit lighting of the area at night time, and to initially generalize the resulting content of the image.
The last step is a selective combination of results from the two paths A and B. An image composition through encoding of components with individual channels of the images obtained from both paths is performed as follows: the first and second channels are constituted by the Intensity channel from the IHS image (path A), and the third channel is encoded with the red channel R' from the ZS image (path B) (Fig. 6). This operation is the key step for obtaining the simulation of the night time photoscenery, because an image simulating one obtained at night is obtained. Objects such as fields, unlit roads and buildings are shown without the color qualities typical for them - that is, just as these objects are visible at night - while lit roads and objects that emit light, e.g. buildings, are shown in a manner imitating the generated illumination against the background of other unlit objects. Stagnant and flowing watercourses are shown as dark areas tending towards black, giving a sense of depth. The whole image is shown in the convention of an image obtained at night, under a clear sky, where the light from elements producing it radiates onto unlit surroundings. Resulting scenes can be loaded into the flight simulator and enriched with effects such as falling rain or illumination glow. When loaded into the flight simulator, the predominance of blue color is reduced.
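The whole two-path method can be sketched as follows. This is a hypothetical NumPy/SciPy rendering, not the patented implementation: the mean-of-RGB intensity for path A, the single horizontal Prewitt mask, 3x3 grey-scale erosion, and clipping to [0, 1] are all simplifying assumptions not fixed by the description.

```python
import numpy as np
from scipy import ndimage

PREWITT_X = np.array([[-1, 0, 1], [-1, 0, 1], [-1, 0, 1]], dtype=float)

def night_scenery(z):
    """Two-path night-time simulation sketch for a float RGB image `z`
    of shape (H, W, 3) with values in [0, 1]."""
    # Path A: intensity component of the (linear) IHS transform.
    intensity = z.mean(axis=2)
    # Path B: Prewitt gradient filtration, summed with the original (Z_P),
    # then morphological erosion (Z_PME), then Z_S = Z_PME + Z_P.
    filtered = np.stack(
        [ndimage.convolve(z[..., c], PREWITT_X, mode="nearest") for c in range(3)],
        axis=-1)
    z_p = np.clip(z + filtered, 0.0, 1.0)
    z_pme = np.stack(
        [ndimage.grey_erosion(z_p[..., c], size=(3, 3)) for c in range(3)],
        axis=-1)
    z_s = np.clip(z_pme + z_p, 0.0, 1.0)
    # Selective combination: channels 1 and 2 carry the path-A intensity,
    # channel 3 carries the red component R' of Z_S from path B.
    out = np.empty_like(z)
    out[..., 0] = intensity
    out[..., 1] = intensity
    out[..., 2] = z_s[..., 0]
    return out
```

The composition step is the literal channel assignment from the claim: two channels take the Intensity from path A and the remaining channel takes R' from ZS.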

Claims

Claims
1. A method of producing the photo scenery simulating night time based on utilization of a homogeneous, 8-bit satellite image saved as RGB, characterized in that digital processing of the RGB image (original image Z(j,k)) is performed in parallel in two paths, wherein path A consists in transformation of color arrangement from RGB to IHS (Intensity Hue Saturation), to give the image in IHS encoding in the form of I1, H2 and S3, and path B consists in that the original image Z(j,k) in the form of RGB is at first subjected to a contextual operation - convolution of the original image with the Prewitt mask of gradient filter - and then is subjected to morphological filtration through erosion operations, to obtain the ZPME image, wherein in the next step, images obtained from both filtrations within path B are summed up as follows: ZS = ZPME + ZP, and the resulting image ZS has components R', G' and B', whereupon the last step is a selective combination of results from the two paths A and B consisting in performance of image composition through encoding of components with individual channels of the images obtained from both paths in such a way that the first and second channels are constituted by the Intensity channel from the IHS image (path A), and the third channel is encoded with the red channel R' from the ZS image (path B).
PCT/IB2015/059718 2015-12-09 2015-12-17 Method of processing satellite images for simulating night time WO2017098313A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
PLP.415233 2015-12-09
PL415233A PL415233A1 (en) 2015-12-09 2015-12-09 Method for creating aviation photosetting that simulates the night-time, and is based on the use of a uniform 8-bit satellite imaging registered in the form of RGB

Publications (1)

Publication Number Publication Date
WO2017098313A1 true WO2017098313A1 (en) 2017-06-15

Family

ID=59013754

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2015/059718 WO2017098313A1 (en) 2015-12-09 2015-12-17 Method of processing satellite images for simulating night time

Country Status (2)

Country Link
PL (1) PL415233A1 (en)
WO (1) WO2017098313A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110853455A (en) * 2019-09-06 2020-02-28 中国电子科技集团公司第二十八研究所 Information simulator system for C4ISR system


Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090046095A1 (en) * 2007-08-16 2009-02-19 Southwest Research Institute Image Analogy Filters For Terrain Modeling

Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
MRÓZ M.: "Podwyższenie rozdzielczości przestrzennej obrazów wielospektralnych Landsat 7 ETM+ przy wykorzystaniu właściwości tekstualnych i radiometrycznych kanału panchromatycznego", ARCHIWUM FOTOGRAMETRII, KARTOGRAFII I TELEDETEKCJI, vol. 11, 2001
POHL C ET AL: "MULTISENSOR IMAGE FUSION IN REMOTE SENSING: CONCEPTS, METHODS AND APPLICATIONS", INTERNATIONAL JOURNAL OF REMOTE SENSING, BASINGSTOKE, HANTS, GB, vol. 19, no. 5, 20 March 1998 (1998-03-20), pages 823 - 854, XP008044449, ISSN: 0143-1161, DOI: 10.1080/014311698215748 *
QIAN DU ET AL: "Color Display for Hyperspectral Imagery", IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING, IEEE SERVICE CENTER, PISCATAWAY, NJ, US, vol. 46, no. 6, 1 June 2008 (2008-06-01), pages 1858 - 1866, XP011214897, ISSN: 0196-2892 *
TOET A ET AL: "NEW FALSE COLOR MAPPING FOR IMAGE FUSION", OPTICAL ENGINEERING, SOC. OF PHOTO-OPTICAL INSTRUMENTATION ENGINEERS, BELLINGHAM, vol. 35, no. 3, 1 March 1996 (1996-03-01), pages 650 - 658, XP000597453, ISSN: 0091-3286, DOI: 10.1117/1.600657 *
WILLIAM B THOMPSON ET AL: "A Spatial Post-Processing Algorithm for Images of Night Scenes", vol. 7, no. 1, 1 November 2002 (2002-11-01), 11 pp., XP007912664, ISSN: 1086-7651, Retrieved from the Internet <URL:http://www.cis.rit.edu/jaf/publications/jgt02_paper.pdf> [retrieved on 20100415] *

Also Published As

Publication number Publication date
PL415233A1 (en) 2017-06-19

Similar Documents

Publication Publication Date Title
Berman et al. Non-local image dehazing
US7095420B2 (en) System and related methods for synthesizing color imagery
JP4858793B2 (en) Tree number calculation method and tree number calculation device
JP2011513860A (en) How to visualize point cloud data
CN103258334B (en) The scene light source colour method of estimation of coloured image
Lee et al. Cloud removal of satellite images using convolutional neural network with reliable cloudy image synthesis model
Ruban et al. Method for determining elements of urban infrastructure objects based on the results from air monitoring
CN103927759A (en) Automatic cloud detection method of aerial images
CN115082328A (en) Method and apparatus for image correction
CN108012135B (en) Image processing method and device, computer readable storage medium and computer equipment
Kumar et al. Digital image processing of remotely sensed satellite images for information extraction
CN113506275B (en) Urban image processing method based on panorama
CN117726550A (en) Multi-scale gating attention remote sensing image defogging method and system
WO2017098313A1 (en) Method of processing satellite images for simulating night time
CA2660339C (en) Geospatial modeling system for performing filtering operations based upon a sum of differences of a given and neighboring location points and related methods
CA2660343C (en) Geospatial modeling system for separating foliage data from building data based upon loose and strict tolerance noise filtering operations and related methods
Harayama et al. Multi-source object-oriented classification of landcover using very high resolution imagery and digital elevation model.
KR20220127715A (en) Method and apparatus for correcting image
Moeller et al. Urban change extraction from high resolution satellite image
US11270521B2 (en) Creation of a simulation scene from a specified view point
Herrera-Arellano et al. Color outdoor image enhancement by V-NIR fusion and weighted luminance
Ghodeswar et al. Camouflage pattern generation using LAB color model
JP6672559B2 (en) Meteorological data charting system
JP2007183710A (en) Color saturation correction system for green space
Easton et al. Rediscovering text in the Yale Martellus map

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 15826055; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 15826055; Country of ref document: EP; Kind code of ref document: A1)