KR20170088202A - Method for evaluation of fusion feasibility on multi-sensor satellite images and Apparatus Thereof - Google Patents

Method for evaluation of fusion feasibility on multi-sensor satellite images and Apparatus Thereof

Info

Publication number
KR20170088202A
Authority
KR
South Korea
Prior art keywords
image
distortion
pushbroom
radar
satellite
Prior art date
Application number
KR1020160008275A
Other languages
Korean (ko)
Other versions
KR101770745B1 (en)
Inventor
정형섭
이승찬
박숭환
Original Assignee
서울시립대학교 산학협력단 (University of Seoul Industry-Academic Cooperation Foundation)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 서울시립대학교 산학협력단
Priority to KR1020160008275A priority Critical patent/KR101770745B1/en
Publication of KR20170088202A publication Critical patent/KR20170088202A/en
Application granted granted Critical
Publication of KR101770745B1 publication Critical patent/KR101770745B1/en

Classifications

    • H04N5/2258
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/86Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
    • G01S13/867Combination of radar systems with cameras
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88Radar or analogous systems specially adapted for specific applications
    • G01S13/89Radar or analogous systems specially adapted for specific applications for mapping or imaging
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/20Adaptations for transmission via a GHz frequency band, e.g. via satellite
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10032Satellite or aerial image; Remote sensing

Landscapes

  • Engineering & Computer Science (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Astronomy & Astrophysics (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Electromagnetism (AREA)
  • Image Processing (AREA)
  • Radar Systems Or Details Thereof (AREA)

Abstract

The present invention relates to a heterogeneous satellite image fusion feasibility evaluation method and an apparatus thereof, and more particularly, to a technique for determining an optimal pair of heterogeneous satellite images for fusion by calculating, through an image fusion feasibility index, the amount of distortion that can occur when the images are fused. The heterogeneous satellite image fusion feasibility evaluation apparatus according to an embodiment of the present invention comprises: a satellite image acquisition unit; an imaging information acquisition unit; a distortion amount estimation unit; and a fusion feasibility determination unit.

Description

TECHNICAL FIELD [0001] The present invention relates to a method and an apparatus for evaluating the fusion feasibility of satellite images acquired by different sensors.

The present invention relates to a method and an apparatus for evaluating the fusion feasibility of heterogeneous satellite images captured by different sensors, and more particularly, to a method and an apparatus that determine, by means of an image fusion feasibility index, whether such images can be fused.

In a satellite imaging system, a multi-sensor image means a set of images captured by different sensors, and heterogeneous satellite image fusion is a technology that mathematically merges the information obtained from each single sensor, thereby overcoming the limitations of the information available from any one sensor. Heterogeneous satellite imagery has been actively produced by Earth observation satellites with various sensor characteristics and has been used in fields such as defense, environment, agriculture, and the ocean.

In general, optical satellites simultaneously provide a high-resolution panchromatic (full-color) image and a low-resolution multispectral image. The panchromatic image provides high spatial resolution and therefore makes objects easy to interpret, while the multispectral image has high spectral resolution and supports applications such as land-cover classification through analysis of the spectral characteristics of individual objects. An infrared image records brightness information from the radiant energy emitted by the surface, so it is used for military purposes and is effective for nighttime target detection. A radar image can be acquired regardless of day or night, which makes it effective for monitoring natural disasters.

For ideal utilization, it is effective to provide higher-dimensional information by fusing satellite images taken by different sensors. However, because heterogeneous satellite images are acquired at different times and with different geometries, it is difficult to produce a fused image free of terrain distortion. In particular, since the imaging methods of optical and infrared images and of radar images are completely different, early heterogeneous satellite image fusion techniques aimed to develop fusion technology on the assumption that the images were acquired under the same conditions and by the same method. Related prior art includes Korean Patent No. 10-1194405, "Method for detecting a target using plural heterogeneous sensors," and Korean Patent No. 10-1035055, "Object tracking system and method using heterogeneous cameras."

However, the prior art described above fuses images captured by heterogeneous sensors or heterogeneous cameras rather than images captured by satellites. Images captured by satellites are acquired under different temporal and geometric conditions, so terrain distortion can occur depending on the satellite sensor and the terrain; in mountainous terrain in particular, the positional error between two images can be large depending on each imaging geometry.

Korean Registered Patent No. 10-1194405 (Registered on October 18, 2012) Korean Registered Patent No. 10-1035055 (Registered on May 30, 2011)

It is an object of the present invention to determine which heterogeneous satellite images can be fused by developing a technique for evaluating the fusion feasibility of heterogeneous satellite images captured by different sensors.

To achieve these and other advantages and in accordance with the purpose of the present invention, as embodied and broadly described herein, there are provided a method and an apparatus for evaluating heterogeneous satellite image fusion feasibility, including a satellite image acquiring unit, an imaging information acquiring unit, a distortion amount estimating unit, and a fusion feasibility determining unit.

The satellite image acquisition unit acquires an optical image, an infrared image, and a radar image. The imaging information obtaining unit obtains incidence angle information, spatial resolution information, and track angle information at the time of imaging from the header file of each image. The distortion amount estimating unit calculates the amount of distortion in the observation direction and the amount of distortion in the flight direction using the imaging information acquired by the imaging information obtaining unit, and the fusion feasibility determining unit determines fusion feasibility from these two distortion amounts.

The present invention has the effect of determining heterogeneous satellite images that can be fused.

The present invention has the effect of determining which heterogeneous satellite images can ultimately be fused by calculating the amount of distortion in the observation direction and the amount of distortion in the flight direction.

The present invention has the effect of determining fusible heterogeneous satellite images before the user performs heterogeneous satellite image fusion, through the calculation of these two distortion amounts.

The present invention has the effect of determining, from a plurality of satellite images, a heterogeneous satellite image pair that can be fused through fusion feasibility analysis.

The underlying technology of the present invention can be utilized in domestically developed satellite image processing software and as a core technology for fusing heterogeneous satellite images. The present invention therefore has the effect of increasing the utilization of satellite images and expanding the national satellite industry through commercialization of this technology.

The present invention is applicable to the fusion of the representative domestic optical satellites, multipurpose practical satellites Nos. 2 and 3, and the infrared satellite No. 3A with the multipurpose practical satellite No. 5, which is a radar satellite, so that the utilization of these satellites can be maximized.

BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 is a schematic view showing the configuration of a heterogeneous satellite image fusion feasibility evaluation apparatus according to an embodiment of the present invention.
FIG. 2 is a flowchart illustrating a heterogeneous satellite image fusion feasibility evaluation method according to an embodiment of the present invention.
FIG. 3 is an image showing an example of the imaging geometry of a pushbroom sensor and a radar sensor according to an embodiment of the present invention.
FIG. 4 is an image showing an example of the observation direction distortion amount and the flight direction distortion amount according to the heterogeneous satellite image fusion feasibility evaluation method according to an embodiment of the present invention.
FIG. 5 is an image showing an example of satellite images with different imaging geometry conditions used in the heterogeneous satellite image fusion feasibility evaluation method according to an embodiment of the present invention.
FIG. 6 is an image showing fusion results obtained through the heterogeneous satellite image fusion feasibility evaluation method according to an embodiment of the present invention.

The terms and words used in this specification and the claims should not be construed as limited to their ordinary or dictionary meanings; on the principle that an inventor may appropriately define terms to best describe the invention, they should be construed as having meanings and concepts consistent with the technical idea of the present invention.

Therefore, the embodiments described in this specification and the configurations shown in the drawings are merely the most preferred embodiments of the present invention and do not represent the entire technical idea of the invention; it should be understood that various equivalents and modifications that could replace them are possible.

The terminology used in this application is used only to describe specific embodiments and is not intended to limit the invention. Singular expressions include plural expressions unless the context clearly dictates otherwise. In this application, terms such as "comprises" or "having" are intended to specify the presence of stated features, numbers, steps, operations, elements, parts, or combinations thereof, but do not preclude the presence or addition of one or more other features, numbers, steps, operations, elements, parts, or combinations thereof.

Hereinafter, preferred embodiments of the present invention will be described in detail with reference to the accompanying drawings. In the following description of the present invention, a detailed description of known functions and configurations incorporated herein will be omitted when it may make the subject matter of the present invention rather unclear. In the following description of the embodiments of the present invention, specific values are only examples.

FIG. 1 is a schematic view showing the configuration of a heterogeneous satellite image fusion feasibility evaluation apparatus according to an embodiment of the present invention.

Referring to FIG. 1, the heterogeneous satellite image fusion feasibility evaluation apparatus according to the present invention includes a satellite image acquiring unit 110, an imaging information acquiring unit 120, a distortion amount estimating unit 130, and a fusion feasibility determining unit 140.

The satellite image acquiring unit 110 acquires an optical image, an infrared image, and a radar image obtained by imaging a specific object with different types of sensors. A satellite image generally refers to an image recorded on the image sensor of a satellite; in the present invention, it refers to an optical image, an infrared image, and a radar image. Since the optical image and the infrared image are acquired by the same type of imaging system, their scanning methods coincide.

As an example, optical and infrared satellites use the pushbroom scanning method. In the pushbroom scanning method, hundreds of detectors that sense the energy of objects are arranged in a line, and the ground is scanned in the direction in which the satellite passes; it is a remote imaging scheme that constructs images using a linear array of charge-coupled devices recording the signals. The pushbroom scanning method is adopted in most satellite systems because it yields high-resolution images and, unlike a frame camera, produces no overlap within the scanned area. In this method, the linearly arranged sensors acquire ground-surface information one line at a time along the flight direction of the platform, and the acquired lines are bundled into one plane to form a pushbroom image. The pushbroom system includes a pushbroom sensor, which detects the energy of a specific object. In the present invention, the optical image and the infrared image are acquired using a pushbroom system employing this scanning method.

In addition, a radar imaging method is used for the radar satellite. In the radar imaging method, each transmitted radar pulse is scanned along the satellite orbit to obtain one image line, and an image is captured by continuously transmitting and receiving radar pulses. In the present invention, the radar image is acquired using a radar imaging system to which this method is applied. The radar imaging system includes a radar sensor, a radar antenna, a pulse generator, and a converter.

The imaging information acquisition unit 120 may include an incidence angle information acquisition unit 121, a spatial resolution information acquisition unit 122, and a track angle information acquisition unit 123, and acquires the incidence angle information, the spatial resolution information, and the track angle information, respectively.

More specifically, the imaging information obtaining unit 120 obtains the incidence angle information, the ground sample distance (spatial resolution) information, and the track angle information of the satellite image. Here, the spatial resolution information comprises the spatial resolution in the observation direction and the spatial resolution in the flight direction. Such imaging information is generally contained in the header file, the imaging information file of a satellite image.

That is, within the imaging information obtaining unit 120, the incidence angle information obtaining unit 121 obtains the incidence angle information of the satellite image, the spatial resolution information obtaining unit 122 obtains the spatial resolution information, and the track angle information obtaining unit 123 obtains the track angle information.
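As an illustration only (the field names below are hypothetical, since header formats differ between satellite products, and this sketch is not part of the original patent), the acquisition step can be written in Python as:

```python
from dataclasses import dataclass

@dataclass
class ImagingInfo:
    """Imaging metadata for one satellite image (angles in degrees, resolutions in meters)."""
    incidence_deg: float   # incidence angle at the time of imaging
    track_deg: float       # orbit (track) angle of the pass
    res_pixel_m: float     # spatial resolution in the pixel (observation) direction
    res_line_m: float      # spatial resolution in the line (flight) direction

def read_imaging_info(header: dict) -> ImagingInfo:
    """Map header fields to the quantities the method needs (field names are hypothetical)."""
    return ImagingInfo(
        incidence_deg=float(header["incidence_angle"]),
        track_deg=float(header["track_angle"]),
        res_pixel_m=float(header["gsd_cross_track"]),
        res_line_m=float(header["gsd_along_track"]),
    )
```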

The distortion amount estimating unit 130 calculates the amount of distortion that can occur when the satellite images are fused.

That is, the distortion amount estimating unit 130 calculates the amount of distortion in the observation direction and the amount of distortion in the flight direction using the incident angle information, the spatial resolution information, and the orbit angle information acquired by the imaging information obtaining unit 120.

In more detail, the distortion amount estimating unit 130 may calculate the amount of distortion in the observation direction and the amount of distortion in the flight direction that may occur when the satellite images are fused, using the imaging information of the satellite images.

The pushbroom system and the radar imaging system generate different geometric distortions. For the pushbroom system, the height h_p at which an object of height h is projected in the satellite image is defined by the following Equation (1):

h_p = h · tan(θ_p)   (1)

where h_p represents the height at which an object in the pushbroom system is projected in the satellite image, h represents the height of the object, and θ_p represents the incidence angle of the satellite image acquired by the imaging information acquisition unit 120.

In the radar imaging system, the height h_r at which an object of height h is projected in the satellite image is defined by the following Equation (2):

h_r = -h / tan(θ_r)   (2)

where h_r represents the height at which an object in the radar imaging system is projected in the satellite image, h represents the height of the object, and θ_r represents the incidence angle of the satellite image acquired by the imaging information acquisition unit 120.

FIG. 3 illustrates the height at which an object is projected in the image for the pushbroom system and for the radar imaging system, according to an embodiment of the present invention. FIG. 3A shows the projected height h_p of an object in the pushbroom system, and FIG. 3B shows the projected height h_r of an object in the radar imaging system.
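As a minimal sketch (not part of the original patent), Equations (1) and (2) can be written directly in Python; the incidence angle is signed, positive when the sensor observes to the right, following the convention stated below:

```python
import math

def pushbroom_projected_height(h: float, incidence_deg: float) -> float:
    """Equation (1): height (m) at which an object of height h appears in a pushbroom image."""
    return h * math.tan(math.radians(incidence_deg))

def radar_projected_height(h: float, incidence_deg: float) -> float:
    """Equation (2): height (m) at which the object appears in a radar image (layover toward the sensor)."""
    return -h / math.tan(math.radians(incidence_deg))

# Worked example from the description below: a 10 m object
print(radar_projected_height(10.0, 40.0))       # ≈ -11.9 m
print(pushbroom_projected_height(10.0, -20.0))  # ≈ -3.6 m
```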

In the pushbroom system, the projected height is proportional to the tangent of the incidence angle, whereas in the radar imaging system it is proportional to the tangent of the elevation angle. If the pushbroom image acquired by the pushbroom system and the radar image acquired by the radar imaging system are to be fused, the condition of the following Equation (3) can be determined:

tan(θ_p) · tan(θ_r) = -1   (3)

where θ_p is the incidence angle of the pushbroom sensor and θ_r is the incidence angle of the radar sensor. From this equation, a fusible pair of a pushbroom image and a radar image can be found. For example, if the incidence angle of the radar sensor is 45 degrees, the incidence angle of the pushbroom sensor should be -45 degrees; if the incidence angle of the radar sensor is 30 degrees, the incidence angle of the pushbroom sensor should be -60 degrees; and if the incidence angle of the radar sensor is 60 degrees, the incidence angle of the pushbroom sensor should be -30 degrees. These results assume that the satellites carrying the two sensors travel in polar orbits, an assumption that holds for most satellites. Here, the sign is positive when the sensor observes to the right.
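For illustration only (not from the patent text), the pairing rule of Equation (3) can be checked numerically:

```python
import math

def matching_pushbroom_angle(radar_incidence_deg: float) -> float:
    """Solve tan(theta_p) * tan(theta_r) = -1 (Equation (3)) for theta_p, in degrees."""
    theta_r = math.radians(radar_incidence_deg)
    return math.degrees(math.atan(-1.0 / math.tan(theta_r)))

for theta_r_deg in (30.0, 45.0, 60.0):
    print(theta_r_deg, round(matching_pushbroom_angle(theta_r_deg), 1))
# Prints -60.0, -45.0, -30.0, matching the example pairs given above.
```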

Ideally, when a satellite image is captured without error, a fusible pair can be found by the condition of Equation (3). In reality, however, distortion occurs in the observation direction and the flight direction because of terrain and differences in imaging geometry, and the present invention therefore proposes a method of finding a fusible pushbroom-radar pair according to the amount of such distortion.

The amount of distortion in the observation direction between the pushbroom image and the radar image, Δ_obs, can be defined by the following Equation (4):

Δ_obs = |h_p - h_r| / r_pixel   (4)

where h_p and h_r represent the projected heights in the pushbroom image and the radar image, respectively, and r_pixel means the spatial resolution in the pixel direction obtained by the imaging information obtaining unit. Here, the pixel direction means the direction perpendicular to the flight direction of the satellite.

For example, if a radar image with an incidence angle of 40 degrees and a spatial resolution of 1 m is fused with an optical image with an incidence angle of -20 degrees over an area with an average height of 10 m, the projected height in the radar image is about -11.9 m and that in the optical image is about -3.6 m. The amount of distortion in the observation direction is therefore 8.3 pixels.
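This figure can be reproduced with a short, self-contained sketch (illustrative only, not part of the patent):

```python
import math

def observation_distortion_px(h: float, theta_p_deg: float, theta_r_deg: float,
                              r_pixel_m: float) -> float:
    """Equation (4): observation-direction distortion between a pushbroom and a radar image, in pixels."""
    h_p = h * math.tan(math.radians(theta_p_deg))    # Equation (1)
    h_r = -h / math.tan(math.radians(theta_r_deg))   # Equation (2)
    return abs(h_p - h_r) / r_pixel_m

print(observation_distortion_px(h=10.0, theta_p_deg=-20.0, theta_r_deg=40.0, r_pixel_m=1.0))
# ≈ 8.3 pixels, as in the example above
```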

In addition, the amount of distortion in the flight direction between the pushbroom image and the radar image, Δ_fl, can be defined by Equation (5). [Equation (5) is rendered as an image in the original document; it expresses Δ_fl in terms of r_line, the spatial resolution in the line direction obtained by the imaging information obtaining unit, Δα, the difference in imaging track angle between the pushbroom image and the radar image, and h_mean = (h_p + h_r)/2, the mean of the projected heights of the two images.] Here, the line direction means the flight direction of the satellite.
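Because Equation (5) appears only as an image, its exact form cannot be verified here. The sketch below is one plausible geometric reading (the mean projected displacement rotated through the track-angle difference, scaled by the line-direction resolution); the sin(Δα) form is purely an assumption of this sketch, not the patent's formula:

```python
import math

def flight_distortion_px_assumed(h_p: float, h_r: float,
                                 delta_track_deg: float, r_line_m: float) -> float:
    """ASSUMED form of Equation (5): mean projected height rotated through the
    track-angle difference, in line-direction pixels. Not the verified formula."""
    h_mean = (h_p + h_r) / 2.0
    return abs(h_mean * math.sin(math.radians(delta_track_deg))) / r_line_m
```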

Meanwhile, FIG. 4 shows the amount of distortion in the observation direction and the amount of distortion in the flight direction for the pushbroom system and the radar imaging system according to an embodiment of the present invention. In FIG. 4A, the sensor on the left is a radar sensor and the sensor on the right is a pushbroom sensor; when an object of the same height is projected onto each sensor, the projected heights h_p and h_r differ, and the observation-direction distortion Δ_obs can be confirmed. FIG. 4B shows sensors with different track angles, and the flight-direction distortion Δ_fl caused by the track-angle difference can be confirmed.

The distortion amount estimating unit 130 is a core component of the fusion feasibility evaluation apparatus of the present invention. More specifically, the distortion amount estimating unit 130 may include an observation direction distortion amount estimating unit 131 for calculating the amount of distortion in the observation direction and a flight direction distortion amount estimating unit 132 for calculating the amount of distortion in the flight direction.

That is, the distortion amount estimating unit 130 may include the observation direction distortion amount estimating unit 131 and the flight direction distortion amount estimating unit 132; the observation direction distortion amount estimating unit 131 calculates the amount of distortion in the observation direction when the satellite images are fused, and the flight direction distortion amount estimating unit 132 calculates the amount of distortion in the flight direction when the satellite images are fused.

The fusion feasibility determining unit 140 determines which heterogeneous satellite images can be fused. That is, the fusion feasibility determining unit 140 can quantitatively represent the fusion feasibility of heterogeneous satellite images by a Fusion Feasibility Index: an index of 0 means that there is no distortion at all when the satellite images are fused, and the larger the index, the greater the distortion.

More specifically, the fusion feasibility determining unit 140 expresses the amount of distortion in the pixel direction and the amount of distortion in the flight direction, calculated by the distortion amount estimating unit 130, as a quantitative image fusion feasibility index.

The image fusion feasibility index takes into account the amount of distortion in the pixel direction and the amount of distortion in the flight direction, and can be determined as shown in Equation (6).

[Equation (6) is rendered as an image in the original document; it defines the image fusion feasibility index F in terms of Δ_obs and Δ_fl.]

Here, F means the image fusion feasibility index, Δ_obs means the amount of distortion in the observation direction, and Δ_fl means the amount of distortion in the flight direction. If the amounts of distortion in the observation direction and the flight direction are nearly zero, the index is expressed as 0, and the distortion in the fused satellite image can be expected to be nearly zero. Conversely, as the amounts of distortion in the pixel direction and the flight direction increase, the index takes a large value, and the distortion in the fused satellite image can be expected to be very large.
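Equation (6) is likewise only available as an image. A root-sum-square combination is one simple function with the stated properties (zero exactly when both distortion amounts are zero, increasing in each), so the sketch below assumes that form (the actual patented index may differ) and uses it to pick the best candidate pair, in line with the stated goal of finding an optimal fusion pair:

```python
import math

def fusion_feasibility_index(d_obs_px: float, d_fl_px: float) -> float:
    """ASSUMED form of Equation (6): 0 when both distortions are 0, growing with either.
    The root-sum-square is an assumption of this sketch, not the patent's definition."""
    return math.hypot(d_obs_px, d_fl_px)

# Hypothetical candidate pairs as (observation, flight) distortions in pixels:
candidates = {"pair A": (8.3, 2.0), "pair B": (1.2, 0.5), "pair C": (4.0, 6.0)}
best = min(candidates, key=lambda name: fusion_feasibility_index(*candidates[name]))
print(best)  # pair B, the pair with the smallest index
```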

Meanwhile, FIG. 5 shows stereoscopic images acquired at different times and with different geometries over the same region according to an embodiment of the present invention. FIG. 5A is a WorldView-1 satellite image, which provides only a panchromatic band, with a spatial resolution of 0.5 m; the stereo pair consists of an image with an incidence angle of 15 degrees and an azimuth angle of 290 degrees and an image with an incidence angle of 34 degrees and an azimuth angle of 216 degrees. FIG. 5B is a WorldView-2 satellite image providing a panchromatic image with a spatial resolution of 0.5 m and a multispectral image with a spatial resolution of 2 m; the stereo pair consists of an image with an incidence angle of 11 degrees and an azimuth angle of 32 degrees and an image with an incidence angle of 21 degrees and an azimuth angle of 175 degrees. Here, the panchromatic image of WorldView-1 and the multispectral image of WorldView-2 have different imaging characteristics and can therefore be called heterogeneous satellite images. The panchromatic image provides high spatial resolution, making objects easy to interpret, while the multispectral image has high spectral resolution, which supports applications such as land-cover classification through analysis of spectral characteristics.

Generally, when the panchromatic image and the multispectral image captured by WorldView-2 are fused, the distortion that can occur is small because they share the same incidence angle and track angle. However, when the WorldView-1 panchromatic image and the WorldView-2 multispectral image are fused, distortion occurs in the observation direction and the flight direction because of differences in incidence angle, track angle, and spatial resolution; this is known to be more severe in mountainous terrain.

Meanwhile, FIG. 6 shows fused WorldView-1 and WorldView-2 images according to an embodiment of the present invention. FIG. 6A is produced from a WorldView-1 panchromatic image with an incidence angle of 15 degrees and an azimuth angle of 290 degrees and a WorldView-2 multispectral image with an incidence angle of 11 degrees and an azimuth angle of 89 degrees; FIG. 6B is produced from a WorldView-1 panchromatic image with an incidence angle of 34 degrees and an azimuth angle of 216 degrees and a WorldView-2 multispectral image with an incidence angle of 32 degrees and an azimuth angle of 175 degrees. The incidence angles of the WorldView-1 and WorldView-2 images used to produce FIG. 6A are all 15 degrees or less, and images with small incidence angles can be produced with high positional accuracy. In contrast, for the WorldView-1 and WorldView-2 images used to produce FIG. 6B, the azimuth difference is relatively small at about 40 degrees, but the incidence angles are all 30 degrees or more. Consequently, in FIG. 6A the positional errors of low buildings and roads are small because of the small incidence angles, whereas in FIG. 6B a positional error arises from the large incidence angles; however, because the azimuth difference is small, the fusion quality of man-made structures is good.

Accordingly, the heterogeneous satellite image fusion feasibility evaluation apparatus according to an embodiment of the present invention can calculate the amount of distortion that can occur between heterogeneous satellite images and can quantitatively evaluate fusion feasibility from the image fusion feasibility index. It is therefore possible to determine an appropriate fusion pair by predicting the fusion result in advance, whereas satellite image fusion has to date required long processing times.

Hereinafter, a method for evaluating heterogeneous satellite image fusion possibility according to an embodiment of the present invention will be briefly described based on the details described above.

FIG. 2 is a flowchart illustrating a heterogeneous satellite image fusion feasibility evaluation method according to an embodiment of the present invention.

Referring to FIG. 2, the satellite image obtaining unit 110 acquires an optical image, an infrared image, and a radar image of a specific object captured by different kinds of sensors (S210).

That is, the satellite image acquisition unit 110 acquires an optical image, an infrared image, and a radar image using different types of sensors. A satellite image generally refers to an image recorded on the image sensor of a satellite; in the present invention, it refers to an optical image, an infrared image, and a radar image. Since the optical image and the infrared image are acquired by the same type of imaging system, their scanning methods coincide.

As an example, optical and infrared satellites use the pushbroom scanning method. In the pushbroom scanning method, hundreds of detectors that sense the energy of objects are arranged in a line, and the ground is scanned in the direction in which the satellite passes; it is a remote imaging scheme that constructs images using a linear array of charge-coupled devices recording the signals. The pushbroom scanning method is adopted in most satellite systems because it yields high-resolution images and, unlike a frame camera, produces no overlap within the scanned area. In the present invention, the optical image and the infrared image are acquired using a pushbroom system employing this scanning method. The pushbroom system includes a pushbroom sensor, which detects the energy of a specific object.

In addition, a radar imaging method is used for the radar satellite. In the radar imaging method, each transmitted radar pulse is scanned along the satellite orbit to obtain one image line, and an image is captured by continuously transmitting and receiving radar pulses. In the present invention, the radar image is acquired using a radar imaging system to which this method is applied. The radar imaging system includes a radar sensor, a radar antenna, a pulse generator, and a converter.

Next, the imaging information obtaining unit 120 obtains the incident angle information, the spatial resolution information, and the orbital angle information of the satellite image (S220).

More specifically, the imaging information obtaining unit 120 obtains the incidence angle information, the track angle information, and the ground sample distance (spatial resolution) information of the heterogeneous satellite images. Here, the spatial resolution information comprises the spatial resolution in the observation direction and the spatial resolution in the flight direction. Such imaging information is generally contained in the header file, the imaging information file of a satellite image.

In the heterogeneous satellite image fusion feasibility evaluation method of the present invention, the imaging information acquisition unit 120 may include an incidence angle information acquisition unit 121, a spatial resolution information acquisition unit 122, and a track angle information acquisition unit 123.

That is, the imaging information acquisition unit 120 may include the incidence angle information acquisition unit 121, the spatial resolution information acquisition unit 122, and the track angle information acquisition unit 123; the incidence angle information acquisition unit 121 obtains the incidence angle information of the heterogeneous satellite images, the spatial resolution information obtaining unit 122 obtains the spatial resolution information, and the track angle information obtaining unit 123 obtains the track angle information.

Step S220 of acquiring the imaging information may include acquiring incident angle information, acquiring orbital angle information, and acquiring spatial resolution information.

Next, the distortion amount estimating unit 130 calculates the amount of distortion in the observation direction and the amount of distortion in the flight direction using the incidence angle information, the spatial resolution information, and the track angle information acquired by the imaging information obtaining unit 120 (S230). The distortion amount estimating unit 130 is a core component of the heterogeneous satellite image fusion feasibility evaluation method of the present invention; more specifically, it may include an observation direction distortion amount estimating unit 131 for calculating the amount of distortion in the observation direction and a flight direction distortion amount estimating unit 132 for calculating the amount of distortion in the flight direction.

That is, the observation direction distortion amount estimating unit 131 calculates the amount of distortion in the observation direction when heterogeneous satellite images are fused, and the flight direction distortion amount estimating unit 132 calculates the amount of distortion in the flight direction when heterogeneous satellite images are fused.

More specifically, in step S230 the height at which an object is projected in the image is proportional to the tangent of the incidence angle in the pushbroom system, but to the tangent of the elevation angle in the radar imaging system. In the pushbroom system, the projected height of the object increases as the incidence angle increases; in the radar imaging system, the projected height decreases in magnitude as the incidence angle increases.

Equations (1) and (2) give the height at which an object is projected onto the image in the pushbroom system and in the radar imaging system, respectively, as described above.

Here, the ideal fusible pair for a pushbroom system and a radar imaging system is a pair for which the product of the tangents of the incidence angles is -1. For example, if the incidence angle of the radar sensor is 45 degrees, the incidence angle of the pushbroom sensor should be -45 degrees; if the incidence angle of the radar sensor is 30 degrees, the incidence angle of the pushbroom sensor should be -60 degrees; and if the incidence angle of the radar sensor is 60 degrees, the incidence angle of the pushbroom sensor should be -30 degrees. Here, the sign is positive when the sensor observes to the right.

Equation (3) describes the fusible pair in the ideal case for a pushbroom system and a radar imaging system, as detailed above. Ideally, when a satellite image is captured without error, a fusible pair can be found by the condition of Equation (3); in reality, however, distortion occurs in the observation direction and the flight direction because of terrain and imaging geometry, so the present invention proposes a method of finding a fusible pushbroom-radar pair according to the amount of such distortion.

In step S230, the distortion amount estimating unit 130 can calculate the amount of distortion in the observation direction from the projected heights of the object and the spatial resolution in the pixel direction.

Equation (4) shows a method of calculating the amount of distortion of the observation direction between two images, and a detailed description thereof has been given above.

Further, in step S230, the distortion amount estimating unit 130 can calculate the amount of distortion in the flight direction from the projected heights, the difference in track angle, and the spatial resolution in the line direction.

Equation (5) shows a method of calculating the amount of distortion of the flight direction between the pushbroom image and the radar image, and a detailed description thereof has been given above.

Next, the fusion feasibility determining unit 140 determines which heterogeneous satellite images can be fused (S240).

That is, in step S240 the fusion feasibility determining unit 140 expresses the fusion feasibility of heterogeneous satellite images by an image fusion feasibility index whose value ranges upward from 0. An index of 0 means that there is no distortion when the heterogeneous satellite images are fused; the larger the index, the larger the distortion.

Equation (6) shows a method of determining a fusible satellite image, and a detailed description thereof has been given above.

Therefore, the heterogeneous satellite image fusion feasibility evaluation method proposed by the present invention can calculate the amount of distortion in the observation direction and the amount of distortion in the flight direction that can occur when heterogeneous satellite images are fused, and can quantitatively evaluate fusion feasibility. It is thus possible to find an appropriate fusion pair by predicting the fusion result in advance, whereas satellite image fusion has to date required much processing time.

The heterogeneous satellite image fusion feasibility evaluation method according to an embodiment of the present invention may be implemented in the form of program instructions that can be executed by various computer means and recorded on a computer-readable medium. The computer-readable medium may include program instructions, data files, data structures, and the like, alone or in combination. The program instructions recorded on the medium may be specially designed and configured for the present invention, or may be known and available to those skilled in computer software. Examples of computer-readable media include magnetic media such as hard disks, floppy disks, and magnetic tape; optical media such as CD-ROMs and DVDs; and hardware devices specially configured to store and execute program instructions, such as ROM, RAM, and flash memory. Examples of program instructions include machine language code such as that produced by a compiler, as well as high-level language code that can be executed by a computer using an interpreter or the like. The hardware devices described above may be configured to operate as one or more software modules to perform the operations of the present invention, and vice versa.

As described above, the present invention has been described with reference to particular embodiments, such as specific elements, and with reference to the drawings; however, these are provided only to aid overall understanding, the present invention is not limited to the above embodiments, and various modifications and changes may be made by those skilled in the art to which the present invention pertains.

Accordingly, the spirit of the present invention should not be construed as being limited to the described embodiments; not only the following claims but also all equivalents and modifications thereof belong to the scope of the present invention.

100: heterogeneous satellite image fusion feasibility evaluation apparatus
110: satellite image acquisition unit
120: imaging information acquisition unit
121: incidence angle information acquisition unit
122: spatial resolution information acquisition unit
123: track angle information acquisition unit
130: distortion amount estimation unit
131: observation direction distortion amount estimation unit
132: flight direction distortion amount estimation unit
140: fusion feasibility determination unit

Claims (16)

1. A method for evaluating fusion feasibility of heterogeneous satellite images, comprising:
acquiring an optical image and an infrared image by imaging a specific object using a pushbroom scanning method, and acquiring a radar image by imaging the specific object using a radar imaging method;
acquiring incidence angle information, track angle information, and spatial resolution information at the time of imaging from the optical image, the infrared image, and the radar image;
calculating an amount of distortion in an observation direction and an amount of distortion in a flight direction that occur when the pushbroom image and the radar image are fused, using the incidence angle information, the track angle information, and the spatial resolution information; and
evaluating fusion feasibility by expressing the amount of distortion in the observation direction and the amount of distortion in the flight direction as an image fusion feasibility index.
2. The method according to claim 1, wherein the spatial resolution information comprises spatial resolution information of the observation direction and spatial resolution information of the flight direction.
3. The method according to claim 1, wherein the height h_p at which the specific object in the pushbroom system is projected in the satellite image is defined by the following equation:

h_p = h · tan(θ_p)

where h_p represents the height at which the specific object in the pushbroom system is projected in the satellite image, h represents the height of the specific object, and θ_p represents the incidence angle information of the pushbroom image.
4. The method according to claim 1, wherein the height h_r at which the specific object in the radar imaging system is projected in the radar image is defined by the following equation:

h_r = -h / tan(θ_r)

where h_r represents the height at which the specific object in the radar imaging system is projected in the radar image, h represents the height of the specific object, and θ_r represents the incidence angle information.
5. The method according to claim 1, wherein the amount of distortion in the observation direction between the pushbroom image and the radar image, Δ_obs, is defined by the following equation:

Δ_obs = |h_p - h_r| / r_pixel

where h_p and h_r represent the projected heights in the pushbroom image and the radar image, respectively, and r_pixel means the spatial resolution in the pixel direction.
6. The method according to claim 1, wherein the amount of distortion in the flight direction between the pushbroom image and the radar image, Δ_fl, is defined by an equation [rendered as an image in the original document] in which r_line is the spatial resolution in the line direction, Δα is the difference between the track angles of the pushbroom image and the radar image, and h_mean = (h_p + h_r)/2 is the mean of the projected heights of the pushbroom image and the radar image.
7. The method according to claim 1, wherein, in the evaluating of fusion feasibility, the image fusion feasibility index F is defined by an equation [rendered as an image in the original document] in which F means the image fusion feasibility index, Δ_obs means the amount of distortion in the observation direction, and Δ_fl means the amount of distortion in the flight direction.
8. An apparatus for evaluating fusion feasibility of heterogeneous satellite images, comprising:
a satellite image acquiring unit that acquires an optical image and an infrared image by imaging a specific object using a pushbroom scanning method, and acquires a radar image by imaging the specific object using a radar imaging method;
an imaging information acquiring unit that acquires incidence angle information, track angle information, and spatial resolution information at the time of imaging from the optical image, the infrared image, and the radar image;
a distortion amount estimating unit that calculates an amount of distortion in an observation direction and an amount of distortion in a flight direction that occur when the pushbroom image and the radar image are fused, using the incidence angle information, the track angle information, and the spatial resolution information; and
a fusion feasibility determining unit that evaluates fusion feasibility by expressing the amount of distortion in the observation direction and the amount of distortion in the flight direction as an image fusion feasibility index.
9. The apparatus of claim 8, wherein the imaging information acquiring unit comprises:
an incidence angle information acquiring unit that acquires the incidence angle information;
a track angle information acquiring unit that acquires the track angle information; and
a spatial resolution information acquiring unit that acquires the spatial resolution information.
10. The apparatus of claim 8, wherein the spatial resolution information comprises spatial resolution information of the observation direction and spatial resolution information of the flight direction.
11. The apparatus of claim 8, wherein, in the distortion amount estimating unit, the height h_p at which the specific object in the pushbroom system is projected in the satellite image is defined by the following equation:

h_p = h · tan(θ_p)

where h represents the height of the specific object and θ_p represents the incidence angle information of the pushbroom image.
12. The apparatus of claim 8, wherein, in the distortion amount estimating unit, the height h_r at which the specific object in the radar imaging system is projected in the radar image is defined by the following equation:

h_r = -h / tan(θ_r)

where h represents the height of the specific object and θ_r represents the incidence angle information.
13. The apparatus of claim 8, wherein the distortion amount estimating unit comprises:
an observation direction distortion amount estimating unit that calculates the amount of distortion in the observation direction; and
a flight direction distortion amount estimating unit that calculates the amount of distortion in the flight direction.
14. The apparatus of claim 13, wherein the observation direction distortion amount estimating unit calculates the amount of distortion in the observation direction between the pushbroom image and the radar image, Δ_obs, defined by the following equation:

Δ_obs = |h_p - h_r| / r_pixel

where h_p and h_r represent the projected heights of the pushbroom image and the radar image, respectively, and r_pixel means the spatial resolution in the pixel direction.
15. The apparatus of claim 13, wherein the flight direction distortion amount estimating unit calculates the amount of distortion in the flight direction between the pushbroom image and the radar image, Δ_fl, defined by an equation [rendered as an image in the original document] in which r_line is the spatial resolution in the line direction, Δα is the difference between the track angles of the pushbroom image and the radar image, and h_mean = (h_p + h_r)/2 is the mean of the projected heights of the pushbroom image and the radar image.
16. The apparatus of claim 8, wherein the fusion feasibility determining unit uses the image fusion feasibility index F defined by an equation [rendered as an image in the original document] in which F means the image fusion feasibility index, Δ_obs means the amount of distortion in the observation direction, and Δ_fl means the amount of distortion in the flight direction.
KR1020160008275A 2016-01-22 2016-01-22 Method for evaluation of fusion feasibility on multi-sensor satellite images and Apparatus Thereof KR101770745B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020160008275A KR101770745B1 (en) 2016-01-22 2016-01-22 Method for evaluation of fusion feasibility on multi-sensor satellite images and Apparatus Thereof

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020160008275A KR101770745B1 (en) 2016-01-22 2016-01-22 Method for evaluation of fusion feasibility on multi-sensor satellite images and Apparatus Thereof

Publications (2)

Publication Number Publication Date
KR20170088202A true KR20170088202A (en) 2017-08-01
KR101770745B1 KR101770745B1 (en) 2017-08-23

Family

ID=59650138

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020160008275A KR101770745B1 (en) 2016-01-22 2016-01-22 Method for evaluation of fusion feasibility on multi-sensor satellite images and Apparatus Thereof

Country Status (1)

Country Link
KR (1) KR101770745B1 (en)


Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102538242B1 (en) 2021-09-30 2023-06-01 서울대학교 산학협력단 Daily image fusion producing method and system using geostationary satellite imagery

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019184709A1 (en) * 2018-03-29 2019-10-03 上海智瞳通科技有限公司 Data processing method and device based on multi-sensor fusion, and multi-sensor fusion method
US11675068B2 (en) 2018-03-29 2023-06-13 Shanghai YuGan Microelectronics Co., Ltd Data processing method and device based on multi-sensor fusion, and multi-sensor fusion method
WO2020134856A1 (en) * 2018-12-29 2020-07-02 长沙天仪空间科技研究院有限公司 Remote sensing satellite system
WO2021125395A1 (en) * 2019-12-18 2021-06-24 한국항공우주연구원 Method for determining specific area for optical navigation on basis of artificial neural network, on-board map generation device, and method for determining direction of lander

Also Published As

Publication number Publication date
KR101770745B1 (en) 2017-08-23


Legal Events

Date Code Title Description
E701 Decision to grant or registration of patent right
GRNT Written decision to grant