KR20170088202A - Method for evaluation of fusion feasibility on multi-sensor satellite images and Apparatus Thereof - Google Patents
- Publication number
- KR20170088202A (application KR1020160008275A)
- Authority
- KR
- South Korea
- Prior art keywords
- image
- distortion
- pushbroom
- radar
- satellite
- Prior art date
Classifications
-
- H04N5/2258—
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/86—Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
- G01S13/867—Combination of radar systems with cameras
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/88—Radar or analogous systems specially adapted for specific applications
- G01S13/89—Radar or analogous systems specially adapted for specific applications for mapping or imaging
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/20—Adaptations for transmission via a GHz frequency band, e.g. via satellite
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10032—Satellite or aerial image; Remote sensing
Abstract
Description
The present invention relates to a method and an apparatus for evaluating the fusion feasibility of heterogeneous satellite images captured by different sensors.
In a satellite image system, a multi-sensor image means an image captured by different sensors, and heterogeneous satellite image fusion is a technology that mathematically merges the information obtained from each single sensor, overcoming the information limits of any one sensor. Heterogeneous satellite imagery has been actively produced by earth observation satellites with various sensor characteristics and has been used in fields such as defense, environment, agriculture, and the ocean.
In general, optical satellites simultaneously provide high-resolution panchromatic images and low-resolution multispectral images. The panchromatic image provides high-resolution image information compared with the multispectral image, so objects are easy to read, while the multispectral image has high spectral resolution and offers advantages such as land-cover classification through analysis of the spectral characteristics of objects. In addition, since an infrared image acquires brightness information from the radiant energy emitted by the surface, it is used for military purposes and is effective for nighttime target detection. A radar image can be taken regardless of day, night, or weather, and is effective for monitoring natural disasters accompanied by bad weather.
For ideal utilization, it is effective to provide high-dimensional information by fusing satellite images taken by different sensors. However, because heterogeneous satellite images are obtained at different times and geometries, it is difficult to produce a fusion image free of terrain distortion. In particular, since optical and infrared images are captured in a completely different way from radar images, early heterogeneous image fusion techniques developed fusion technology on the assumption that the images were captured under the same conditions and methods, as in Korean Patent No. 10-1194405, "Method for detecting a target using plural heterogeneous sensors", and Korean Patent No. 10-1035055, "Object tracking system and method using heterogeneous cameras".
However, the prior art described above fuses images captured by heterogeneous sensors or heterogeneous cameras rather than by satellites. Images captured by satellites are taken under different temporal and geometric conditions, so terrain distortion may occur depending on the satellite sensor and the terrain; especially over mountainous terrain, the position error between the two images can be large according to each imaging geometry.
It is an object of the present invention to determine fusible heterogeneous satellite images by developing a technique capable of determining the fusion feasibility of heterogeneous satellite images captured by different sensors.
To achieve these and other advantages and in accordance with the purpose of the present invention, as embodied and broadly described herein, there is provided an apparatus for evaluating heterogeneous satellite image fusion feasibility, including a satellite image acquisition unit, an imaging information acquiring unit, a distortion amount estimating unit, and a fusion feasibility determining unit.
The satellite image acquisition unit acquires an optical image, an infrared image, and a radar image. The imaging information acquiring unit acquires incident angle information, spatial resolution information, and orbit angle information at the time of image capture from the header file of each image. The distortion amount estimating unit calculates the amount of distortion in the observation direction and in the flight direction using the imaging information acquired by the imaging information acquiring unit, and the fusion feasibility determining unit determines fusion feasibility from the amount of distortion in the observation direction and the amount of distortion in the flight direction.
The present invention has the effect of determining, by calculating the amount of distortion in the observation direction and the amount of distortion in the flight direction, heterogeneous satellite images that can be fused, before the user performs the fusion itself, and of determining a fusible heterogeneous satellite image pair from a plurality of satellite images through fusion feasibility analysis.
The source technology of the present invention can be utilized in domestically developed satellite image processing software or as a core technology for the fusion of different types of satellite images. The present invention therefore has the effect of increasing the utilization of satellite images and expanding the national satellite industry through commercialization of the source technology.
The present invention is applicable to convergence of multi-purpose
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 is a schematic view showing the configuration of a heterogeneous satellite image fusion feasibility evaluating apparatus according to an embodiment of the present invention.
FIG. 2 is a flowchart illustrating a heterogeneous satellite image fusion feasibility evaluation method according to an embodiment of the present invention.
FIG. 3 is an image showing an example of the imaging geometry of a pushbroom sensor and a radar sensor according to an embodiment of the present invention.
FIG. 4 is an image showing an example of the observation direction distortion amount and the flight direction distortion amount according to the heterogeneous satellite image fusion feasibility evaluation method according to an embodiment of the present invention.
FIG. 5 is an image showing an example of satellite images having different imaging geometric conditions used in the heterogeneous satellite image fusion feasibility evaluation method according to an embodiment of the present invention.
FIG. 6 is an image showing fusion image results obtained through the heterogeneous satellite image fusion feasibility evaluation method according to an embodiment of the present invention.
The terms and words used in the present specification and claims should not be construed as limited to their ordinary or dictionary meanings; since the inventor may appropriately define the concept of a term in order to best describe the invention, they should be construed with meanings and concepts consistent with the technical idea of the present invention.
Therefore, the embodiments described in this specification and the configurations shown in the drawings are merely preferred embodiments of the present invention and do not represent all of its technical ideas; it should be understood that various equivalents and modifications are possible.
The terminology used in this application is used only to describe specific embodiments and is not intended to limit the invention. Singular expressions include plural expressions unless the context clearly dictates otherwise. In the present application, terms such as "comprises" or "having" are intended to specify the presence of stated features, numbers, steps, operations, elements, parts, or combinations thereof, but do not preclude the presence or addition of other features, numbers, steps, operations, elements, parts, or combinations thereof.
Hereinafter, preferred embodiments of the present invention will be described in detail with reference to the accompanying drawings. In the following description of the present invention, a detailed description of known functions and configurations incorporated herein will be omitted when it may make the subject matter of the present invention rather unclear. In the following description of the embodiments of the present invention, specific values are only examples.
FIG. 1 is a schematic view showing the configuration of a heterogeneous satellite image fusion feasibility evaluating apparatus according to an embodiment of the present invention.
Referring to FIG. 1, the heterogeneous satellite image fusion feasibility evaluating apparatus 100 according to the present invention includes a satellite image acquisition unit 110, an imaging information acquiring unit 120, a distortion amount estimation unit 130, and a fusion feasibility determination unit 140.
The satellite image acquisition unit 110 acquires an optical image, an infrared image, and a radar image captured by different satellite sensors.
As an example, optical and infrared satellites use the pushbroom scanning method, a satellite image acquisition method in which hundreds of detectors sensing the energy of objects are arranged in a line and the ground is scanned in the direction in which the satellite passes; it is a remote imaging system that constructs images using a linear array of charge-coupled devices that record the signals. Unlike a frame camera, the pushbroom scanning method can obtain high-resolution images without overlap in the scanned area, so it is adopted in most satellite systems. In the pushbroom scanning method, the linearly arranged sensors acquire the ground surface one line at a time along the flight direction of the payload, and the acquired lines are bundled into one plane to obtain a pushbroom image. The pushbroom system includes a pushbroom sensor, which detects the energy of a specific object. In the present invention, the optical image and the infrared image are obtained using a pushbroom system employing the pushbroom scanning method.
In addition, the radar satellite uses a radar image capturing method, in which each transmitted radar pulse yields one image line along the satellite orbit, and an image is captured by continuously transmitting and receiving radar pulses. In the present invention, the radar image is acquired using a radar imaging system to which this method is applied. The radar imaging system includes a radar sensor, a radar antenna, a pulse generator, and a converter.
The imaging information acquiring unit 120 acquires imaging information at the time of capture from the header file of each satellite image.
More specifically, the imaging information acquiring unit 120 includes an incident angle information acquisition unit 121, a spatial resolution information acquisition unit 122, and an orbit angle information acquisition unit 123.
That is, the incident angle information acquisition unit 121 acquires the incident angle of each image, the spatial resolution information acquisition unit 122 acquires the spatial resolution in the pixel direction and the line direction, and the orbit angle information acquisition unit 123 acquires the orbit angle at the time of imaging.
The distortion amount estimation unit 130 estimates the amount of distortion that occurs when the images are fused, using the imaging information acquired by the imaging information acquiring unit 120.
That is, the distortion amount estimation unit 130 includes an observation direction distortion amount estimation unit 131 and a flight direction distortion amount estimation unit 132.
In more detail, the distortion amount estimation unit 130 first calculates the height at which an object is projected in each image.
The pushbroom system and the radar imaging system generate different geometric distortions. For the pushbroom system, the height Δh_p at which an object of height h is projected from the satellite image is defined by the following Equation (1):

Δh_p = h · tan(θ_p)    (1)

where Δh_p represents the height at which an object in the pushbroom system is projected from the satellite image, h represents the height of the object, and θ_p represents the incident angle information of the satellite image acquired by the imaging information acquiring unit 120.

In the radar imaging system, the height Δh_r at which the object is projected from the satellite image is defined by the following Equation (2):

Δh_r = -h / tan(θ_r)    (2)

where Δh_r represents the height at which an object in the radar imaging system is projected from the satellite image, h represents the height of the object, and θ_r represents the incident angle information of the satellite image acquired by the imaging information acquiring unit 120.

FIG. 3 illustrates the projection of an object in a pushbroom system and in a radar imaging system according to an embodiment of the present invention: FIG. 3A shows the height Δh_p at which an object is projected from a pushbroom image, and FIG. 3B shows the height Δh_r at which an object is projected from a radar image. In the pushbroom system the projection height is proportional to the tangent of the incident angle, but in the radar imaging system it is proportional to the tangent of the elevation angle. If a pushbroom image acquired by the pushbroom system and a radar image acquired by the radar imaging system are to be fused without relief displacement, the following Equation (3) must hold:

tan(θ_p) · tan(θ_r) = -1    (3)

where θ_p is the incident angle of the pushbroom sensor and θ_r is the incident angle of the radar sensor. From this equation, a fusible pushbroom-radar pair can be found. For example, if the incident angle of the radar sensor is 45 degrees, the incident angle of the pushbroom sensor is -45 degrees; if the radar incident angle is 30 degrees, the pushbroom incident angle should be -60 degrees; and if it is 60 degrees, the pushbroom incident angle should be -30 degrees. Here, the sign is positive when the sensor observes to the right. These results assume that the satellites carrying the two sensors travel in polar orbits, which holds for most satellites. Ideally, when a satellite image is captured without error, a fusible pair can be found by the method of Equation (3). In reality, however, distortion occurs in the observation direction and the flight direction due to terrain, so the present invention proposes a method of finding the fusible pushbroom-radar pair according to the amount of distortion.
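As a rough numerical check of Equations (1) to (3), the projection heights and the ideal pairing rule can be sketched in Python (the function names are illustrative, not part of the patent):

```python
import math

def pushbroom_projection(h, incidence_deg):
    """Equation (1): height (m) at which an object of height h is
    projected in a pushbroom image, dh_p = h * tan(theta_p)."""
    return h * math.tan(math.radians(incidence_deg))

def radar_projection(h, incidence_deg):
    """Equation (2): projection height in a radar image,
    dh_r = -h / tan(theta_r) (layover acts toward the sensor)."""
    return -h / math.tan(math.radians(incidence_deg))

def ideal_pushbroom_angle(radar_incidence_deg):
    """Equation (3): pushbroom incidence angle satisfying
    tan(theta_p) * tan(theta_r) = -1 for a given radar angle."""
    t = math.tan(math.radians(radar_incidence_deg))
    return math.degrees(math.atan(-1.0 / t))

# Examples from the description:
# radar 45 -> pushbroom -45, radar 30 -> -60, radar 60 -> -30.
for r in (45.0, 30.0, 60.0):
    print(r, round(ideal_pushbroom_angle(r), 1))
```

Running the loop reproduces the three pairs listed in the description.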
The amount of distortion in the observation direction D_obs between the pushbroom image and the radar image can be defined by the following Equation (4):

D_obs = |Δh_p - Δh_r| / R_pixel    (4)

where Δh_p and Δh_r represent the projection heights of the pushbroom image and the radar image, respectively, and R_pixel means the spatial resolution in the pixel direction obtained by the imaging information acquiring unit 120. Here, the pixel direction means the direction perpendicular to the flight direction of the satellite. For example, if a radar image with an incident angle of 40 degrees and a spatial resolution of 1 m is fused with an optical image with an incident angle of -20 degrees in an area with an average height of 10 m, the projection height is about -11.9 m in the radar image and about -3.6 m in the optical image, so the amount of distortion in the observation direction becomes 8.3 pixels.
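The worked example above can be reproduced with a short sketch of Equation (4), using the Equation (1) and (2) projection heights (names are illustrative):

```python
import math

def observation_distortion(h, theta_p_deg, theta_r_deg, res_pixel):
    """Equation (4): |dh_p - dh_r| divided by the pixel-direction
    spatial resolution, giving the observation-direction distortion
    in pixels."""
    dh_p = h * math.tan(math.radians(theta_p_deg))    # Eq. (1)
    dh_r = -h / math.tan(math.radians(theta_r_deg))   # Eq. (2)
    return abs(dh_p - dh_r) / res_pixel

# Worked example from the description: radar at 40 deg, optical at
# -20 deg, mean terrain height 10 m, 1 m resolution.
print(round(observation_distortion(10.0, -20.0, 40.0, 1.0), 1))  # → 8.3
```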
In addition, the amount of distortion in the flight direction D_fl between the pushbroom image and the radar image can be defined by the following Equation (5):

D_fl = |Δh_mean · sin(Δα)| / R_line    (5)

where R_line is the spatial resolution in the line direction obtained by the imaging information acquiring unit 120, Δα represents the difference in orbit angle between the pushbroom image and the radar image, and Δh_mean is the mean of the projection heights of the pushbroom image and the radar image. Here, the line direction means the flight direction of the satellite.

Meanwhile, FIG. 4 shows the amount of distortion in the observation direction and the amount of distortion in the flight direction for the pushbroom system and the radar imaging system according to an embodiment of the present invention. In FIG. 4A, the sensor on the left is a radar sensor and the sensor on the right is a pushbroom sensor; when an object of the same height is projected on each sensor, the projection heights differ, and the amount of distortion in the observation direction can be confirmed. FIG. 4B shows sensors having different orbit angles, and the amount of distortion in the flight direction due to the orbit-angle difference can be confirmed.

That is, the distortion amount estimation unit 130 calculates the amount of distortion in the observation direction through the observation direction distortion amount estimation unit 131 and the amount of distortion in the flight direction through the flight direction distortion amount estimation unit 132.
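A sketch of Equation (5) follows; the sin(Δα) form is an assumption consistent with the description, which specifies only that the flight-direction distortion depends on the mean projection height, the orbit-angle difference, and the line-direction resolution:

```python
import math

def flight_distortion(dh_p, dh_r, d_orbit_deg, res_line):
    """Flight-direction distortion, Equation (5): the mean projection
    height swung through the orbit-angle difference, expressed in
    line-direction pixels. The sin() term is an assumed form."""
    dh_mean = 0.5 * (dh_p + dh_r)
    return abs(dh_mean * math.sin(math.radians(d_orbit_deg))) / res_line
```

With the projection heights of the earlier example (-3.6 m and -11.9 m), a 10-degree orbit-angle difference, and 1 m line resolution, this gives about 1.4 pixels; with identical orbit angles it gives 0, as expected.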
The fusion feasibility determination unit 140 determines whether two satellite images can be fused, from the distortion amounts calculated by the distortion amount estimation unit 130.
More specifically, the fusion feasibility determination unit 140 expresses the amount of distortion in the observation direction and the amount of distortion in the flight direction as a single image fusibility index.
At this time, the image fusibility index considers both the amount of distortion in the pixel direction and the amount of distortion in the flight direction, and can be determined as shown in the following Equation (6):

F = √(D_obs² + D_fl²)    (6)

where F means the image fusibility index, D_obs means the amount of distortion in the observation direction, and D_fl means the amount of distortion in the flight direction. If the amount of distortion in the observation direction and the amount of distortion in the flight direction are almost zero, the image fusibility index is close to 0, and almost no distortion can be expected in the satellite image fusion. Conversely, as the two distortion amounts increase, the image fusibility index takes a large value, and very large distortion can be expected in the satellite image fusion.

Meanwhile, FIG. 5 shows stereoscopic images obtained at different times and geometries over the same region according to an embodiment of the present invention. FIG. 5A shows a WorldView-1 satellite image, which provides only a panchromatic band at a spatial resolution of 0.5 m; the stereoscopic pair consists of an image with an incidence angle of 15 degrees and an azimuth angle of 290 degrees and an image with an incidence angle of 34 degrees and an azimuth angle of 216 degrees. FIG. 5B shows a WorldView-2 satellite image with a panchromatic band (spatial resolution 0.5 m) and multispectral bands (spatial resolution 2 m); the stereoscopic pair consists of an image with an incidence angle of 11 degrees and an azimuth angle of 32 degrees and an image with an incidence angle of 21 degrees and an azimuth angle of 175 degrees. Here, the panchromatic image of WorldView-1 and the multispectral image of WorldView-2 have different imaging characteristics and can be regarded as heterogeneous satellite images.
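A sketch of the image fusibility index; the root-sum-square form of Equation (6) is an assumption that matches the stated behavior (zero when both distortion amounts vanish, growing with either):

```python
import math

def fusibility_index(d_obs, d_fl):
    """Image fusibility index, Equation (6): combines the
    observation-direction and flight-direction distortion amounts
    (in pixels) into one score; 0 means an ideal pair."""
    return math.hypot(d_obs, d_fl)

print(fusibility_index(0.0, 0.0))                 # ideal pair → 0.0
print(round(fusibility_index(8.3, 1.4), 2))       # example pair above
```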
Generally, when a panchromatic image and a multispectral image captured by WorldView-2 are fused, the amount of distortion is small because the incident angle and orbit angle are the same. However, when the WorldView-1 panchromatic image and the WorldView-2 multispectral image are fused, distortion occurs in the observation direction and the flight direction due to the differences in incident angle, orbit angle, and spatial resolution; the distortion is known to be especially severe over mountainous terrain.
Meanwhile, FIG. 6 shows fusion images of WorldView-1 and WorldView-2 as an embodiment of the present invention. FIG. 6A shows the fusion of the WorldView-1 panchromatic image with an incidence angle of 15 degrees and an azimuth angle of 290 degrees and the WorldView-2 multispectral image with an incidence angle of 11 degrees and an azimuth angle of 89 degrees; FIG. 6B shows the fusion of the WorldView-1 image with an incidence angle of 34 degrees and an azimuth angle of 216 degrees and the WorldView-2 multispectral image with an incidence angle of 32 degrees and an azimuth angle of 175 degrees. The incidence angles of the WorldView-1 and WorldView-2 images used to produce FIG. 6A are all 15 degrees or less, and images with small incidence angles can be produced with high positional accuracy. On the other hand, for the WorldView-1 and WorldView-2 images used to produce FIG. 6B, the azimuth difference is relatively small at about 40 degrees, but the incidence angles are all 30 degrees or more. In FIG. 6A, the positional errors of the low buildings and the road are small owing to the small incidence angles, whereas in FIG. 6B a position error is caused by the large incidence angles; however, since the azimuth difference is small, the fusion quality of the artificial structures is good.
Accordingly, the heterogeneous satellite image fusion feasibility evaluating apparatus according to an embodiment of the present invention can calculate the amount of distortion that can occur between heterogeneous satellite images and quantitatively evaluate fusion feasibility from the image fusibility index. It is therefore possible to determine an appropriate fusion pair by predicting the fusion result, without the long processing time that satellite image fusion has required to date.
Hereinafter, a method for evaluating heterogeneous satellite image fusion possibility according to an embodiment of the present invention will be briefly described based on the details described above.
FIG. 2 is a flowchart illustrating a heterogeneous satellite image fusion feasibility evaluation method according to an embodiment of the present invention.
Referring to FIG. 2, the satellite image acquisition unit 110 first acquires satellite images (S210).
That is, the satellite image acquisition unit 110 acquires an optical image, an infrared image, and a radar image captured by different satellite sensors.
As described above, the optical image and the infrared image are obtained using a pushbroom system employing the pushbroom scanning method, in which the pushbroom sensor detects the energy of a specific object and the linearly arranged detectors scan the ground one line at a time along the flight direction of the payload.
Likewise, the radar image is acquired using the radar imaging system described above, which includes a radar sensor, a radar antenna, a pulse generator, and a converter.
Next, the imaging information acquiring unit 120 acquires the imaging information at the time of capture from the header file of each image (S220).
More specifically, the imaging information acquiring unit 120 acquires incident angle information through the incident angle information acquisition unit 121, orbit angle information through the orbit angle information acquisition unit 123, and spatial resolution information through the spatial resolution information acquisition unit 122.
Step S220 of acquiring the imaging information may include acquiring incident angle information, acquiring orbital angle information, and acquiring spatial resolution information.
Next, the distortion amount estimation unit 130 calculates the amount of distortion in the observation direction and the amount of distortion in the flight direction using the acquired imaging information (S230).
That is, in step S230, the distortion amount estimation unit 130 first calculates the height at which an object is projected in the pushbroom image and in the radar image.
More specifically, in step S230, the height at which the object is projected in the image is proportional to the tangent of the incident angle in the pushbroom system, but proportional to the tangent of the elevation angle in the radar imaging system. Thus, the projection height of the object increases with the incident angle in the pushbroom system, whereas it decreases with the incident angle in the radar imaging system.
Equations (1) and (2) above show the height at which an object is projected onto the image in the pushbroom system and in the radar imaging system, respectively, and detailed descriptions thereof have been given above.
Here, the ideal fusible pair in a pushbroom system and a radar imaging system is a pair whose incident-angle tangents multiply to -1. For example, if the incident angle of the radar sensor is 45 degrees, the incident angle of the pushbroom sensor is -45 degrees; if the radar incident angle is 30 degrees, the pushbroom incident angle should be -60 degrees; and if it is 60 degrees, the pushbroom incident angle should be -30 degrees. Here, the sign is positive when the sensor observes to the right.
Equation (3) expresses this ideal fusible pair for a pushbroom system and a radar imaging system, as described above. Ideally, when a satellite image is captured without error, a fusible pair can be found by the method of Equation (3). In reality, however, distortion occurs in the observation direction and the flight direction due to terrain, so the present invention finds the fusible pushbroom-radar pair according to the amount of distortion.
In step S230, the distortion amount estimation unit 130 calculates the amount of distortion in the observation direction between the pushbroom image and the radar image through the observation direction distortion amount estimation unit 131.
Equation (4) shows a method of calculating the amount of distortion of the observation direction between two images, and a detailed description thereof has been given above.
Further, in step S230, the distortion amount estimation unit 130 calculates the amount of distortion in the flight direction between the pushbroom image and the radar image through the flight direction distortion amount estimation unit 132.
Equation (5) shows a method of calculating the amount of distortion of the flight direction between the pushbroom image and the radar image, and a detailed description thereof has been given above.
Next, the fusion feasibility determination unit 140 determines fusion feasibility (S240).
That is, the fusion feasibility determination unit 140 expresses the amount of distortion in the observation direction and the amount of distortion in the flight direction as the image fusibility index and determines a fusible satellite image pair.
Equation (6) shows a method of determining a fusible satellite image, and a detailed description thereof has been given above.
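The whole flow of FIG. 2 (S210 to S240) can be condensed into one illustrative function for ranking candidate image pairs; the forms of Equations (5) and (6) are assumed as above, and all names are hypothetical:

```python
import math

def evaluate_pair(h, theta_p_deg, theta_r_deg, d_orbit_deg,
                  res_pixel, res_line):
    """End-to-end sketch for one pushbroom/radar candidate pair:
    Equations (1)-(2) give the projection heights, (4)-(5) the two
    distortion amounts, and (6) the fusibility index (lower = better).
    The sin() term in (5) and root-sum-square in (6) are assumed."""
    dh_p = h * math.tan(math.radians(theta_p_deg))        # Eq. (1)
    dh_r = -h / math.tan(math.radians(theta_r_deg))       # Eq. (2)
    d_obs = abs(dh_p - dh_r) / res_pixel                  # Eq. (4)
    dh_mean = 0.5 * (dh_p + dh_r)
    d_fl = abs(dh_mean
               * math.sin(math.radians(d_orbit_deg))) / res_line  # Eq. (5)
    return math.hypot(d_obs, d_fl)                        # Eq. (6)

# Rank candidate pushbroom images against a radar image at 40 deg
# (mean height 10 m, 5 deg orbit-angle difference, 1 m resolutions).
candidates = [-20.0, -40.0, -50.0]
scores = {p: round(evaluate_pair(10.0, p, 40.0, 5.0, 1.0, 1.0), 2)
          for p in candidates}
best = min(scores, key=scores.get)
```

For a radar image at a 40-degree incidence angle, the candidate near -50 degrees scores lowest, matching the tan(θ_p) · tan(θ_r) = -1 pairing rule of Equation (3).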
Therefore, the heterogeneous satellite image fusion feasibility evaluation method proposed by the present invention can calculate the amount of distortion in the observation direction and in the flight direction that can occur when different types of satellite images are fused, and can quantitatively evaluate fusion feasibility. It is thus possible to find an appropriate fusion pair by predicting the fusion result, without the long processing time that satellite image fusion has required to date.
The heterogeneous satellite image fusion feasibility evaluation method according to an embodiment of the present invention may be implemented in the form of program instructions that can be executed through various computer means and recorded on a computer-readable medium. The computer-readable medium may include program instructions, data files, data structures, and the like, alone or in combination. The program instructions recorded on the medium may be those specially designed and configured for the present invention or those known and available to those skilled in the art of computer software. Examples of computer-readable media include magnetic media such as hard disks, floppy disks, and magnetic tape; optical media such as CD-ROMs and DVDs; and hardware devices specifically configured to store and execute program instructions, such as ROM, RAM, and flash memory. Examples of program instructions include machine code such as that produced by a compiler, as well as high-level language code that can be executed by a computer using an interpreter. The hardware devices described above may be configured to operate as one or more software modules to perform the operations of the present invention, and vice versa.
As described above, the present invention has been described with reference to particular embodiments, specific elements, and drawings. However, the present invention is not limited to the above embodiments, and various modifications and changes may be made by those skilled in the art to which the present invention pertains.
Accordingly, the spirit of the present invention should not be construed as being limited to the embodiments described, and the following claims as well as all equivalents and modifications thereof belong to the scope of the present invention.
100: heterogeneous satellite image fusion feasibility evaluation apparatus
110: satellite image acquisition unit
120: imaging information acquiring unit
121: incident angle information acquisition unit
122: spatial resolution information acquisition unit
123: orbit angle information acquisition unit
130: distortion amount estimation unit
131: observation direction distortion amount estimation unit
132: flight direction distortion amount estimation unit
140: fusion feasibility determination unit
Claims (16)
Acquiring incident angle information, orbit angle information, and spatial resolution information at the time of imaging from the optical image, the infrared image, and the radar image;
calculating the amount of distortion in the observation direction and the amount of distortion in the flight direction generated when the pushbroom image and the radar image are fused, using the incident angle information, the orbit angle information, and the spatial resolution information; and
evaluating fusion feasibility by expressing the amount of distortion in the observation direction and the amount of distortion in the flight direction as an image fusibility index.
Wherein the spatial resolution information includes spatial resolution information of the observation direction and spatial resolution information of the flight direction.
Wherein, in the pushbroom system, the height Δh_p at which the specific object is projected from the satellite image is defined by the following equation:
Δh_p = h · tan(θ_p)
where Δh_p represents the height at which the specific object in the pushbroom system is projected from the satellite image, h represents the height of the specific object, and θ_p represents the incident angle information of the pushbroom image.
Wherein, in the radar imaging system, the height Δh_r at which the specific object is projected from the radar image is defined by the following equation:
Δh_r = -h / tan(θ_r)
where Δh_r represents the height at which the specific object in the radar imaging system is projected from the radar image, h represents the height of the specific object, and θ_r represents the incident angle information of the radar image.
Wherein the distortion amount D_obs in the observation direction between the pushbroom image and the radar image is defined by the following equation:

D_obs = (d_p + d_r) / R_pixel

where d_p and d_r represent the projection heights of the pushbroom image and the radar image, respectively, and R_pixel denotes the spatial resolution in the pixel direction.
Wherein the distortion amount D_fly in the flight direction between the pushbroom image and the radar image is defined by the following equation:

D_fly = d_avg · sin(Δα) / R_line

where R_line is the spatial resolution in the line direction, Δα is the difference between the orbit angles of the pushbroom image and the radar image, and d_avg is the average of the projection heights of the pushbroom image and the radar image.
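The two distortion amounts can be sketched as follows. This sketch assumes that the opposing optical and radar displacements add in the observation direction and that the orbit-angle difference rotates the mean displacement into the flight direction; the original equations appear only as images in the source, so the exact combination (sum versus difference) and all names here are assumptions:

```python
import math

def observation_direction_distortion(d_p, d_r, pixel_resolution):
    """Cross-track distortion in pixels: the optical relief displacement
    d_p and radar layover d_r (assumed to act in opposing directions)
    add up, normalized by the pixel-direction spatial resolution."""
    return (d_p + d_r) / pixel_resolution

def flight_direction_distortion(d_p, d_r, orbit_angle_diff_deg, line_resolution):
    """Along-track distortion in lines: the mean projection height,
    rotated by the orbit-angle difference between the two passes,
    normalized by the line-direction spatial resolution."""
    d_mean = (d_p + d_r) / 2.0
    return d_mean * math.sin(math.radians(orbit_angle_diff_deg)) / line_resolution

# 30 m displacements in each image, 2 m resolution, 10 deg orbit difference.
print(observation_direction_distortion(30.0, 30.0, 2.0))  # 30.0 pixels
print(flight_direction_distortion(30.0, 30.0, 10.0, 2.0))
```

Note that when the two satellites share the same orbit angle (Δα = 0), the flight-direction distortion vanishes and only the observation-direction term remains.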
Wherein the step of evaluating fusion feasibility
expresses the image fusion feasibility index F by the following equation:

F = √(D_obs² + D_fly²)

where F denotes the image fusion feasibility index, D_obs denotes the distortion amount in the observation direction, and D_fly denotes the distortion amount in the flight direction.
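Putting the steps together, the method can be read end to end as: projection heights → distortion amounts → single index. Because the original equations are rendered as images in the source, the specific forms below (h·tanθ and h/tanθ projections, summed opposing displacements, and a Euclidean-norm index) are a plausible reconstruction, and every name is illustrative:

```python
import math

def fusion_feasibility_index(h, theta_p_deg, theta_r_deg,
                             orbit_angle_diff_deg, pixel_res, line_res):
    """End-to-end sketch of the evaluation: a smaller index means the
    pushbroom/radar pair distorts a vertical object of height h less
    when fused, i.e. the pair is more feasible to fuse."""
    d_p = h * math.tan(math.radians(theta_p_deg))   # pushbroom relief displacement
    d_r = h / math.tan(math.radians(theta_r_deg))   # radar layover
    d_obs = (d_p + d_r) / pixel_res                 # observation-direction distortion
    d_mean = (d_p + d_r) / 2.0
    d_fly = d_mean * math.sin(math.radians(orbit_angle_diff_deg)) / line_res
    return math.hypot(d_obs, d_fly)                 # combined index (norm assumed)

# 30 m object, both sensors at 45 deg incidence, 10 deg orbit-angle
# difference, 2 m resolution in both image directions.
print(fusion_feasibility_index(30.0, 45.0, 45.0, 10.0, 2.0, 2.0))
```

Comparing candidate image pairs by this index would let an operator pick, among available pushbroom and radar acquisitions, the pair whose imaging geometries introduce the least relative distortion.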
An imaging information acquiring unit that acquires incident angle information, orbit angle information, and spatial resolution information at the time of imaging from an optical image, an infrared image, and a radar image;
a distortion amount estimating unit that calculates a distortion amount in the observation direction and a distortion amount in the flight direction that arise when a pushbroom image and the radar image are fused, using the incident angle information, the orbit angle information, and the spatial resolution information; and
a fusion feasibility determining unit that evaluates fusion feasibility by expressing the observation-direction distortion amount and the flight-direction distortion amount as an image fusion feasibility index.
Wherein the imaging information acquiring unit comprises:
an incident angle information acquiring unit that acquires the incident angle information;
an orbit angle information acquiring unit that acquires the orbit angle information; and
a spatial resolution information acquiring unit that acquires the spatial resolution information.
Wherein the spatial resolution information includes:
spatial resolution information in the observation direction and spatial resolution information in the flight direction.
Wherein the distortion amount estimating unit
uses the projected height d_p of a specific object in the pushbroom system, defined by the following equation:

d_p = h · tan(θ_p)

where d_p represents the height at which the specific object is projected in the pushbroom satellite image, h represents the actual height of the specific object, and θ_p represents the incident angle of the pushbroom image.
Wherein the distortion amount estimating unit
uses the projected height d_r of the specific object in the radar imaging system, defined by the following equation:

d_r = h / tan(θ_r)

where d_r represents the height at which the specific object is projected in the radar image, h represents the actual height of the specific object, and θ_r represents the incident angle of the radar image.
Wherein the distortion amount estimating unit comprises:
an observation direction distortion amount estimating unit that calculates the distortion amount in the observation direction; and
a flight direction distortion amount estimating unit that calculates the distortion amount in the flight direction.
Wherein the observation direction distortion amount estimating unit
calculates the distortion amount D_obs in the observation direction between the pushbroom image and the radar image by the following equation:

D_obs = (d_p + d_r) / R_pixel

where d_p and d_r represent the projection heights of the pushbroom image and the radar image, respectively, and R_pixel denotes the spatial resolution in the pixel direction.
Wherein the flight direction distortion amount estimating unit
calculates the distortion amount D_fly in the flight direction between the pushbroom image and the radar image by the following equation:

D_fly = d_avg · sin(Δα) / R_line

where R_line is the spatial resolution in the line direction, Δα is the difference between the orbit angles of the pushbroom image and the radar image, and d_avg is the average of the projection heights of the pushbroom image and the radar image.
Wherein the fusion feasibility determining unit
expresses the image fusion feasibility index F by the following equation:

F = √(D_obs² + D_fly²)

where F denotes the image fusion feasibility index, D_obs denotes the distortion amount in the observation direction, and D_fly denotes the distortion amount in the flight direction.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020160008275A KR101770745B1 (en) | 2016-01-22 | 2016-01-22 | Method for evaluation of fusion feasibility on multi-sensor satellite images and Apparatus Thereof |
Publications (2)
Publication Number | Publication Date |
---|---|
KR20170088202A true KR20170088202A (en) | 2017-08-01 |
KR101770745B1 KR101770745B1 (en) | 2017-08-23 |
Family
ID=59650138
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
KR1020160008275A KR101770745B1 (en) | 2016-01-22 | 2016-01-22 | Method for evaluation of fusion feasibility on multi-sensor satellite images and Apparatus Thereof |
Country Status (1)
Country | Link |
---|---|
KR (1) | KR101770745B1 (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR102538242B1 (en) | 2021-09-30 | 2023-06-01 | 서울대학교 산학협력단 | Daily image fusion producing method and system using geostationary satellite imagery |
2016-01-22: Application KR1020160008275A filed; granted as KR101770745B1 (active, IP Right Grant)
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2019184709A1 (en) * | 2018-03-29 | 2019-10-03 | 上海智瞳通科技有限公司 | Data processing method and device based on multi-sensor fusion, and multi-sensor fusion method |
US11675068B2 (en) | 2018-03-29 | 2023-06-13 | Shanghai YuGan Microelectronics Co., Ltd | Data processing method and device based on multi-sensor fusion, and multi-sensor fusion method |
WO2020134856A1 (en) * | 2018-12-29 | 2020-07-02 | 长沙天仪空间科技研究院有限公司 | Remote sensing satellite system |
WO2021125395A1 (en) * | 2019-12-18 | 2021-06-24 | 한국항공우주연구원 | Method for determining specific area for optical navigation on basis of artificial neural network, on-board map generation device, and method for determining direction of lander |
Also Published As
Publication number | Publication date |
---|---|
KR101770745B1 (en) | 2017-08-23 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
E701 | Decision to grant or registration of patent right | ||
GRNT | Written decision to grant |