JPS59228180A - Radar image simulation system - Google Patents

Radar image simulation system

Info

Publication number
JPS59228180A
JPS59228180A
Authority
JP
Japan
Prior art keywords
satellite
point
coordinate system
line
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
JP58102602A
Other languages
Japanese (ja)
Inventor
Hirotaka Mizuno
浩孝 水野
Fuminobu Yoshimura
吉村 文伸
Nobutake Yamagata
山縣 振武
Yutaka Kubo
裕 久保
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hitachi Ltd
Original Assignee
Hitachi Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hitachi Ltd filed Critical Hitachi Ltd
Priority to JP58102602A priority Critical patent/JPS59228180A/en
Publication of JPS59228180A publication Critical patent/JPS59228180A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C11/00 Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • G01C11/02 Picture taking arrangements specially adapted for photogrammetry or photographic surveying, e.g. controlling overlapping of pictures
    • G01C11/025 Picture taking arrangements specially adapted for photogrammetry or photographic surveying, e.g. controlling overlapping of pictures, by scanning the object

Abstract

PURPOSE: To calculate the position of the photographing point on earth corresponding to each pixel of a radar image, by using data on the position and attitude of a satellite and on the ground surface to determine the photographing point on earth corresponding to each point of the observed image. CONSTITUTION: The line number of the observed image is input to a satellite position/attitude calculation section 6 to determine the position and attitude of the satellite. A rotation matrix M between the earth reference coordinate system and the satellite reference coordinate system is obtained by a coordinate conversion matrix calculation section 3, and the ground surface DTM is converted into the satellite reference coordinate system by a coordinate conversion section 7. The intersection line between the beam plane and the converted ground surface DTM' is determined by an intersection calculation section 8. Meanwhile, the beam distance from the satellite to the photographing point is determined by a beam distance calculation section 9. The photographing point is determined by a point-of-intersection calculation section 10 and input to a coordinate conversion section 11 to be converted into the earth reference coordinate system.

Description

[Detailed Description of the Invention]

[Field of Application of the Invention]

The present invention relates to a method for generating simulated images, and more particularly to a radar image simulation method that determines, for each point of a radar image taken by a satellite- or aircraft-mounted radar, the position of the corresponding photographing point on the ground.

[Background of the Invention]

To analyze and make use of images observed from satellites, aircraft, and the like, the geometric distortion of the image must first be corrected so that the image registers with a map. The distortion of an observed image arises from variations in the position and attitude of the satellite or aircraft at each point of the image, and from the relief of the terrain.

In particular, when a satellite observes the ground surface obliquely below it, terrain relief causes parallax: as shown in Fig. 4, even though point 18 is actually imaged, an optical sensor images it as if it were point 22, while a radar images it as if it were point 23, equidistant from the satellite. Fig. 5 shows the flow of this three-dimensional distortion correction. In it, the three-dimensional distortion of the observed image 26 is simulated from the ground elevation data 25 (DTM) and the flight conditions of the satellite; by determining the position of the ground photographing point corresponding to each point of the observed image, the correspondence between each point of the observed image 26 and each point of the corrected image 27 produced by the stereoscopic distortion correction unit 28 can be obtained.

When simulating an image observed by a conventional scanning optical sensor mounted on a satellite, aircraft, or the like, an apparatus with the configuration shown in Fig. 1 is used. From the position (t, p) of each pixel (the basic unit of the image) of the observed image, and based on the time at which that pixel was photographed, the satellite position r (denoting a vector) and attitude θ (Euler angles) in the earth reference coordinate system 16 of Fig. 3 are obtained by the satellite position/attitude calculation unit 1. The line-of-sight vector calculation unit 2 obtains the line-of-sight vector s, which represents the direction of the sensor's line of sight in the satellite reference coordinate system 15 of Fig. 3. Next, using the rotation matrix M between the earth reference coordinate system 16 and the satellite reference coordinate system, computed from r and θ by the coordinate transformation matrix calculation unit 3, the coordinate conversion unit 4 converts s into the line-of-sight vector S in the earth reference coordinate system. Finally, the intersection calculation unit 5 computes the intersection of the straight line determined by the satellite position r and the line-of-sight vector S with the ground surface DTM, thereby obtaining the coordinates r_e of the ground photographing point corresponding to the pixel position (t, p). The DTM is often given approximately as a set of polygons in the earth reference coordinate system.
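The conventional pipeline above ends with a ray-surface intersection: the line through the satellite position r along the earth-frame line of sight S is intersected with the polygonal DTM. A minimal sketch of one such test, assuming triangular DTM facets (the patent says only "a set of polygons") and using the Möller-Trumbore method, which the patent does not specify:

```python
import numpy as np

def ray_triangle_intersection(r, s, tri):
    """Intersect the ray r + k*s (k >= 0) with one DTM triangle.

    r   : satellite position in the earth reference frame, shape (3,)
    s   : line-of-sight direction in the earth reference frame, shape (3,)
    tri : 3x3 array, one triangular facet of the polygonal DTM

    Returns the intersection point, or None if the ray misses the facet.
    """
    eps = 1e-12
    e1, e2 = tri[1] - tri[0], tri[2] - tri[0]
    p = np.cross(s, e2)
    det = np.dot(e1, p)
    if abs(det) < eps:            # ray parallel to the triangle plane
        return None
    t_vec = r - tri[0]
    u = np.dot(t_vec, p) / det    # first barycentric coordinate
    q = np.cross(t_vec, e1)
    v = np.dot(s, q) / det        # second barycentric coordinate
    k = np.dot(e2, q) / det      # distance along the ray
    if u < 0 or v < 0 or u + v > 1 or k < 0:
        return None
    return r + k * s
```

The ground point for a pixel would then be found by testing the ray against each facet and keeping the nearest hit.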

When simulating a radar image, however, the pixel position (t, p) does not determine a line-of-sight vector s as it does for an optical sensor; it determines only the length of the line of sight. The above method therefore cannot simulate a radar image.

[Object of the Invention]

An object of the present invention is to remedy this drawback of the conventional method and to provide a radar image simulation method that can determine the photographing point on the earth's surface with high accuracy for every pixel of the observed image.

[Summary of the Invention]

To achieve this object, the present invention expresses the earth-surface data in a satellite reference coordinate system whose origin is the satellite position, obtains the intersection line between that surface and the plane containing the radar pulse corresponding to each line of the observed image, and takes as a radius the line-of-sight length determined by the pixel position in the observed image. The photographing point is then found as the intersection of this intersection line with the circular arc of that radius, centered at the origin of the satellite reference coordinate system and lying in the radar-pulse plane. A feature of the method is that the intersection of an arc in three-dimensional space with the planar approximation of the ground surface is thereby computed on a two-dimensional plane.

[Embodiment of the Invention]

Fig. 2 shows the block configuration of an embodiment of the present invention.

First, for each line of the image the satellite emits a radar pulse in a plane perpendicular to its direction of travel and samples the reflected wave from the ground surface over time, thereby obtaining one line of the image.

The photographing time of a pixel (t, p) in the observed image is therefore determined by the line number t alone and does not depend on p. Accordingly, the satellite position r and attitude θ at that time are also obtained from t alone by the satellite position/attitude calculation unit 6.

The rotation matrix M between the earth reference coordinate system 16 and the satellite reference coordinate system 15 is computed from r and θ by the coordinate transformation matrix calculation unit 3, as in Fig. 1.
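The patent does not give the form of M or the Euler-angle convention it uses. Purely as an illustration, a rotation matrix built from a Z-Y-X (yaw-pitch-roll) sequence, an assumed convention, could look like this:

```python
import numpy as np

def rotation_matrix(roll, pitch, yaw):
    """Rotation matrix M from assumed Euler angles theta = (roll, pitch, yaw).

    A Z-Y-X sequence is assumed here for illustration only; the patent
    does not state its convention. Applying M to an earth-frame vector
    expresses it in the satellite frame.
    """
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    rx = np.array([[1, 0, 0], [0, cr, sr], [0, -sr, cr]])   # roll about x
    ry = np.array([[cp, 0, -sp], [0, 1, 0], [sp, 0, cp]])   # pitch about y
    rz = np.array([[cy, sy, 0], [-sy, cy, 0], [0, 0, 1]])   # yaw about z
    return rx @ ry @ rz
```

Any such M is orthonormal, so the inverse transform used later (satellite frame back to earth frame) is simply its transpose.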

The ground surface data DTM is assumed to be given as an approximation by a set of polygons in the earth reference coordinate system. Using r and M, the coordinate conversion unit 7 converts it into the satellite reference coordinate system to obtain DTM'. Further, as shown in Fig. 3, the radar pulse beam is taken to lie in the plane 12 (x = 0) of the satellite reference coordinate system 15, and the beam-plane intersection calculation unit 8 obtains the intersection line 13 (DTM'') of this plane with DTM'. Since the DTM is approximated by a set of polygons, DTM'' is a polyline.
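The intersection line DTM'' can be built facet by facet. A sketch for a single triangular facet of DTM' follows; the patent does not prescribe the facet shape or the clipping algorithm, so both are assumptions:

```python
import numpy as np

def plane_segment(tri):
    """Intersect one satellite-frame DTM' triangle with the beam plane x = 0.

    tri : 3x3 array of satellite-frame vertices.
    Returns the (y, z) endpoints of the resulting piece of the polyline
    DTM'', or None if the triangle does not cross the plane.
    """
    pts = []
    for i in range(3):
        a, b = tri[i], tri[(i + 1) % 3]
        xa, xb = a[0], b[0]
        if xa == xb:                      # edge parallel to the plane
            continue
        s = -xa / (xb - xa)               # parameter where x crosses 0
        if 0.0 <= s <= 1.0:
            p = a + s * (b - a)
            pts.append((p[1], p[2]))      # keep only (y, z): 2-D reduction
    return tuple(pts[:2]) if len(pts) >= 2 else None
```

Collecting one such segment per crossed facet and chaining them yields the polyline 13 used below.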

Meanwhile, from the p component of the pixel (t, p), the beam distance d from the satellite 14 to the photographing point 18 is computed by the beam distance calculation unit 9, as shown in Fig. 3.
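The patent does not state how unit 9 maps p to d. For a radar that samples the echo uniformly in time, a plausible relation is d = d0 + p·c·Δt/2 (two-way travel time converted to range); all names here are illustrative assumptions:

```python
def beam_distance(p, d0, dt, c=299_792_458.0):
    """Assumed slant-range model for pixel number p within a line.

    d0 : range of the first sample (m), assumed known from the timing
    dt : echo sampling interval (s)
    c  : propagation speed (m/s)
    Pixel p is taken to correspond to two-way delay t0 + p*dt, so the
    one-way range grows by c*dt/2 per pixel.
    """
    return d0 + p * c * dt / 2.0
```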

The photographing point therefore lies in the plane 12, on the arc 17 centered on the position of the satellite 14 and of radius d. The photographing point 18 is determined by finding the intersection of this arc 17 with the previously obtained polyline 13 (DTM''). Since everything here lies in the plane 12 (x = 0), the processing need only be carried out on the y and z coordinates, reducing the problem to two dimensions.

The intersection calculation unit 10 finds the intersection of the arc and the polyline by applying the following processing to each line segment of the polyline. Referring to Fig. 3, the distances from the satellite 14 to the two endpoints 20 and 21 of the segment are computed; if the radius d of the arc 17 lies between them, the arc and the segment intersect. Next, the same distance is computed for the midpoint 19 of the segment and compared with d to decide on which side of the midpoint the intersection 18 lies. In Fig. 3 the intersection 18 lies between the midpoint 19 and the endpoint 21, so the same processing is then applied to the segment 19-21. Repeating this process bisects the segment again and again, converging on a single point and thereby determining the intersection 18. Let its coordinates be Q.
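The bisection just described can be sketched in the (y, z) plane as follows. The tolerance and iteration cap are illustrative additions not in the patent, and the satellite is placed at the origin of the 2-D frame, as the text implies:

```python
import numpy as np

def arc_segment_intersection(a, b, d, tol=1e-9, max_iter=100):
    """Bisection search for the intersection of the arc of radius d,
    centered on the satellite (the origin), with the segment a-b,
    all in (y, z) coordinates.

    Returns the intersection point, or None when d does not lie
    between the endpoint distances (no crossing of this segment).
    """
    a, b = np.asarray(a, float), np.asarray(b, float)
    ra, rb = np.linalg.norm(a), np.linalg.norm(b)
    if not (min(ra, rb) <= d <= max(ra, rb)):
        return None
    for _ in range(max_iter):
        m = 0.5 * (a + b)                 # midpoint of the current segment
        rm = np.linalg.norm(m)
        if abs(rm - d) < tol:
            return m
        # keep the half whose endpoint distances still bracket d
        if min(ra, rm) <= d <= max(ra, rm):
            b, rb = m, rm
        else:
            a, ra = m, rm
    return 0.5 * (a + b)
```

Each iteration halves the segment, so the search converges quickly without solving the quadratic line-circle equation explicitly.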

If the coordinates of Q are (y, z), then Q is (0, y, z) in the satellite reference coordinate system 15; the coordinate conversion unit 11 converts this, using M and r, into the earth reference coordinate system 16 to obtain the coordinates r_e.
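Assuming M rotates earth-frame vectors into the satellite frame (the patent does not state the direction of its convention), the final conversion by unit 11 is the inverse transform, which for a rotation matrix is its transpose:

```python
import numpy as np

def to_earth_frame(q_yz, M, r):
    """Convert Q = (0, y, z), expressed in the satellite reference
    frame, back to the earth reference frame.

    M : rotation matrix, assumed earth-to-satellite
    r : satellite position in the earth frame
    The inverse transform is then r_e = M^T q + r.
    """
    q_sat = np.array([0.0, q_yz[0], q_yz[1]])
    return M.T @ q_sat + r
```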

Performing this processing for every pixel number p of the t-th line of the image completes the processing of one line. For the pixels of the (t+1)-th line, processing starts over from the recalculation of the satellite position r and attitude θ.

By the processing described above, the corresponding ground-surface coordinates r_e are computed for every pixel (t, p) of the observed image.

[Effect of the Invention]

As explained above, the present invention has the effect that the position of the ground photographing point corresponding to each pixel of a radar image can be computed accurately. A secondary effect is that the computation is efficient, since the image is processed one line at a time.

[Brief Description of the Drawings]

Fig. 1 is a block diagram of a conventional apparatus for simulating an image observed by a scanning optical sensor; Fig. 2 is a block diagram of the radar image simulation apparatus according to the present invention; Fig. 3 is a diagram explaining the computation of the intersection line between the beam plane and the ground surface and of its intersection with the beam arc; Fig. 4 is a diagram showing the principle of parallax in images observed by an optical sensor or a radar; and Fig. 5 is a schematic diagram of the three-dimensional distortion correction of satellite images.

Claims (1)

[Claims] 1. A radar image simulation method characterized in that, for a radar image from a satellite, aircraft, or the like, the position of the ground photographing point corresponding to each point on the observed image is calculated using data on the position and attitude of the satellite and on the ground surface. 2. The radar image simulation method according to claim 1, characterized in that the position of the ground photographing point is determined by the steps of expressing the ground-surface data in a coordinate system referenced to the satellite, obtaining the intersection line of that surface with the plane containing the radar beam, and computing, in a two-dimensional coordinate system, the intersection of this intersection line with a circular arc whose radius is determined by the pixel position on the image.
JP58102602A 1983-06-10 1983-06-10 Radar image simulation system Pending JPS59228180A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP58102602A JPS59228180A (en) 1983-06-10 1983-06-10 Radar image simulation system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP58102602A JPS59228180A (en) 1983-06-10 1983-06-10 Radar image simulation system

Publications (1)

Publication Number Publication Date
JPS59228180A true JPS59228180A (en) 1984-12-21

Family

ID=14331778

Family Applications (1)

Application Number Title Priority Date Filing Date
JP58102602A Pending JPS59228180A (en) 1983-06-10 1983-06-10 Radar image simulation system

Country Status (1)

Country Link
JP (1) JPS59228180A (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR2611266A1 (en) * 1987-02-19 1988-08-26 Centre Nat Etd Spatiales Device and method for locating views on the ground taken by a satellite
CN106597434A (en) * 2016-11-28 2017-04-26 中国人民解放军国防科学技术大学 Agile satellite target decomposition method and system based on push-scan trajectories
CN106597434B (en) * 2016-11-28 2019-04-30 中国人民解放军国防科学技术大学 It is a kind of based on pushing away the quick Satellite Targets decomposition method and system for sweeping track
CN110986886A (en) * 2019-12-18 2020-04-10 中国科学院长春光学精密机械与物理研究所 Double-camera dynamic rotation scanning three-dimensional imaging simulation device
CN113640799A (en) * 2021-08-11 2021-11-12 北京无线电测量研究所 Method and device for determining central irradiation point of radar beam and storage medium

Similar Documents

Publication Publication Date Title
JP4685313B2 (en) Method for processing passive volumetric image of any aspect
WO2019127445A1 (en) Three-dimensional mapping method, apparatus and system, cloud platform, electronic device, and computer program product
CN105677942B (en) A kind of spaceborne natural scene SAR complex image data rapid simulation method of repeat track
JP4448187B2 (en) Image geometric correction method and apparatus
JPH07107548B2 (en) Positioning method using artificial satellites
CN107527382B (en) Data processing method and device
CN112184786B (en) Target positioning method based on synthetic vision
CN102506867A (en) SINS (strap-down inertia navigation system)/SMANS (scene matching auxiliary navigation system) combined navigation method based on Harris comer matching and combined navigation system
CN108733711B (en) Distribution line space distance obtaining method based on three-dimensional GIS technology
CN108062788A (en) A kind of three-dimensional rebuilding method, device, equipment and medium
CN103344958B (en) Based on the satellite-borne SAR high-order Doppler parameter evaluation method of almanac data
Hattori et al. Orientation of high-resolution satellite images based on affine projection
CN112461204B (en) Method for satellite to dynamic flying target multi-view imaging combined calculation of navigation height
Burkard et al. User-aided global registration method using geospatial 3D data for large-scale mobile outdoor augmented reality
CN110411449B (en) Aviation reconnaissance load target positioning method and system and terminal equipment
JPS59228180A (en) Radar image simulation system
Ono et al. Digital mapping using high resolution satellite imagery based on 2D affine projection model
CN109003295B (en) Rapid matching method for aerial images of unmanned aerial vehicle
CN114964248A (en) Target position calculation and indication method for motion trail out of view field
CN109341685B (en) Fixed wing aircraft vision auxiliary landing navigation method based on homography transformation
JP3024666B2 (en) Method and system for generating three-dimensional display image of high-altitude image
CN114459461B (en) Navigation positioning method based on GIS and real-time photoelectric video
CN112712559B (en) SfM point cloud correction method based on NED coordinate system vector rotation
JP3290845B2 (en) Radar image processing device
CN115775324B (en) Phase correlation image matching method under guidance of cross scale filtering