JP2003148897A - Shot control device - Google Patents

Shot control device

Info

Publication number
JP2003148897A
Authority
JP
Japan
Prior art keywords
image
shooting
vehicle body
impact
visible
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
JP2001351215A
Other languages
Japanese (ja)
Inventor
Shizuo Karasawa
鎮男 唐澤
Current Assignee
Mitsubishi Electric Corp
Original Assignee
Mitsubishi Electric Corp
Priority date
Filing date
Publication date
Application filed by Mitsubishi Electric Corp filed Critical Mitsubishi Electric Corp
Priority to JP2001351215A
Publication of JP2003148897A
Legal status: Pending


Abstract

PROBLEM TO BE SOLVED: In a vehicle-mounted shooting control device that calculates the impact deviation by acquiring an image at aiming with a visible imaging unit and an image at impact with an ultraviolet imaging unit and compositing the two, accurate calculation requires that both imaging units remain fixed in inertial space; the shock of firing therefore introduces an error into the calculated impact deviation.

SOLUTION: The amount by which the vehicle body is jolted at the moment of impact by the firing shock is detected, and the image composition processing unit corrects the impact image of the ultraviolet imaging unit for this variation. Matching processing is further performed between the aiming image of the visible imaging unit and the image of the visible imaging unit taken when the vehicle body is stationary after firing, so that the azimuth variation of the vehicle body in inertial space is also corrected, reducing the error of the calculated impact deviation.

Description

Detailed Description of the Invention

[0001]

[Technical Field of the Invention] The present invention relates to an improvement of a vehicle-mounted shooting control device that calculates the impact deviation, and more particularly to a shooting control device that reduces the error in the detected impact deviation caused by the motion of the vehicle body produced by the shock of firing.

[0002]

[Prior Art] FIG. 4 illustrates an application example of a conventional shooting control device. In the figure, 1 is a visible imaging unit, 2 is an ultraviolet imaging unit, 3a is a visible image processing unit, 4 is an ultraviolet image processing unit, 5 is a firing trigger signal line input from the outside, 6 is an aiming image, 7 is an impact image, 8a is an image composition processing unit, and 9a is an impact deviation output signal line.

[0003] The conventional technique is described here. When firing a gun or cannon, the operator aims at the target and fires. However, depending on the characteristics of the weapon and on environmental conditions such as temperature, the first round does not always hit. When firing again after a miss, if the difference between the point aimed at on the first round and the point where the round actually landed, that is, the impact deviation, is known, correcting the aiming point by that deviation improves the probability of the next hit. Confirming the impact deviation with the naked eye, however, is difficult and, even when possible, takes time, lengthening the interval before the next shot. Therefore, by providing a visible imaging unit 1 and an ultraviolet imaging unit 2 and compositing their images, the impact deviation can be calculated automatically and the time to the next shot can be shortened. The reason both visible and ultraviolet images are required is as follows.

[0004] In the visible image, the aiming point can be seen, but the impact point is difficult to see because of the flame generated by the gun or cannon at firing. In the ultraviolet image, by contrast, the impact point can be seen because the round emits light in the ultraviolet region on impact, but the aiming point is difficult to see.

[0005] In the conventional technique, the image composition processing unit 8a composites the aiming image 6 output by the visible image processing unit 3a with the impact image 7 output by the ultraviolet image processing unit 4, and outputs the impact deviation 9a. FIG. 5 illustrates the processing performed by the image composition processing unit 8a: the aiming image 10 and the impact image 11 are superimposed to create a composite image 12, and the impact deviation 9a is output by calculating the difference between the aiming point and the impact point.
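The composition step of FIG. 5 reduces, in essence, to differencing two point locations expressed in a common image frame and scaling by the camera's angular resolution. A minimal sketch follows; the 640-pixel width, the 2-degree field of view, and the sign convention are illustrative assumptions, not values from the patent.

```python
import numpy as np

# Illustrative camera model: assumed values, not parameters from the patent.
PIXELS = 640                            # assumed sensor width in pixels
RAD_PER_PX = np.deg2rad(2.0) / PIXELS   # assumed angular scale per pixel

def impact_deviation_mrad(aim_px, impact_px):
    """Difference the aiming point (from the visible image) and the impact
    point (from the ultraviolet image), both expressed in one pixel frame,
    and convert the result to an angular deviation in milliradians."""
    d = np.asarray(impact_px, float) - np.asarray(aim_px, float)
    return d * RAD_PER_PX * 1000.0

dev = impact_deviation_mrad(aim_px=(320, 240), impact_px=(332, 236))
# dev[0] is positive here because the measured impact point lies 12 pixels
# to one side of the aim point; the sign convention follows image
# coordinates and is itself an assumption.
```

Correcting the next shot then amounts to offsetting the aiming point by `-dev`.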

[0006]

[Problems to be Solved by the Invention] In the shooting control device described above, it is a necessary condition that the visible imaging unit 1 and the ultraviolet imaging unit 2 be fixed so that the aiming image 6 and the impact image 7 remain relatively aligned. In a vehicle-mounted shooting control device, however, the shock of firing the gun or cannon introduces an error into the impact deviation 9a. FIG. 6 illustrates this error: because the aiming image 6 and the impact image 7 shift relative to each other, the composite image no longer matches the composite image 12 of FIG. 5, and, as shown by the image 13 illustrating the impact deviation error, an error appears in the impact deviation 9a.

[0007] There are two causes of this error. One is that the azimuth of the ultraviolet imaging unit 2 at the moment of impact has moved because of the above shock, so that the aiming image 6 and the impact image 7 no longer align. The other is that the azimuth of the vehicle body has moved between before and after firing because of the shock, so that the aiming image 6 and the impact image 7 no longer align.

[0008] The present invention was made to address these problems, and proposes a shooting control device that corrects the azimuth variation of the vehicle body and the like caused by the shock of firing, thereby reducing the calculation error of the impact deviation.

[0009]

[Means for Solving the Problems] The shooting control device according to the first invention corrects the variation in the azimuth of the ultraviolet imaging unit using the movement angle of the vehicle body with respect to inertial space.

[0010] The shooting control device according to the second invention corrects the variation in the azimuth of the vehicle body by performing matching processing between the visible image captured once the firing shock has subsided and the visible image captured at aiming.

[0011] The shooting control device according to the third invention corrects the variation in the azimuth of the ultraviolet imaging unit using the movement angle of the vehicle body with respect to inertial space, and further corrects the variation in the azimuth of the vehicle body by performing matching processing between the visible image captured once the firing shock has subsided and the visible image captured at aiming.

[0012]

[Description of the Preferred Embodiments] Embodiment 1. FIG. 1 shows Embodiment 1 of the present invention. In the figure, 1 is a visible imaging unit, 2 is an ultraviolet imaging unit, 3a is a visible image processing unit, 4 is an ultraviolet image processing unit, 5 is a firing trigger signal line input from the outside, 6 is an aiming image, 7 is an impact image, 8b is an image composition processing unit, 9b is an impact deviation output signal line, 14 is a spatial angle detection unit, and 15 is the movement angle of the vehicle body with respect to inertial space.

[0013] The spatial angle detection unit 14 detects the movement angle 15 of the vehicle body relative to the inertial space reference caused by firing, and outputs it to the image composition processing unit 8b. The ultraviolet imaging unit 2, being fixed to the vehicle body, has moved relative to the inertial space reference by the same amount as the movement angle 15 at the moment of impact. The impact point shown in the impact image 7 therefore contains an error equal to the movement angle 15. The image composition processing unit 8b accordingly corrects the impact point in the impact image 7 by the movement angle 15, and calculates the impact deviation by comparing the result with the aiming point in the aiming image 6.
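This correction can be sketched as follows. The interfaces and sign conventions are assumptions for illustration; the patent does not specify how the movement angle 15 is represented. The measured impact point is shifted back by the body's inertial rotation before the deviation is formed.

```python
import numpy as np

RAD_PER_PX = np.deg2rad(2.0) / 640   # assumed angular scale per pixel

def correct_impact_point(impact_px, body_angle_rad):
    """Subtract the vehicle-body rotation (azimuth, elevation) in inertial
    space, as reported by the spatial angle detection unit 14, from the
    impact point read off the ultraviolet impact image 7."""
    shift_px = np.asarray(body_angle_rad, float) / RAD_PER_PX
    return np.asarray(impact_px, float) - shift_px

def deviation_mrad(aim_px, impact_px, body_angle_rad):
    corrected = correct_impact_point(impact_px, body_angle_rad)
    return (corrected - np.asarray(aim_px, float)) * RAD_PER_PX * 1000.0

# If the entire 12-pixel offset was produced by body motion, the corrected
# deviation collapses to zero:
dev = deviation_mrad((320, 240), (332, 240), (12 * RAD_PER_PX, 0.0))
```

Without the correction the same measurement would have reported a spurious 12-pixel deviation.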

[0014] This corrects the amount by which the azimuth of the ultraviolet imaging unit 2 was moved by the firing shock at the moment of impact, reducing the calculation error of the impact deviation.

[0015] Embodiment 2. FIG. 2 shows Embodiment 2 of the present invention. In the figure, 1 is a visible imaging unit, 2 is an ultraviolet imaging unit, 3b is a visible image processing unit, 4 is an ultraviolet image processing unit, 5 is a firing trigger signal line input from the outside, 6 is an aiming image, 7 is an impact image, 8c is an image composition processing unit, 9c is an impact deviation output signal line, 16 is a timer that counts the time during which the firing shock persists, 17 is a visible image taken when the firing shock has subsided and the vehicle body is stationary, 18 is a matching processing unit, and 19 is the variation of the vehicle body azimuth between before and after firing.

[0016] The visible image processing unit 3b outputs to the matching processing unit 18 the visible image captured when the firing trigger signal line 5 turns on, that is, the aiming image 6, and the visible image captured T seconds after the trigger, once the sway of the vehicle body caused by firing has subsided, that is, the vehicle-body-stationary image 17. The matching processing unit 18 computes the displacement between these two images and outputs it to the image composition processing unit 8c, which corrects the aiming image 6 by this displacement.
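The patent says only that "matching processing" is performed; one standard way to estimate the displacement between the aiming image 6 and the stationary image 17 is phase correlation, sketched here as an assumed, illustrative implementation of the matching processing unit 18.

```python
import numpy as np

def estimate_shift(img_a, img_b):
    """Estimate the integer (row, col) shift d such that
    np.roll(img_b, d, axis=(0, 1)) best aligns with img_a,
    using phase correlation."""
    cross_power = np.fft.fft2(img_a) * np.conj(np.fft.fft2(img_b))
    cross_power /= np.abs(cross_power) + 1e-12   # keep phase only
    corr = np.fft.ifft2(cross_power).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # indices past N/2 correspond to negative (wrapped-around) shifts
    return tuple(int(p - n) if p > n // 2 else int(p)
                 for p, n in zip(peak, corr.shape))

rng = np.random.default_rng(0)
aiming = rng.random((64, 64))                        # stand-in for image 6
still = np.roll(aiming, shift=(3, -5), axis=(0, 1))  # stand-in for image 17
shift = estimate_shift(aiming, still)                # → (-3, 5)
```

Scaled by the angular resolution per pixel, the estimated shift gives the azimuth variation 19 that the image composition processing unit 8c applies to the aiming image.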

[0017] This corrects the variation of the vehicle body azimuth between before and after firing, reducing the calculation error of the impact deviation.

[0018] Embodiment 3. FIG. 3 shows Embodiment 3 of the present invention. In the figure, 1 is a visible imaging unit, 2 is an ultraviolet imaging unit, 3b is a visible image processing unit, 4 is an ultraviolet image processing unit, 5 is a firing trigger signal line input from the outside, 6 is an aiming image, 7 is an impact image, 8d is an image composition processing unit, 9d is an impact deviation output signal line, 14 is a spatial angle detection unit, 15 is the movement angle of the vehicle body with respect to inertial space, 16 is a timer that counts the time during which the firing shock persists, 17 is a visible image taken when the firing shock has subsided and the vehicle body is stationary, 18 is a matching processing unit, and 19 is the variation of the vehicle body azimuth between before and after firing.

[0019] The visible image processing unit 3b outputs to the matching processing unit 18 the visible image captured when the firing trigger signal line 5 turns on, that is, the aiming image 6, and the visible image captured T seconds after the trigger, once the sway of the vehicle body caused by firing has subsided, that is, the vehicle-body-stationary image 17. The matching processing unit 18 computes the displacement between these two images and outputs it to the image composition processing unit 8d, which corrects the aiming image 6 by this displacement to calculate the aiming point.

[0020] The spatial angle detection unit 14 detects the movement angle 15 of the vehicle body relative to the inertial space reference caused by firing, and outputs it to the image composition processing unit 8d. The ultraviolet imaging unit 2, being fixed to the vehicle body, has moved relative to the inertial space reference by the same amount as the movement angle 15 at the moment of impact. The impact point shown in the impact image 7 therefore contains an error equal to the movement angle 15. The image composition processing unit 8d accordingly corrects the impact point in the impact image 7 by the movement angle 15 to calculate the impact point.
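The two corrections of this embodiment can be combined in a single computation. Signs, interfaces, and the angular scale are again illustrative assumptions rather than values from the patent.

```python
import numpy as np

RAD_PER_PX = np.deg2rad(2.0) / 640   # assumed angular scale per pixel

def deviation_mrad(aim_px, impact_px, body_angle_rad, body_shift_px):
    """Embodiment-3 style computation: remove the inertial rotation of the
    ultraviolet imaging unit from the impact point, apply the
    matching-derived pre/post-shot shift to the aiming point, then
    difference the two corrected points."""
    impact = (np.asarray(impact_px, float)
              - np.asarray(body_angle_rad, float) / RAD_PER_PX)
    aim = np.asarray(aim_px, float) + np.asarray(body_shift_px, float)
    return (impact - aim) * RAD_PER_PX * 1000.0

# A case where the two disturbances fully explain the raw 12-pixel offset,
# so the corrected deviation is zero:
dev = deviation_mrad((320, 240), (332, 240),
                     body_angle_rad=(10 * RAD_PER_PX, 0.0),
                     body_shift_px=(2.0, 0.0))
```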

[0021] The image composition processing unit 8d then compares the corrected aiming point with the corrected impact point to calculate the impact deviation.

[0022] This corrects the amount by which the azimuth of the ultraviolet imaging unit 2 was moved by the firing shock at the moment of impact, and further corrects the variation of the vehicle body azimuth between before and after firing, reducing the calculation error of the impact deviation.

[0023]

[Effects of the Invention] According to the first invention, a vehicle-mounted shooting control device can obtain an impact deviation with a reduced error caused by the sway of the vehicle body at the moment of impact due to the firing shock.

[0024] According to the second invention, a vehicle-mounted shooting control device can obtain an impact deviation with a reduced error caused by the variation of the vehicle body azimuth due to the firing shock.

[0025] According to the third invention, a vehicle-mounted shooting control device can obtain an impact deviation with reduced errors caused both by the sway of the vehicle body at the moment of impact due to the firing shock and by the variation of the vehicle body azimuth due to the firing shock.

[Brief Description of the Drawings]

[FIG. 1] A diagram showing Embodiment 1 of the shooting control device according to the present invention.

[FIG. 2] A diagram showing Embodiment 2 of the shooting control device according to the present invention.

[FIG. 3] A diagram showing Embodiment 3 of the shooting control device according to the present invention.

[FIG. 4] A diagram showing a conventional shooting control device.

[FIG. 5] A diagram showing the conventional procedure for calculating the impact deviation.

[FIG. 6] A diagram showing the error in the impact deviation when a conventional shooting control device is mounted on a vehicle.

[Explanation of Reference Numerals] 1 visible imaging unit, 2 ultraviolet imaging unit, 3a visible image processing unit, 3b visible image processing unit, 4 ultraviolet image processing unit, 5 firing trigger signal line, 6 aiming image, 7 impact image, 8a image composition processing unit, 8b image composition processing unit, 8c image composition processing unit, 9a impact deviation output signal line, 9b impact deviation output signal line, 9c impact deviation output signal line, 10 aiming image, 11 impact image, 12 composite image, 13 image illustrating the impact deviation error, 14 spatial angle detection unit, 15 movement angle with respect to inertial space, 16 timer, 17 vehicle-body-stationary image, 18 matching processing unit, 19 variation of vehicle body azimuth.

Claims (3)

[Claims]
[Claim 1] A vehicle-mounted shooting control device that calculates an impact deviation using a visible captured image and an ultraviolet captured image, characterized in that the movement angle of the ultraviolet captured image in inertial space is detected to reduce the error in the impact deviation caused by the shock of firing.
[Claim 2] A vehicle-mounted shooting control device that calculates an impact deviation using a visible captured image and an ultraviolet captured image, characterized in that the variation of the vehicle between before and after firing is calculated from a feature comparison of the visible captured image at firing and the visible captured image after firing, to reduce the error in the impact deviation caused by the shock of firing.
[Claim 3] A vehicle-mounted shooting control device that calculates an impact deviation using a visible captured image and an ultraviolet captured image, characterized in that the movement angle of the ultraviolet captured image in inertial space is detected, and the variation of the vehicle between before and after firing is calculated from a feature comparison of the visible captured image at firing and the visible captured image after firing, to reduce the error in the impact deviation caused by the shock of firing.
JP2001351215A 2001-11-16 2001-11-16 Shot control device Pending JP2003148897A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2001351215A JP2003148897A (en) 2001-11-16 2001-11-16 Shot control device


Publications (1)

Publication Number Publication Date
JP2003148897A true JP2003148897A (en) 2003-05-21

Family

ID=19163550

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2001351215A Pending JP2003148897A (en) 2001-11-16 2001-11-16 Shot control device

Country Status (1)

Country Link
JP (1) JP2003148897A (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011027313A (en) * 2009-07-24 2011-02-10 Ihi Aerospace Co Ltd Hit observation system
WO2013153306A1 (en) * 2012-04-12 2013-10-17 Philippe Levilly Remotely operated target-processing system
FR2989456A1 (en) * 2012-04-12 2013-10-18 Philippe Levilly TELEOPERATED TARGET PROCESSING SYSTEM
US9671197B2 (en) 2012-04-12 2017-06-06 Philippe Levilly Remotely operated target-processing system
KR101376689B1 (en) 2012-12-13 2014-03-20 국방과학연구소 Method for compensating fluctuation error of gun fire control system using gun barrel image

Similar Documents

Publication Publication Date Title
US8400619B1 (en) Systems and methods for automatic target tracking and beam steering
US5929444A (en) Aiming device using radiated energy
US6549872B2 (en) Method and apparatus for firing simulation
EP0929787B1 (en) Target aiming system
US20080163536A1 (en) Sighting Mechansim For Fire Arms
CN109154486B (en) Bore sighting device and method
KR101468160B1 (en) Training system for improving shooting accuracy and its control method
US6497171B2 (en) Method for correcting dynamic gun errors
US20050262993A1 (en) Targeting systems
JP2003148897A (en) Shot control device
JP3878360B2 (en) Small weapon aiming device
KR100751503B1 (en) Target practice device
US20080192979A1 (en) Shot pattern and target display
JP4961619B2 (en) Control device
JP2000356500A5 (en)
JPS5984098A (en) Automatic gun sighting system for aircraft
JP2006119070A (en) Ballistic position measuring device and aiming implement for small firearm
JPH11183096A (en) Landing observation image processor
AU754674B2 (en) Shooting simulation method
JP2000171199A (en) Passive impact splash standardizing apparatus
JP4695414B2 (en) Shooting training equipment
US20160018196A1 (en) Target scoring system and method
JP3905770B2 (en) Cannonball position detection method, apparatus, program, and recording medium
JPH10122793A (en) Method for aiming gun according to trajectory locus, aiming apparatus and firearm controller
JP2006207876A (en) Shooting training system