WO2017222490A2 - Video ammunition and laser position assessment method - Google Patents

Video ammunition and laser position assessment method Download PDF

Info

Publication number
WO2017222490A2
WO2017222490A2 (PCT/TR2017/050283)
Authority
WO
WIPO (PCT)
Prior art keywords
camera
cos
sin
calculation
horizontal axis
Prior art date
Application number
PCT/TR2017/050283
Other languages
French (fr)
Other versions
WO2017222490A3 (en)
Inventor
Ali TEKIN
Original Assignee
Havelsan Hava Elektronik Sanayi Ve Ticaret Anonim Sirketi
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Havelsan Hava Elektronik Sanayi Ve Ticaret Anonim Sirketi filed Critical Havelsan Hava Elektronik Sanayi Ve Ticaret Anonim Sirketi
Publication of WO2017222490A2 publication Critical patent/WO2017222490A2/en
Publication of WO2017222490A3 publication Critical patent/WO2017222490A3/en

Links

Classifications

    • F MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F41 WEAPONS
    • F41J TARGETS; TARGET RANGES; BULLET CATCHERS
    • F41J5/00Target indicating systems; Target-hit or score detecting systems
    • F41J5/10Cinematographic hit-indicating systems


Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Measurement Of Optical Distance (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention relates to a Video Ammunition and Laser Position Assessment Method for calculating the positions on the earth of the traces left by ammunition or laser designator markers in camera images, characterized in that, by means of software, it conducts: calculation of the horizontal length of a pixel of a camera sensor on the earth projection; calculation of the vertical length of a pixel of a camera sensor on the earth projection (r); admission of the angle made by the camera sensor horizontal axis with North, measured by use of a real time kinematics GPS/measurement-calculation station, as the horizontal axis angle of the camera sensor; calculation of the distance of the point to be found out from the camera horizontal axis after the target (6) is positioned in the exact middle of the horizontal and vertical image of the camera; and calculation of the coordinates of the point by use of the distances and horizontal axis angles obtained from more than one camera.

Description

DESCRIPTION
Video Ammunition and Laser Position Assessment Method
The Related Art
The invention relates to a Video Ammunition and Laser Position Assessment Algorithm/Method (VALPAA) for calculating the positions on the earth of the traces left by ammunition or a laser designator marker in camera images.
VALPAA has been developed for use in the Video Ammunition and Laser Assessment System (VALAS) and is used to calculate exactly where the ammunition impact and/or laser designator marking captured by VALAS is located on the earth.
Background of the Invention
In the related prior art, assessment of ammunition shot at targets at military shooting ranges (for instance the TURKEY KONYA EHTES field and the Saudi Arabia TAIF Range before modernisation) is made manually by operators after shooting, using optical levels at 2 different towers. In this case, assessment of a target is not only slow but also prone to operator error, since the calculations are made by 2 different operators. Because the assessment is slow, a certain period must be waited before the assessment of a second target can start after the assessment of one target is completed during training activities.
Another available field (the KOREA Taebaek Field) has video-based assessment. However, this system is extremely dependent on conditions such as night/day time, season (summer, winter, fall, spring), type of ammunition, sunshine, clear/cloudy weather, passing clouds etc., and requires manual intervention. The system supports only one (1) target. The said system does not have a laser assessment capability.
Daytime cameras are used in the above mentioned video-based assessment system (in the KOREA Taebaek Field). When weather conditions change, the image taken by the camera changes considerably and sometimes immediately (subject to reflection of sunshine at a given moment, passing clouds or even seasonal conditions; green or yellow tones etc. are dominant in some seasons). In such a case, automatic assessment algorithms fail to adapt to these immediate changes, since the algorithms are not as capable as the human eye. The operator/user is then expected to intervene in the software continuously, rewind the recorded videos, mark certain places in the video and execute the position calculation manually.
As a result, due to the above described disadvantages and the inadequacy of existing solutions, it has been necessary to make a development in the related art.
Purpose of the Invention
The invention has been developed with inspiration from existing situation and aims to eliminate the above mentioned disadvantages.
The purpose of the Video Ammunition and Laser Position Assessment Algorithm disclosed hereunder is to calculate the position of the traces left by ammunition or a laser designator marker in camera images by using camera and target positions and angles, without the need for any interaction from the operator. The value calculated thereby is free of errors. Additionally, the algorithm also allows use of two (2) or more cameras, so the measurements can be made even more accurately.
In order to achieve the above mentioned purposes, a Video Ammunition and Laser Position Assessment Method realizing calculation of the positions on the earth of the traces left by ammunition or a laser designator marker in camera images has been developed. This method is characterized in that, by use of software installed on a computer, it realizes:
- calculation of horizontal length of a pixel of a camera sensor on the earth projection,
- calculation of vertical length of a pixel of a camera sensor on the earth projection,
- admission of angle made by camera sensor horizontal axis measured by use of real time kinematics GPS/measurement-calculation station with North as the horizontal axis angle of the camera sensor,
- calculation of the distance of the point needed to be found out from the camera horizontal axis after the target is positioned in the exact middle of the horizontal and vertical image of the camera,
- calculation of coordinates of the point to be found out, by use of the distance and horizontal axis angle obtained from more than one camera.
The structural and characteristic features of the invention and all of its advantages will be understood better from the detailed descriptions given below with reference to the figures, and therefore the assessment should be made taking into account the said figures and detailed explanations.
Brief Description of the Drawings
Figure 1 shows various parameters used in the video ammunition and laser position assessment method disclosed hereunder.
Figure 2 shows the farthest place in the sensor intersection of a camera resolution pixel.
Figure 3 shows the sensor plane.
Figure 4 shows the view created as a result of viewing the camera sensor perpendicular to the optical axis.
Figure 5 shows the homograph of the camera sensor, that is, its earth projection.
Figure 6 shows a top view and an enlarged view of Figure 3 to make the facing camera sensor more understandable.
Figure 7 is the drawing showing interrelations of components contained in Video Ammunition and Laser Assessment System (VALAS).
Description of Part References
1a. Camera 1
1b. Camera 2
1c. Camera n
2a. Camera 1 sensor horizontal axis
2b. Camera 2 sensor horizontal axis
2c. Camera n sensor horizontal axis
d1. Distance of point to be found out from the camera 1 sensor horizontal axis
d2. Distance of point to be found out from the camera 2 sensor horizontal axis
dn. Distance of point to be found out from the camera n sensor horizontal axis
γ1. Angle between camera 1 sensor horizontal axis and North
γ2. Angle between camera 2 sensor horizontal axis and North
γ3. Angle between camera n sensor horizontal axis and North
5a. Line 1
5b. Line 2
5c. Line n
6. Target
m. Point to be found out
8a. Earth projection polygon of the pixel corresponding to ammunition trace point on camera 1 image sensor
8b. Earth projection polygon of the pixel corresponding to ammunition trace point on camera 2 image sensor
9. Sensor pixel
10. Short edge of area represented by a pixel on the earth existing on camera sensor
k. The longest edge of the polygon constituted by set of earth projection polygon of the pixel corresponding to ammunition trace point on camera 1 image sensor and earth projection polygon of the pixel corresponding to ammunition trace point on camera 2 image sensor
12. Total camera horizontal pixel
13. Total camera vertical pixel
15. Camera sensor homograph
16. Lower distance
c. Side distance
h. Height
β. Elevation angle of the camera sensor plane with the target
20. Sensor plane
q. Unit vector made by camera sensor plane in line with target
n. Unit vector perpendicular to unit vector made by camera sensor plane in line with target and extending upward
r. Vertical length of a pixel of a camera sensor on the earth projection
24. Camera image topology on the tower
25. Earth width seen by camera
f. Camera focal distance
27. Camera 1 sensor axis
28. Camera 2 sensor axis
29. Camera n sensor axis
30. Camera 1 optic axis
31. Camera 2 optic axis
33a. MWIR Camera 1
33b. MWIR Camera 2
33c. MWIR Camera 3
33d. MWIR Camera 4
33e. MWIR Camera n
34a. Pan-Tilt 1
34b. Pan-Tilt 2
34c. Pan-Tilt 3
34d. Pan-Tilt 4
34e. Pan-Tilt n
35a. Target 1
35b. Target 2
35c. Target n
35d. Target 3
37a. Video Server 1
37b. Video Server 2
37c. Video Server 3
38a. Serial Device 1
38b. Serial Device 2
38c. Serial Device 3
39. Local Area Network
40a. SWIR Camera 1
40b. SWIR Camera 2
41a. Mission controller workstation (computer)
41b. Mission controller software
42a. Mission conduct station (computer)
42b. Mission conduct software
43. Real Time Kinematics GPS/Measurement-calculation station
i = 1, 2, ..., n and xi, yi: coordinates
The drawings are not necessarily to scale, and details not necessary for understanding the present invention may have been omitted. In addition, components which are at least largely equivalent or have at least equivalent functions have been assigned the same number.
Detailed Description of the Invention
In this detailed description, the preferred embodiments of the Video Ammunition and Laser Position Assessment Algorithm/Method (VALPAA) being subject of this invention have been disclosed solely for the purpose of better understanding of the subject.
The components and properties of such components used in the Video Ammunition and Laser Position Assessment Algorithm/Method (VALPAA) disclosed hereunder are given below.
- Camera 1 (1a), camera 2 (1b) and camera n (1c) capture images of the target (6).
- Camera 1 sensor horizontal axis (2a), camera 2 sensor horizontal axis (2b) and camera n sensor horizontal axis (2c) are respectively the horizontal axes of the image sensors of camera 1 (1a), camera 2 (1b) and camera n (1c). The distance of the point desired to be found out from the camera 1 sensor horizontal axis (d1), the distance of the point desired to be found out from the camera 2 sensor horizontal axis (d2) and the distance of the point desired to be found out from the camera n sensor horizontal axis (dn) are shown in Figure 1.
- γ1 (4a), γ2 (4b) and γ3 (4c) are the angles made by the camera 1 sensor horizontal axis (2a), camera 2 sensor horizontal axis (2b) and camera n sensor horizontal axis (2c) with North (X axis), respectively.
- Line 1 (5a), line 2 (5b) and line n (5c) are the lines respectively drawn perpendicular to camera 1 sensor horizontal axis (2a), camera 2 sensor horizontal axis (2b) and camera n sensor horizontal axis (2c) from the point to be found out (m).
Target (6) is the reference point.
Point to be found out (m) is the point desired to be found on the earth (where the ammunition impact or laser designator marking is made).
Earth projection polygon of the pixel corresponding to the ammunition trace point on the camera 1 image sensor (8a) and earth projection polygon of the pixel corresponding to the ammunition trace point on the camera 2 image sensor (8b) are, respectively, the areas represented on the earth by a pixel on the camera sensors of camera 1 (1a) and camera 2 (1b).
Sensor pixel (9) is the pixel on the camera sensor. The short edge of the area represented on the earth by a pixel on camera sensor (10) is shown in figure 2. Figure 2 also shows the longest edge of the polygon constituted by set of earth projection polygon of the pixel corresponding to ammunition trace point on camera 1 image sensor and earth projection polygon of the pixel corresponding to ammunition trace point on camera 2 image sensor (k).
Total camera horizontal pixel (12) (XP) is total number of pixels on the camera sensor horizontal axis.
Total camera vertical pixel (13) (YP) is total numbers of pixels on the camera sensor vertical axis.
Camera sensor homograph (15) (earth projection) is the area projected on the earth by camera sensor.
Lower distance (16) is the horizontal distance of the camera sensor to target (6).
Side distance (c) is the distance of the camera sensor to target (6).
Height (h) is the height of the camera sensor above the target.
The β angle (19) is the elevation angle of the camera sensor plane with the target (6).
Camera sensor plane (20) is the plane of the camera sensor.
The unit vector made by the camera sensor plane in line with target (q), unit vector perpendicular to unit vector made by camera sensor plane in line with target and extending upward (n) and vertical length of a pixel of a camera sensor on the earth projection (r) are shown in figure 3.
The camera image topology on the tower (24) is the topology of the camera image from the camera on the tower. The horizontal range of view of the camera can be broadened as per proximity and focal settings. This is called the earth width seen by the camera (25).
- Camera focal distance (f) represents the focal distance of the camera lens.
The calculations conducted in the Video Ammunition and Laser Position Assessment Method can be collected under the following headings. The processes conducted under each heading are described below:
A. Viewing the camera sensor from optical axis at perpendicular angle and projection of this sensor on the earth
B. Calculation of horizontal length (z) of earth projection of camera sensor pixel
C. Calculation of the vertical length of the earth projection of a camera sensor pixel (r) (2 or more cameras required)
D. Measurement of horizontal axis angle of camera sensor
E. Distance required to be calculated for camera when a case requiring assessment of target occurs
F. Finding out related point on the earth by use of distance obtained from more than one camera and horizontal axis angle
A. Viewing the camera sensor from optical axis at perpendicular angle and projection of this sensor on the earth
When the projection of the camera sensor on the earth, that is, the camera sensor homograph (15), and the earth width seen by the camera (25) are studied, it can be understood from the vertical length of a camera pixel's projection on the earth (r) and the earth width seen by the camera (25) that a single pixel corresponds to different sizes on the earth in the horizontal and vertical directions.
B. Calculation of horizontal length of earth projection of camera sensor pixel
Horizontal pixel value = earth width seen by camera / total camera sensor horizontal pixel number, or:
z = 2t / XP
For instance, assume that a camera having a total horizontal sensor width of 640 pixels has a visible area of 100 m; then each pixel on the horizontal line corresponds to approximately z = 100/640 = 0.15625 m = 15.625 cm.
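A minimal Python sketch of this horizontal pixel value calculation, using the example values above (earth width 2t = 100 m, XP = 640 pixels); the function and variable names are illustrative assumptions, not taken from the text:

    # Horizontal pixel value z = 2t / XP: earth width represented by one
    # horizontal sensor pixel (illustrative sketch).
    def horizontal_pixel_value(earth_width_m: float, total_horizontal_pixels: int) -> float:
        return earth_width_m / total_horizontal_pixels

    z = horizontal_pixel_value(earth_width_m=100.0, total_horizontal_pixels=640)
    print(z)  # 0.15625 m, i.e. 15.625 cm per pixel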
C. Calculation of the vertical length of the earth projection of a camera sensor pixel (r) (2 or more cameras required)
If we watched the earth with a camera (camera 1 (1a), camera 2 (1b), camera n (1c)) exactly from the top, the horizontal pixel value z would be equal to the vertical length of the earth projection of a camera sensor pixel (r), which is the vertical pixel value. In this case, from the basic trigonometry definition:
Cos(β) = z / r, or r = z / Cos(β), or r = 2t / (XP * Cos(β))
From the basic trigonometry definition, for the angle β:
Cos(β) = height / side distance = h / c. When all data are combined:
r = (2t * c) / (XP * h)
For example, to calculate the vertical pixel projection value of a camera with a total of 640 horizontal pixels, looking at a 100 m wide area from a height of 20 m at a side distance of 2000 m, using the previous example, the vertical value is:
r = (100 * 2000) / (640 * 20) = 15.625 m.
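A minimal Python sketch of the vertical pixel value calculation r = (2t * c) / (XP * h), using the same example values (2t = 100 m, c = 2000 m, XP = 640 pixels, h = 20 m); names are illustrative assumptions:

    # Vertical pixel value r: earth length represented by one pixel along the
    # viewing direction (illustrative sketch).
    def vertical_pixel_value(earth_width_m: float, side_distance_m: float,
                             total_horizontal_pixels: int, height_m: float) -> float:
        return (earth_width_m * side_distance_m) / (total_horizontal_pixels * height_m)

    r = vertical_pixel_value(100.0, 2000.0, 640, 20.0)
    print(r)  # 15.625 m per pixel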
From this point, to find the place of a point based on the horizontal and vertical projections of a camera sensor pixel (9) on the earth, it can be seen that it will not be possible to achieve this by means of only one camera if a solution with sub-metre accuracy is desired: while an object can be located with a sensitivity of about 15 cm horizontally, it can only be located with a sensitivity of about 15 m on the vertical axis. If the cameras do not face each other at 0 or 180 degrees (γ1-γ2, γ2-γ3, γ1-γ3 relative angles), a new method can be developed, and sub-metre horizontal and vertical values of the point of interest can be found with respect to the target (6) (identification of a point on the earth by use of the distances and horizontal axis angles obtained from more than one camera is described under heading F).
D. Measurement of horizontal axis angle of camera sensor
By use of the real time kinematics GPS/measurement-calculation station (43), the places of the cameras (camera 1 (1a), camera 2 (1b), camera n (1c)) and the target (6) are identified, and from the said measurement devices the angle between the camera 1 sensor horizontal axis and North (γ1), the angle between the camera 2 sensor horizontal axis and North (γ2) and the angle between the camera n sensor horizontal axis and North (γ3) can be taken directly.
E. Calculation of distance required to be calculated for camera when a case requiring assessment of target (6) occurs
When a case requiring assessment occurs at the target (6), the distance required to be found for a camera is calculated by use of the distance of the target (6) to the exact centre of the horizontal and vertical image of the camera, that is, camera 1 (1a), camera 2 (1b) or camera n (1c). For instance, for a camera seeing 100 m of earth width along its horizontal axis with 640 pixels of horizontal width, after the target is horizontally centred: if the point of interest is 10 pixels to the left of the exact mid-point, d1 = -10 * z = -10 * 100 / 640 = -1.5625 m; if it is 20 pixels to the right, d1 = +20 * z = +20 * 100 / 640 = +3.125 m.
The calculated distances d1 and d2 are the distances from the respective camera horizontal axes. Similar calculations are made for the point of interest for the other cameras.
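A minimal Python sketch of this step, reproducing the two worked examples above; the signed pixel offset (negative to the left of the image mid-point, positive to the right) and the names used are illustrative assumptions:

    # Distance of the point of interest from the camera sensor horizontal axis:
    # d = offset_pixels * z, with z the horizontal pixel value (illustrative sketch).
    def distance_from_horizontal_axis(offset_pixels: int, earth_width_m: float,
                                      total_horizontal_pixels: int) -> float:
        z = earth_width_m / total_horizontal_pixels
        return offset_pixels * z

    print(distance_from_horizontal_axis(-10, 100.0, 640))  # -1.5625 m (10 pixels left)
    print(distance_from_horizontal_axis(+20, 100.0, 640))  # +3.125 m (20 pixels right)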
F. Finding out related point on the earth by use of distance obtained from more than one camera and horizontal axis angle
The point to be found out (m) has, with respect to each camera sensor horizontal axis, that is, the camera 1 sensor horizontal axis (2a), camera 2 sensor horizontal axis (2b) or camera n sensor horizontal axis (2c), a corresponding distance: the distance of the point desired to be found from the camera 1 sensor horizontal axis (d1), the distance of the point desired to be found from the camera 2 sensor horizontal axis (d2) and the distance of the point desired to be found from the camera n sensor horizontal axis (dn), according to its definition. In order to find the point (m), which is the intersection point of the lines defined by these distances, the distances d1, d2, ..., dn calculated under heading E (the distance required to be calculated for a camera when a case requiring assessment of the target (6) occurs) and the angles between the camera 1, camera 2 and camera n sensor horizontal axes and North (γ1, γ2, γ3) defined under heading D (measurement of the horizontal axis angle of the camera sensor) are used together.
For any camera, for i = 1, 2, ..., n, using the basic trigonometry definitions (Cos and Sin):
xi = di * Cos(γi)
yi = di * Sin(γi)
If the coordinates of the point to be found out (m) are xm, ym, then using the analytical geometry definition of a line through a given point with a given slope, the line can be written as: ym - yi = ε * (xm - xi)
Here the ε value is the slope of the line. The slope of the line is ε = Tan(γi ± 90°) according to the Euclidean geometry definition of a line. The ±90° comes from the definition of the angle between the camera 1 sensor horizontal axis and North (γ1), the angle between the camera 2 sensor horizontal axis and North (γ2) and the angle between the camera n sensor horizontal axis and North (γ3). From the definitions of trigonometry: Tan(γi) = Sin(γi)/Cos(γi),
ε = Tan(γi ± 90°) = -Cos(γi)/Sin(γi) (the ± and - signs are controlled), and for (xm, ym) and any camera point (xi, yi) put into the line equation: ym - yi = -Cos(γi)/Sin(γi) * (xm - xi)
If the equation is rearranged:
ym * Sin(γi) - yi * Sin(γi) = -Cos(γi) * xm + Cos(γi) * xi
Cos(γi) * xm + Sin(γi) * ym = Sin(γi) * yi + Cos(γi) * xi
If the xi and yi definitions are placed in the equations:
Cos(γi) * xm + Sin(γi) * ym = Sin(γi) * di * Sin(γi) + Cos(γi) * di * Cos(γi)
Cos(γi) * xm + Sin(γi) * ym = di * (Sin²(γi) + Cos²(γi))
Since Sin²(γi) + Cos²(γi) = 1, the final equation is obtained:
Cos(γi) * xm + Sin(γi) * ym = di
When the last equation is considered, we have one equation with 2 unknowns (xm, ym). As seen from the equation, to solve for the 2 unknowns we need at least 2 such equations; that is, we need the position values (γi, di) of at least 2 cameras. Considering 2 or more equations, we get the following form:
Cos(γ1) * xm + Sin(γ1) * ym = d1
Cos(γ2) * xm + Sin(γ2) * ym = d2
...
Cos(γn) * xm + Sin(γn) * ym = dn
Putting the last equations in matrix form:
[ Cos(γ1)  Sin(γ1) ]               [ d1 ]
[ Cos(γ2)  Sin(γ2) ]  *  [ xm ]  = [ d2 ]
[   ...      ...   ]     [ ym ]    [ ...]
[ Cos(γn)  Sin(γn) ]               [ dn ]
This equation is in the A.x = b format of linear algebra and is solved by use of standard mathematical methods (Reference 1), such as singular value decomposition. Sample solutions are given below.
Solution method 1: If A is a square matrix and its inverse can be taken:
A.x = b
x = A⁻¹.b (both sides are multiplied by A⁻¹; since A⁻¹.A = I, I.x = x is used)
Solution method 2: If A is not a square matrix and Aᵀ.A is invertible:
A.x = b
Aᵀ.A.x = Aᵀ.b (both sides are multiplied by Aᵀ)
x = (Aᵀ.A)⁻¹.Aᵀ.b (both sides are multiplied by (Aᵀ.A)⁻¹, and only x remains on the left side)
Solution method 3: For any A, the SVD solution (if A or Aᵀ.A is an inappropriate or non-invertible matrix):
It is known from the singular value decomposition theory of linear algebra that A = U.S.Vᵀ decomposes A into 3 matrices. Because of the properties of the U, S and V matrices:
x = V.S*.Uᵀ.b
The S* matrix is the matrix formed by inverting the members of the S matrix (taking them in 1/s form). If an s value is close to 0, such as s < 1e-3, then the corresponding 1/s value is taken as 0. Upon solution of the equations, the xm and ym values are found.
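A minimal Python/NumPy sketch of solution method 3, building the A.x = b system Cos(γi) * xm + Sin(γi) * ym = di from per-camera (γi, di) values and solving it with the SVD pseudo-inverse, zeroing singular values below 1e-3 as described above; the function name and the angles and distances in the example call are illustrative assumptions:

    import numpy as np

    def solve_point(gammas_deg, distances_m, s_threshold=1e-3):
        # One row per camera: [Cos(gamma_i), Sin(gamma_i)] . [xm, ym] = d_i
        g = np.radians(np.asarray(gammas_deg, dtype=float))
        d = np.asarray(distances_m, dtype=float)
        A = np.column_stack((np.cos(g), np.sin(g)))
        # A = U.S.V^T; S* inverts each singular value, small values are taken as 0
        U, S, Vt = np.linalg.svd(A, full_matrices=False)
        S_inv = np.array([0.0 if s < s_threshold else 1.0 / s for s in S])
        x = Vt.T @ np.diag(S_inv) @ U.T @ d   # x = V.S*.U^T.b
        xm, ym = x
        return xm, ym

    # Two cameras whose sensor horizontal axes make 30 and 120 degrees with North:
    print(solve_point([30.0, 120.0], [2.0, -1.0]))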
Video Ammunition and Laser Position Assessment Algorithm/Method (VALPAA) is preferably achieved by means of software (preferably mission conduct software (42b)) in a computer (preferably mission conduct station (42)) in a system of which characteristics are described below.
- MWIR cameras, preferably MWIR camera 1 (33a), MWIR camera 2 (33b), MWIR camera 3 (33c), MWIR camera 4 (33d) and MWIR camera n (33e), transmit the videos of the target taken from their positions to video servers (preferably video server 1 (37a), video server 2 (37b), video server 3 (37c)).
MWIR cameras perform video settings (zoom, focus etc.) with respect to the target to be assessed.
- Pan-Tilts, preferably Pan-Tilt 1 (34a), Pan-Tilt 2 (34b), Pan-Tilt 3 (34c), Pan-Tilt 4 (34d) and Pan-Tilt n (34e), rotate the cameras located on them towards the target to be assessed. Pan-Tilts communicate with the mission controller workstation (41a) via serial devices, preferably serial device 1 (38a), serial device 2 (38b), serial device 3 (38c).
Video servers, preferably video server 1 (37a), video server 2 (37b) and video server 3 (37c), compress the raw video received from the cameras and transmit it to the mission controller workstation (41a).
Serial devices, preferably serial device 1 (38a), serial device 2 (38b) and serial device 3 (38c), provide communication of the mission controller workstation (41a) with the MWIR cameras (MWIR camera 1 (33a), MWIR camera 2 (33b), MWIR camera 3 (33c), MWIR camera 4 (33d) and MWIR camera n (33e)) and the Pan-Tilts (Pan-Tilt 1 (34a), Pan-Tilt 2 (34b), Pan-Tilt 3 (34c), Pan-Tilt 4 (34d) and Pan-Tilt n (34e)) over one of the RS-232, RS-422 and RS-485 protocols.
The Local Area Network (39) is an Ethernet-based local area network and provides communication of the serial devices and video servers with the mission controller workstation (41a) and the mission conduct station (42a).
SWIR cameras, preferably SWIR camera 1 (40a) and SWIR camera 2 (40b), capture the laser designator marker.
The mission controller workstation (41a) is a computer having the mission controller software (41b). It sends commands to all cameras in VALAS (MWIR camera 1 (33a), MWIR camera 2 (33b), MWIR camera 3 (33c), MWIR camera 4 (33d), MWIR camera n (33e), SWIR camera 1 (40a) and SWIR camera 2 (40b)) and to the Pan-Tilts (Pan-Tilt 1 (34a), Pan-Tilt 2 (34b), Pan-Tilt 3 (34c), Pan-Tilt 4 (34d) and Pan-Tilt n (34e)) via the mission controller software (41b), and thus enables the cameras and Pan-Tilts to execute the settings related to the target to be assessed (target 1 (35a), target 2 (35b), target n (35c), target 3 (35d)). After completion of the settings, it transmits the information concerning completion of the commands as UDP messages to the mission conduct station (42a) over the LAN (Local Area Network).
The mission conduct station (42a) is a computer on which the mission conduct software (42b) is installed. It finds the place where the ammunition hits or the laser designator illuminates, assesses the captured image, and calculates the distance of the ammunition or laser designator marker to the target and its angle with respect to world geographic north via the mission conduct software (42b) with the help of image processing algorithms.
- The distances between the target and the cameras (MWIR, SWIR) and the relative angles with respect to world geographic north are calculated by use of Real Time Kinematics GPS (GPS: Global Positioning System) or a measurement/calculation station (43) (total station), and the results thereof are used as input.
References:
1: D. Serre, Matrices: Theory and Applications, 2nd edition, Springer, 2010, pages 216-217.

Claims

1. A video ammunition and laser position assessment method realizing calculation of positions of the traces left by ammunition and laser designator markers in camera images on the earth and it is characterized in that, by means of a software in a computer:
- calculation of horizontal length of a pixel of a camera sensor on the earth projection,
- calculation of vertical length of a pixel of a camera sensor on the earth projection (r),
- admission of angle made by camera sensor horizontal axis measured by use of real time kinematics GPS/measurement-calculation station with North as the horizontal axis angle of the camera sensor,
- calculation of the distance of the point needed to be found out from the camera horizontal axis after the target (6) is positioned in the exact middle of the horizontal and vertical image of the camera,
- calculation of coordinates of the point to be found out, by use of the distance and horizontal axis angle obtained from more than one camera.
2. A method according to claim 1 and it is characterized in that in the process step of calculation of horizontal length of a pixel of a camera sensor on the earth projection, the equation z = 2t / XP is used where
z: horizontal pixel value
2t: earth width seen by camera
XP: total camera horizontal pixel number
3. A method according to claim 1 or claim 2 and it is characterized in that in the process step of calculation of vertical length of a pixel of a camera sensor on the earth projection (r), the equation r = (2t * c) / (XP * h) is used, where
r: vertical length of a camera pixel on the earth projection
β: elevation angle of camera sensor plane with the target
2t: earth width seen by camera
XP: total camera horizontal pixel number
h: height
c: side distance
4. A method according to claims 1, 2 or 3 and it is characterized in that in the process step of calculation of the distance of the point needed to be found out from the camera horizontal axis after the target (6) is positioned in the exact middle of the horizontal and vertical image of the camera, the equations
d1 = -a * z
d1 = +b * z
are used, where
a: number of pixels of the point of interest to the left of the exact mid-point of the camera image
b: number of pixels of the point of interest to the right of the exact mid-point of the camera image
d1: distance from the camera horizontal axis
z: horizontal pixel value
"-": pixels on the left of the exact mid-point of the camera image
"+": pixels on the right of the exact mid-point of the camera image.
5. A method according to claims 1, 2, 3 or 4 and it is characterized in that in the process step of calculation of coordinates of the point to be found out, by use of the distance and horizontal axis angle obtained from more than one camera, the equations
xi = di * Cos(γi)
yi = di * Sin(γi)
ε = Tan(γi ± 90°)
ym - yi = ε * (xm - xi)
Tan(γi) = Sin(γi)/Cos(γi)
ε = Tan(γi ± 90°) = -Cos(γi)/Sin(γi)
ym - yi = -Cos(γi)/Sin(γi) * (xm - xi)
ym * Sin(γi) - yi * Sin(γi) = -Cos(γi) * xm + Cos(γi) * xi
Cos(γi) * xm + Sin(γi) * ym = Sin(γi) * yi + Cos(γi) * xi
Cos(γi) * xm + Sin(γi) * ym = Sin(γi) * di * Sin(γi) + Cos(γi) * di * Cos(γi)
Cos(γi) * xm + Sin(γi) * ym = di * (Sin²(γi) + Cos²(γi))
Sin²(γi) + Cos²(γi) = 1
Cos(γi) * xm + Sin(γi) * ym = di
[ Cos(γ1)  Sin(γ1) ]               [ d1 ]
[ Cos(γ2)  Sin(γ2) ]  *  [ xm ]  = [ d2 ]
[   ...      ...   ]     [ ym ]    [ ...]
[ Cos(γn)  Sin(γn) ]               [ dn ]
are used, where
xm, ym: coordinates of the point desired to be found out, for any camera i = 1, 2, ..., n
cos: cosine
sin: sine
tan: tangent
γ: angle between camera sensor horizontal axis and North.
PCT/TR2017/050283 2016-06-22 2017-06-22 Video ammunition and laser position assessment method WO2017222490A2 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
TR201608607 2016-06-22
TR2016/08607 2016-06-22

Publications (2)

Publication Number Publication Date
WO2017222490A2 true WO2017222490A2 (en) 2017-12-28
WO2017222490A3 WO2017222490A3 (en) 2018-02-08

Family

ID=59997408

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/TR2017/050283 WO2017222490A2 (en) 2016-06-22 2017-06-22 Video ammunition and laser position assessment method

Country Status (1)

Country Link
WO (1) WO2017222490A2 (en)

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7746450B2 (en) * 2007-08-28 2010-06-29 Science Applications International Corporation Full-field light detection and ranging imaging system
JP2011203057A (en) * 2010-03-25 2011-10-13 Tokyo Electric Power Co Inc:The Distance measuring instrument for flying object and flying object position measuring instrument
KR101199913B1 (en) * 2011-01-27 2012-11-09 김은석 Cross type stereo vision system and method for measuring distance

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
D. SERRE: "Matrices: Theory and Application 2E", 2010, SPRINGER, pages: 216 - 217

Also Published As

Publication number Publication date
WO2017222490A3 (en) 2018-02-08

Similar Documents

Publication Publication Date Title
CN108965809B (en) Radar-guided video linkage monitoring system and control method
CN109523471B (en) Method, system and device for converting ground coordinates and wide-angle camera picture coordinates
WO2019076304A1 (en) Binocular camera-based visual slam method for unmanned aerial vehicles, unmanned aerial vehicle, and storage medium
CN112184890B (en) Accurate positioning method of camera applied to electronic map and processing terminal
CN110799921A (en) Shooting method and device and unmanned aerial vehicle
US20190370983A1 (en) System and method for real-time location tracking of a drone
CN110319772B (en) Visual large-span distance measurement method based on unmanned aerial vehicle
CN110910459B (en) Camera device calibration method and device and calibration equipment
US11678647B2 (en) Multiscopic whitetail scoring game camera systems and methods
CN105424006A (en) Unmanned aerial vehicle hovering precision measurement method based on binocular vision
US8547375B2 (en) Methods for transferring points of interest between images with non-parallel viewing directions
CN104394356B (en) A kind of video camera is associated with the automatic zoom method of control with clipping the ball
CN108846084B (en) System and method for generating live-action map
JP2011095112A (en) Three-dimensional position measuring apparatus, mapping system of flying object, and computer program
CN114785951B (en) Positioning tracking method based on linkage of high-tower monitoring equipment and unmanned aerial vehicle
WO2019205087A1 (en) Image stabilization method and device
CN112585956B (en) Track replay method, system, movable platform and storage medium
EP3385747B1 (en) Method, device and system for mapping position detections to a graphical representation
CN111627048B (en) Multi-camera cooperative target searching method
Veth et al. Two-dimensional stochastic projections for tight integration of optical and inertial sensors for navigation
Doskocil et al. Measurement of distance by single visual camera at robot sensor systems
WO2017222490A2 (en) Video ammunition and laser position assessment method
Bhanu et al. Inertial navigation sensor integrated motion analysis for obstacle detection
CN111402324A (en) Target measuring method, electronic equipment and computer storage medium
US10663258B2 (en) Gunnery control system and gunnery control method using the same

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17777667

Country of ref document: EP

Kind code of ref document: A2

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17777667

Country of ref document: EP

Kind code of ref document: A2