US20130192451A1 - Anti-sniper targeting and detection system - Google Patents
- Publication number
- US20130192451A1
- Authority
- US
- United States
- Prior art keywords
- target
- sniper
- camera
- fire
- detection
- Prior art date
- Legal status
- Abandoned
Classifications
- F—MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
- F41—WEAPONS
- F41G—WEAPON SIGHTS; AIMING
- F41G3/00—Aiming or laying means
- F41G3/14—Indirect aiming means
- F41G3/147—Indirect aiming means based on detection of a firing weapon
Definitions
- This application relates to pre-empting and counteracting sniper attacks by detecting snipers, or potential snipers, tracking targets and weapons, and rapidly determining, assigning, coordinating, and transferring target information between anti-sniper systems.
- the sniper position information can be used by counteracting forces through target information sharing and/or by rapidly positioning a counter-sniper weapon.
- the counter-sniper weapon robotic arm can rapidly zoom, pan, and tilt a camera (infrared or otherwise as appropriate) based on target bearing, elevation, range, and wind condition calculations, and be moved immediately onto the sniper position to counter-fire rapidly and accurately against any sniper or multiple snipers. Small human adjustments of pan, tilt, and zoom can be made upon human verification of the target from the rapidly zoomed camera.
- multiple systems can be designed to cooperate to nearly simultaneously coordinate, assign, and communicate target data and counter-fire on multiple automatically or semi-automatically assigned sniper targets all at once, where targets can be chosen programmatically (automatically or semi-automatically) and optimally by relative unit positions.
- Targets can also be assigned based on terrain occlusions, maximizing unit safety using a terrain occlusion (line of sight from target) database or calculation, and/or system instrumentation of terrain (such as from three dimensional depth cameras).
- Snipers can be dealt with in multiple stages: pre-detection, barrel/glint (pre-fire) detection, fire detection, bullet trajectory tracking, and fire return, as snipers come and go.
- the combination of stereoscopic/spherical/depth (omni-directional) cameras as well as a spherical/omni-directional microphone system and a radar system can be used to measure target range.
- Other techniques to determine target range can be optic flow estimation, laser range finding, terrain database information, or any other suitable technique. If a muzzle flash or heated muzzle can be detected optically, then, because the speed of light is much greater than the speed of sound through air, range can be determined by taking the time difference between the start of the acoustic muzzle sound detection and the start of the optical muzzle flash detection and multiplying it by the speed of sound in air, which can be refined using air pressure & temperature sensors if needed. Sensor integration and synthesis can be achieved by weighting the probability of accuracy, precision, and tolerances. Many of these techniques are well known in the art.
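As a sketch, the flash-to-sound ranging described above can be computed as follows; the function names and the simple temperature model for the speed of sound are illustrative, not from the patent:

```python
import math

def speed_of_sound(temp_c: float) -> float:
    """Approximate speed of sound in dry air (m/s) at temp_c degrees Celsius."""
    return 331.3 * math.sqrt(1.0 + temp_c / 273.15)

def range_from_flash_and_sound(t_flash: float, t_sound: float,
                               temp_c: float = 20.0) -> float:
    """Estimate target range (m) from the optical flash time and acoustic
    arrival time. Light travel time is negligible at these ranges, so the
    flash time is treated as the firing time; the sound arrives
    t_sound - t_flash seconds later.
    """
    dt = t_sound - t_flash
    if dt <= 0:
        raise ValueError("sound must arrive after the flash")
    return dt * speed_of_sound(temp_c)

# Example: sound arrives 1.5 s after the flash at 20 degrees C -> roughly 515 m
r = range_from_flash_and_sound(0.0, 1.5)
```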
- Pre-sniper detection techniques using radar signature reflection of gun barrel are described in U.S. Pat. No. 8,049,659; as well as described in T. CIPARA; “Using Radar Signals to Safeguard Our Troops”; Mar. 15, 2011; George Mason University; Fairfax, Va.; USA.
- Other pre-sniper detection techniques using detection of glint or reflection from a scope, binocular, or even a human eye lens are described in part in U.S. Pat. App. No. 2008/0136626 and also described in part in H. HASHARON; “SLD500 Sniper Locator CILAS”; 2006; Defense Update International Online Defense Magazine; Issue 2; Israel; N. SHACHTMAN; “Lasers Stop Snipers Before they Fire”; Apr.
- ASFAW “Impact of Pose and Glasses on Face Detection Using the Red Eye Effect”; May 2003; CCECE 2003; IEEE; Montreal; Canada.
- a counter measure to snipe scopes is described in M. NAIMARK; “How to ZAP a Camera: Using Lasers to Temporarily Neutralize Camera Sensors”; October 2002; USA.
- SIMONIS “Nanotechnology: innovation opportunities for tomorrow's defense”; March 2006; TNO Science & Industry; Netherlands; G. SIMON; “Sensor Network-Based Counter-sniper System”; Nov. 3, 2004; SenSys'04; Baltimore, Md., USA; A. WHITE; “Fighting fire with fire: technology finds a solution to sniper attacks”; June 2009; pg. 52-57; Jane's International Defense Review; Englewood, Colo.; USA; J. KELLER; “Sniper-detection systems to provide perimeter security for Army forces in Afghanistan to come from Raytheon BBN”; Feb. 15, 2011; Military & Aero. Elect.; USA; T. V.
- RAFAEL Anti-Sniper Systems Finding Their Range”; Nov. 3, 2005; Defense Industry Daily; USA; M. C. ERTEM; “An acoustic sensor for the viper infrared sniper detection system”; August 1999; Maryland Advanced Development Laboratory; Greenbelt, Md.; USA.
- the trajectory tracking of bullets fired is described in X. L. ZHANG; “Real-time tracking of bullet trajectory based on chirp transform in a multi-sensor multi-freq radar”; May 10, 2010; Radar Conference, 2010 IEEE; USA; Y. ZHANG; “Real-time acquisition and tracking of sniper bullets using multi-sensor multi-frequency radar techniques”; Aug. 31, 2009; SSP '09. IEEE; USA; BROWN, E. R.; “Ku-band retrodirective radar for ballistic projectile detection and tracking”; May 4, 2009; Radar Conference, 2009 IEEE; Pasadena, Calif.; USA.
- D. CRANE “Anti-Sniper/Sniper Detection/Gunfire Detection Systems at a Glance”; New and Future Technology; Jul. 19, 2006; Defense Review; Miami, Fla.; USA; P. SARKA; “iRobot and Boston Univ. Photonics Center Unveil Advanced Sniper Detection System for iRobot Packbot”; Oct. 3, 2005; iRobot Corp. Press Release; R. DOUGLAS; “The Objective Force Soldier/Soldier Team—Volume II—The Science and Technology Challenges”; November 2001; Army Science Board SAAL-ASB; Arlington, Va.; USA; C.
- a rapid and accurate sniper-counteracting force response system can not only allow operators to immediately respond but can also pre-empt the sniper by identifying sniper targets in advance using detection of movement or presence of infrared signatures of objects in frame-by-frame image processing, as well as gun barrel radar detection, adjusting for vehicle motion and vehicle position and utilizing existing information about the terrain.
- with a fast autonomous, robotically gimbaled, zoom-able camera, an operator can quickly scan and verify suspect targets. This can be done as a vehicle progresses through the field of operation, by target locking and tracking, while allowing the operator to simply press a “next target” (or “last target”, or “next coordinated assigned target”, like a target slide show) activation to quickly evaluate suspect targets in order to differentiate real from non-real targets.
- the return fire weapon and rapid zoom camera can help an operator further evaluate, from a great distance, what the target is holding or doing, and if the target is verified as threatening, the anti-sniper system can fire at the target with great accuracy.
- Highly robust smooth image stabilizers, gimbals, and laser locking techniques along with gyroscopes can help stabilize, fix, and follow the highly zoomed (infrared and/or other) camera onto the target while the vehicle is still in motion, further helping the operator verify whether a target is threatening in advance of being sniped, allowing a pre-emptive snipe at a sniper.
- Anti-sniper systems can share critical data and coordinate actions with each other in real time in a firefight such as friendly positions as well as target positions, and friendly weapon vectors, trajectories, and friendly as well as target firing times.
- the anti-sniper camera system can also be made to incorporate a multitude of zoomed cameras per target, as well as multiple robotic anti-sniper weapons so that even more rapid target assessment and response can be made.
- the anti-sniper system objective is ultimately to act as a very significant assured deterrent to firing any weapon at the anti-sniper system. It is to re-assert balance in asymmetrical warfare, providing mutual assured destruction of equal system capability against any gun firing, or even verifiable threat of gun pointing, thus making it a tremendous counter-incentive to firing at, or even threatening, any force carrying a fully autonomous (with manual override) integrated anti-sniper system. It greatly reduces the element of chance involved, and it is a powerful deterrent to not only firing a weapon, but even pointing it.
- FIG. 1 is a block diagram of the overall anti-sniper targeting and detection sharing system.
- FIG. 2 shows prior art calculations of how range can be estimated by combining acoustic sensors (microphones) with optics and taking advantage of the speed of light being much greater than that of sound to determine range of gun fire.
- FIG. 3 shows the anti-sniper targeting and detection sharing system on top of an armored personnel vehicle, with spherical stereoscopic camera & microphone system, gimbaled weapon system with laser tracking and zoom IR (or other suitable) camera mounted on a pole arm, and with wind sensors, differential global positioning system, radar, and glint detector on top of the spherical stereoscopic camera & microphone system.
- FIG. 4 shows a planar geometry of the field of view of a camera with projection of a target onto the field of view used to calculate target angles and range.
- FIG. 5 shows a planar geometry of a pair of stereoscopic cameras projected onto the plane of the robotic camera weapon laser arm.
- FIG. 6 shows a three dimensional perspective geometry of a stereoscopic camera pair with the robotic camera weapon laser arm, illustrating the calculations for rapidly and automatically determining the angles and range to position the zoomed camera/weapon system onto detected target(s) in rapid succession.
- FIG. 7 is a flow chart of the system process of detecting targets and rapidly automatically positioning the zoom camera gyro stabilized laser weapon system onto the potential targets.
- FIG. 8 is a stage diagram of coordinated sniper-fire pre-detect, glint/barrel detect, fire detect, trajectory track, and coordinated fire return, showing how the coordinated systems function in action.
- FIG. 1 shows a system block diagram of the anti-sniper targeting and detection system 2 .
- a pre-snipe omni-directional sensor scope/eye glint/barrel IR/radar detection system 34 is shown connected to computer system 10 to detect evidence of snipers, scopes, other optical sensors as well as gun barrels, if they are present, before a sniper is able to fire.
- Countermeasures to anti-sniper detection use anti-reflective layers, as well as honeycomb shapes on scope lenses. Some of these can be overcome by using different radar techniques such as varying the radar frequency to resonate at the shape of the honeycomb, or varying the frequency in the range of the possible shapes.
- the anti-sniper system 2 is shown utilizing a spherical or omni-directional high speed stereoscopic IR and/or visible depth camera and stereoscopic or omni-directional microphone system 4 that contains a spherical (omni-directional) high speed stereoscopic infrared (IR, or other appropriate, such as a RGB—red, green, blue, ranging, time of flight) depth camera system 6 as well as a spherical omni-directional microphone system for left ear orientation 8 A as well as a spherical microphone system for right ear orientation 8 B.
- the spherical (omni-directional) microphone system can not only be used to detect the source bearing (azimuth and elevation) of initial ordnance firing, but also to detect trajectory from the bullet's whizzing sound if the initial firing was not acoustically detected, such as when a weapon silencer is used.
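One common way to turn a microphone pair into a bearing sensor is time-difference-of-arrival under a far-field assumption; the patent does not specify its acoustic method, so the following is a hypothetical sketch:

```python
import math

def bearing_from_tdoa(dt: float, mic_spacing: float, c: float = 343.0) -> float:
    """Estimate source azimuth (degrees off broadside) from the time
    difference of arrival dt (seconds) between two microphones
    mic_spacing metres apart.

    Far-field assumption: sin(theta) = c * dt / mic_spacing.
    """
    s = c * dt / mic_spacing
    s = max(-1.0, min(1.0, s))  # clamp numerical noise into asin's domain
    return math.degrees(math.asin(s))

# A 0.5 ms lag across a 0.5 m baseline is roughly 20 degrees off broadside
angle = bearing_from_tdoa(0.0005, 0.5)
```

A full spherical array would repeat this over many microphone pairs and fuse the results into azimuth and elevation.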
- the computer or micro-controller system 10 can have terrain and earth curvature data to use in projectile calculations.
- the computer also can process the target data from the camera/microphone system 4 as well as from other sensors.
- the sensors can include a Differential Global Positioning System (DGPS) 14 , bullet trajectory radar system 32 , accelerometers, compass, and gyros 12 used to stabilize zoom-able gimbaled IR and/or visible camera 16 and weapon 18 , and wind direction, air temperature, air pressure, wind speed, or other sensors 20 , to calculate source and trajectory to/from target information.
- Target information is exchanged with other anti-sniper systems 2 , for real-time triangulation and target location fixes, through high speed wireless communications 26 .
- Bullet trajectory radar 32 can provide near instantaneous ordnance collision avoidance warning commands by determining ordnance trajectory path and anti-sniper system 2 unit positions.
- Microphones 8 A can be used to detect bullet impact sounds to verify trajectory tracking performance from trajectory radar 32 .
- the sound recorded from bullets whizzing by in the air near the microphones 8 A can also be used to verify trajectory tracking performance from trajectory radar 32 .
- On an anti-sniper system 2 HUD display input control 24 , warnings such as halt, duck, move left, move right, move forward, and move backward can be displayed or annunciated on speakers in or outside computer 10 when bullets are fired, detected, and tracked at long range.
- Other wireless communications 26 data can be sent to and from other remote systems 27 , to be relayed or routed, such as through satellites, drones, aircraft, or other vehicles.
- Target data that is processed is used to rapidly position gimbaled weapon system with laser designator 18 that can be mechanically connected to automatically or manually zoom-able gimbaled IR camera and/or visible camera 16 through computer 10 .
- Multiple sniper targets can be assigned, shuffled, prioritized, and have status tracked and shared amongst multiple anti-sniper systems 2 to autonomously coordinate a rapid anti-sniper response optimally assigning sniper targets to each crew based on unit position, status, and weapons capabilities.
- the robotic gimbaled weapon system with laser designator 18 and zoom-able gimbaled visible and IR camera 16 can rapidly and automatically swing into position of highest probability of snipers based on prior history/intelligence data, and also has manual operational over-ride capability by human operator.
- the robotic gimbaled weapon system with laser designator 18 and zoom-able gimbaled visible and IR camera 16 can be made to move at high speed, faster than any human can move, and be made more accurate and precise at dynamically firing back than a human sniper, even while the vehicle is in motion, by using gyros with high speed actuators and an automatically stabilizing shock absorbing mast/boom, where the human decision to fire is made from the zoomed scope view.
- the gimbaled weapon 18 itself can be a high powered laser.
- the gimbaled weapon 18 can also act just as a target designator to work in coordination with an aircraft or ship, or other weapon system.
- Computer 10 can display target data including zoomed target on a HUD (Heads Up Display) with input controls and speaker/alarm 24 for user 30 .
- if user 30 determines that a target is real and is a threat, the user can fire at the target using the gimbaled weapon (rifle, automatic weapon, missile, high powered laser, or other weapon) system with laser designator 18 controlled by user weapon fire control 22 via fire control switch 28 .
- the anti-sniper system 2 can work independently of other anti-sniper system 2 units while at the same time also join in to work as a coordinated crew or to break off if needed.
- the sensors of the anti-sniper system 2 can self-check and report if they are valid, invalid, and failed by marking the sensor data accordingly.
- the anti-sniper system 2 detection and fire response can incorporate neural-fuzzy reinforcement learning technology to optimize detection and response.
- a high speed rapid response such as returning sniper fire when fired upon from a high speed passing vehicle can be incorporated into the anti-sniper system 2 .
- Incorporating an autonomous (or semi-autonomous) zoom camera system as well as a radar can be useful for preventing false alarms that could be triggered in acoustic and fire flash detection systems alone due to other events, such as firecrackers being ignited.
- Multi anti-sniper system 2 target assignments can be both independent and dependent, or handled by a ranked order of anti-sniper systems 2 such that one anti-sniper system unit 2 acts as a target assignment controller, which can automatically hand off target assignment control to other anti-sniper units 2 as they are removed from and added to the system.
- FIG. 2 illustrates using fire detection to determine target range by taking the difference between two measured events: the detection time of the non-friendly identified muzzle flash heat signature pixel in the infrared camera subtracted from the arrival time of the first non-friendly identified gun fire sound wave of highest magnitude.
- Friendlies and non-friendlies can clearly be identified and displayed on a Heads Up Display (such as in the user 30 HUD 24 of FIG. 1 ).
- the gun fire sound has a peak signal 38 shown in the upper graph of microphone fire detection magnitude 50 amongst sound echoes 40 and noise 52 , where the gunfire sound signal starts at t s 42 .
- the sound from known friendly fire can be filtered out, based on known time, duration and pulse width of friendly fire, and relative friendly firing position (all wirelessly transferred within the system), thus reducing false alarms and system confusion during a fire fight with lots of bullets being fired.
- the lower graph shows the IR camera muzzle flash/heat detection magnitude 53 , where the peak detection 44 at time t f 46 is shown amongst signal noise 54 .
- the range to sniper can then be calculated by subtracting the two times and multiplying by the speed of sound in air as shown 48 .
- just as friendly acoustic fire sounds can be filtered, friendly-position muzzle flashes can be identified and filtered, via high speed real-time encrypted position network transfer, where laser communications can be used in case spread spectrum radio frequency jamming is occurring.
- FIG. 3 shows the anti-sniper targeting and detection system applied to an armored personnel vehicle 56 on ground surface 62 , where a mounting post 58 supports the spherical high speed stereoscopic depth camera and omni-directional microphone system 4 as well as fire control arm 60 with zoom-able gimbaled infrared (or other appropriate) camera 16 and weapon with laser designator system 18 , whereby if a target is an unfriendly tank or similar, a drone or aerial strike can be called in on target.
- Mounted on top of the spherical high speed stereoscopic camera and microphone system 4 are accelerometers and gyros 12 as well as wind speed and direction sensors 20 along with a differential global positioning system 14 , all used to more accurately aim the fire control arm onto target 76 on mountain terrain 64 .
- Next target 78 in system targeting sequence is where fire control arm 60 can rapidly move to and zoom to automatically.
- the field of view 68 , including sky 66 and mountains 64 , of one camera of the spherical stereoscopic camera system 4 is shown with edges 70 .
- Gyros 12 can also be mounted in fire control arm 60 as well as on camera 16 , with any means necessary for shock absorption.
- Zoom camera 16 can also be mounted on an independent robotic arm (not shown) of the fire arm 60 such that the zoomed view is maintained even while firing.
- FIG. 4 shows the surface geometry of one camera 6 with field of view 68 with field of view projection edges 70 with target horizontal projection point 80 .
- the angle of the target, θT, can be calculated from the distances shown, given the angle of the field of view (2θH).
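Under a pinhole-camera assumption, the target angle from its projected position in the field of view can be sketched as follows (the function name and frame conventions are illustrative):

```python
import math

def pixel_to_angle(x_px: float, width_px: int, half_fov_deg: float) -> float:
    """Convert a horizontal pixel coordinate to a bearing off the camera axis.

    Pinhole projection: a pixel offset u from the image centre maps to
    atan(u / f), where the focal length in pixels f follows from the
    camera's half field of view (theta_H).
    """
    f = (width_px / 2.0) / math.tan(math.radians(half_fov_deg))
    u = x_px - width_px / 2.0
    return math.degrees(math.atan2(u, f))

# A target at the right image edge of a 1920 px camera with a 30 degree
# half field of view lies 30 degrees off the camera axis
theta = pixel_to_angle(1920, 1920, 30.0)
```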
- FIG. 5 shows the surface geometry of a stereoscopic camera pair 6 with control fire arm 16 , 18 all projected on one plane with target 80 also projected onto the same plane.
- the horizontal angle of the control fire arm 16 , 18 can be positioned onto the target, given the control arm rotation angle, which can be rotated into position so no target is occluded, for optimal zooming, laser designating, and firing.
- all targets can be easily identified and processed to rapidly calculate and position the fire control arm onto the target.
- Target locking can be done in priority order such as nearest range target, or highest up target, or any other desired order. Range to target can also be determined if a camera used is a depth sensor camera.
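The priority orders mentioned (nearest range first, highest up first) amount to sorting the target list by the chosen key; a minimal sketch with hypothetical target records:

```python
# Hypothetical target records; the fields are illustrative, not from the patent.
targets = [
    {"id": "71A", "range_m": 850.0, "elevation_m": 120.0},
    {"id": "71B", "range_m": 430.0, "elevation_m": 40.0},
    {"id": "71C", "range_m": 1200.0, "elevation_m": 300.0},
]

def by_nearest(ts):
    """Lock targets in nearest-range-first order."""
    return sorted(ts, key=lambda t: t["range_m"])

def by_highest(ts):
    """Lock targets in highest-elevation-first order."""
    return sorted(ts, key=lambda t: -t["elevation_m"])

queue = by_nearest(targets)  # 71B, then 71A, then 71C
```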
- FIG. 6 shows a three dimensional perspective geometry of one spherical stereoscopic camera pair with microphone 4 (that would receive the highest amplitude gun fire sound) mounted on support post 58 and the fire control arm 16 , 18 aimed at target 76 projected on target plane camera viewing plane 68 via straight line vector 72 to target 76 .
- Return fire trajectory is not shown, but would be a slight arc, due to gravitation on path to the target.
- Center of right camera field of view 82 as well as center left camera field of view 84 is shown on target plane camera viewing plane 68 .
- Target 76 is shown projected at 80 to plane perpendicular to target plane 68 that intersects fire control arm 16 , 18 mounting plane.
- the horizontal angle θg as well as the vertical angle φ of the fire control arm 16 , 18 can then be calculated, and the arm thus rapidly moved to the target vector for each target 76 or other targets in assigned or selected sequence, from the distances provided by the stereoscopic camera, as well as the fire control arm rotated position θg.
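Given a target position in the arm's own frame (for instance, triangulated from the stereoscopic pair), the pan and tilt angles reduce to two arctangents; a sketch with an assumed right/forward/up frame, not the patent's exact geometry:

```python
import math

def arm_angles(x: float, y: float, z: float):
    """Pan (azimuth) and tilt (elevation) angles, in degrees, to point the
    fire control arm at a target at (x, y, z) metres in the arm's frame:
    x to the right, y forward, z up. The frame is an assumption for
    illustration; the patent's mounting geometry may differ.
    """
    pan = math.degrees(math.atan2(x, y))
    tilt = math.degrees(math.atan2(z, math.hypot(x, y)))
    return pan, tilt

# Target 100 m forward and 100 m right at the same height:
# pan is 45 degrees, tilt is 0
pan, tilt = arm_angles(100.0, 100.0, 0.0)
```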
- FIG. 7 shows a flow chart of the anti-sniper targeting system. The system scans via radar/laser/optics for evidence of human presence/movement(s); barrel, scope, eye, binocular, or sensor glint; as well as muzzle event(s) in IR images and from microphones in process block 100 . If any are detected at decision block 102 , the location of the target(s) as well as the ranges are computed at process block 104 .
- valid target information as well as other sensor data is shared amongst networked anti-sniper systems wirelessly.
- Targets are assigned to anti-sniper systems such that all target assignments are distributed optimally, making them easiest and most effective to hit back, such as closest and best line-of-sight bearing to the anti-sniper unit.
- Zoomed target camera data can also be shared amongst anti-sniper systems.
- Target assignment can be one to one, or by many to one, or assigned in a maximally optimal tactical manner.
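A one-to-one assignment as described could be sketched with a greedy nearest-pair heuristic; the unit and target coordinates below are invented for illustration, and a production system might instead use an optimal method such as the Hungarian algorithm:

```python
import math

# Hypothetical unit and target positions (x, y) in metres; illustrative only.
units = {"56A": (0.0, 0.0), "56B": (50.0, 0.0), "56C": (100.0, 0.0)}
snipers = {"71A": (20.0, 300.0), "71B": (90.0, 400.0), "71C": (10.0, 500.0)}

def assign_targets(units, targets):
    """Greedy one-to-one assignment: repeatedly pair the remaining
    unit/target couple with the smallest distance. A simple stand-in for
    the patent's 'optimal' assignment, not an optimal solver."""
    pairs = sorted(
        ((math.dist(u, t), un, tn)
         for un, u in units.items() for tn, t in targets.items()),
        key=lambda p: p[0])
    assigned, used_u, used_t = {}, set(), set()
    for _, un, tn in pairs:
        if un not in used_u and tn not in used_t:
            assigned[un] = tn
            used_u.add(un)
            used_t.add(tn)
    return assigned

plan = assign_targets(units, snipers)
```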
- wind speed, wind direction, temperature, pressure, and other ballistic factor sensors are measured at process block 106 and distributed amongst units, with estimates taken as the average from multiple valid sensors, to adjust for any firing compensation required over long ranges.
- the fire control arm is then moved to a coordinated assigned target vector, adjusting the firing angle for aggregated sensor wind speed, direction, temperature, air pressure, and range, with the camera zoomed onto the target, adjusted automatically and/or manually, and fired if commanded at process block 108 , where the status of the target post fire, if hit, is assigned a probability of being disabled, and this probability is reported to other anti-sniper systems.
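The firing-angle adjustments for range and wind can be illustrated with first-order, drag-free corrections; this is a deliberate simplification for illustration, not the patent's ballistic model:

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def holdover_corrections(range_m: float, muzzle_v: float, crosswind_mps: float):
    """First-order firing corrections over flat ground, ignoring drag.

    Returns (elevation_deg, windage_deg): the upward hold that cancels
    gravity drop and the into-wind hold that cancels crosswind drift.
    A sketch only; a real solver would model drag, air density, and
    earth curvature as the patent describes.
    """
    tof = range_m / muzzle_v          # time of flight, s
    drop = 0.5 * G * tof ** 2         # gravity drop over the flight, m
    drift = crosswind_mps * tof       # lateral wind drift, m
    elevation = math.degrees(math.atan2(drop, range_m))
    windage = math.degrees(math.atan2(drift, range_m))
    return elevation, windage

# An 800 m shot at 900 m/s muzzle velocity in a 5 m/s crosswind
elev, wind = holdover_corrections(800.0, 900.0, 5.0)
```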
- the system checks for shutdown at decision block 110 ; if not shutting down, the next-target check occurs at decision block 112 , and if there is a next target, the control arm is rapidly positioned onto it. If there are no new targets, further target scanning occurs.
- FIG. 8 shows how the coordinated anti-sniper systems can work in action to detect, and suppress sniper fire. This is shown in five stages: STAGE I 200 , pre-detect; STAGE II 202 , Barrel/Glint detect; STAGE III 204 , fire detect; STAGE IV 206 , trajectory track; STAGE V 208 , coordinated fire return.
- Vehicles in motion can fire in response while in motion, using calculations to adjust based on the vehicle's telemetry data. Alternatively, the first vehicle 56 A may detect the sniper and then move out of the sniper's line of sight, while the second vehicle 56 B comes into range (line of sight) and can automatically be assigned the target because of its position, responding programmatically near instantly with near instantaneous camera zoom onto the sniper for verification and firing upon, as the real-time target data is wirelessly transferred from 56 A to 56 B.
- This data is able to be passed on to all vehicles and personnel ( 56 A, 56 B, 56 C, and 56 D) in the group so that each vehicle and personnel passing can fire back and the crew can visually identify and verify target before firing.
- warning terrain zones can be clearly marked in real time on each unit's HUD (particularly useful for dismounted unit 56 D) for anything within line of sight or within projectile range of a detected hostile.
- This automatic target sharing system with a manual override is not limited to vehicles; it can be applied to aircraft, ships, or a combination.
- In FIG. 8 , three armored personnel vehicles 56 A, 56 B, and 56 C are shown with anti-sniper systems 2 , where the forward personnel vehicle 56 A can be specialized in IED (Improvised Explosive Device)/land mine detection.
- Dismounted person 56 D with miniaturized (trimmed down) soldier carried anti-sniper system 2 is shown between armored personnel vehicles 56 A and 56 B.
- the vehicles 56 A, 56 B, and 56 C and dismounted person 56 D are shown travelling from left to right in front of mountainous terrain 64 under sky 66 .
- a tree 67 is shown next to a building 57 with hidden sniper 71 D.
- Occlusion regions that are out of sight of the anti-sniper system 2 sensors are shown as 90 A and 90 B. These occlusion regions can be calculated and displayed on the anti-sniper HUD using terrain databases, and from data from depth sensors.
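The terrain-based occlusion calculation can be illustrated with a line-of-sight test over a sampled height profile; the 1-D profile and heights below are toy values standing in for a terrain database:

```python
def line_of_sight(terrain, i_obs, h_obs, i_tgt, h_tgt):
    """Check line of sight over a 1-D terrain height profile.

    terrain: list of ground heights sampled at unit spacing. The observer
    is at index i_obs with eye height h_obs above ground; the target is at
    index i_tgt with height h_tgt. Returns False if any intermediate ridge
    rises above the straight sight line. A toy stand-in for the patent's
    terrain-database occlusion calculation.
    """
    x0, y0 = i_obs, terrain[i_obs] + h_obs
    x1, y1 = i_tgt, terrain[i_tgt] + h_tgt
    step = 1 if i_tgt > i_obs else -1
    for i in range(i_obs + step, i_tgt, step):
        # Height of the sight line above sample i (linear interpolation)
        ray_h = y0 + (y1 - y0) * (i - x0) / (x1 - x0)
        if terrain[i] > ray_h:
            return False  # occluded by terrain
    return True

ridge = [0, 0, 5, 0, 0]                          # a 5 m ridge at index 2
visible = line_of_sight(ridge, 0, 2.0, 4, 2.0)   # False: ridge blocks the ray
clear = line_of_sight(ridge, 0, 10.0, 4, 2.0)    # True: observer sees over it
```

A 2-D version would march the ray across a heightmap; the same per-sample comparison determines the occlusion regions to display on the HUD.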
- Other undetected sniper positions are shown as 71 A, 71 B, and 71 C shown along mountain ridges 64 .
- Detected sniper targets 71 A, 71 B, and 71 C are displayed with squares on them, recorded, encrypted, and wirelessly shared amongst anti-sniper units in the system in real time (three armored personnel vehicles 56 A, 56 B, and 56 C as well as dismounted person 56 D) and can also be encrypted, and communicated in real-time to other systems ( 27 through wireless communications 26 of FIG. 1 such as through satellite) in the network.
- target data was passed to dismounted person 56 D as a preemptive warning, and the anti-sniper system ( 2 of FIG. 1 ) automatically recommends taking an anti-sniper firing position on top of small hill 64 , based on terrain data calculations, by indicating optimal positioning directions and annunciating sniper warnings in dismounted person 56 D's HUD.
- Dismounted person 56 D scopes out and aims at sniper 71 A with line of sight 88 , awaiting further command and automatically reporting to anti-sniper system ( 2 of FIG. 1 ) units that target 71 A is acquired by dismounted person 56 D.
- Armored personnel vehicles 56 A, 56 B, and 56 C are shown moved further forward to the right to optimize targeting of detected targets as recommended, such as by annunciation of “unit ‘ 56 A’ move forward 30 meters” by the anti-sniper system ( 2 of FIG. 1 ), where sniper 71 D inside building 57 is spotted by armored personnel vehicle 56 A's anti-sniper system ( 2 of FIG. 1 ) via glint/barrel detection path 86 , and the vehicle weapon is automatically positioned and locked onto planned trajectory path 87 .
- Armored personnel vehicle 56 B is shown in front of tree 67 , locked onto detected sniper 71 C via glint/barrel sighting 86 with a planned-fire real-time response along the calculated and sensed optimal trajectory 87 ; the arm is rapidly and autonomously rotated and adapted into position based on real wind magnitude and angle acquired from the wind direction and magnitude sensors, which can be autonomously corrected for apparent wind from vehicle motion as well as aggregated amongst sensors.
- Friendly unit positions ( 56 A, 56 B, 56 C, and 56 D), weapon angles, weapon targets, and weapon or other ordnance firing (to differentiate from enemy fire/explosion, such as by HUD color display indicating whether a firing/explosion cause was friendly or hostile) are autonomously reported, shared, and tracked in real time, encrypted and wirelessly transferred, to avoid friendly fire incidents and to optimize coordination and response of target sharing, where data can be displayed/annunciated on all personnel HUDs.
- Friendlies are clearly identified and marked in the HUD display.
- Sniper targets 71 B and 71 A are also detected by IR glint and/or radar barrel detection by armored personnel vehicles 56 A and 56 B, as shown by the detection paths 86 . Armored personnel vehicle 56 B has sniper 71 C locked on with automated-positioning fire arm planned trajectory path 87 , where sniper 71 C's weapon IR muzzle flash indicates the weapon has fired, and the position is further verified if not preemptively detected.
- Snipers can also be preemptively detected, targeted, and tracked (via image processing or otherwise) by IR heat signature which can be very visibly distinguished from terrain.
- Sniper 71 B is shown scoped out on line of sensor sight 86 by armored personnel vehicle 56 C, where the weapon is rapidly and automatically positioned, adapted, and optimized based on calculated real wind and vehicle motion along with trajectory equations using sniper 71 B's three dimensional relative position. Snipers 71 A and 71 C are also detected by armored personnel vehicle 56 C as shown by line of sensor sight paths 86 . Sniper target 71 A, 71 B, 71 C, and 71 D statuses, such as firing trajectories, no longer firing, appears injured, or killed, are reported and distributed as marked probabilities in real time. In STAGE III, if no snipers were preemptively detected, they can be detected at this stage by their weapon firing.
- targets were verified hostile, engaged, and destroyed in rapid near-simultaneous succession as shown by the X's. If targets were missed or new targets were found, targets can be re-assigned and transferred between units rapidly and autonomously in real time, with target status (Engaged, New, Detected, Tracked, Lost, Partly Destroyed, Destroyed) continually updated, where the number of rounds, round types, and round source into target can be tracked, recorded, and reported, as well as used for computing the probability of the target being disabled.
- a performance recording and measurement system can be incorporated for post battle analysis, whereby successful and unsuccessful battles can be reinforced into the neural-fuzzy reinforcement system based on parameters such as injury types, numbers, and fatalities, if any, to improve the system through the Monte Carlo process.
- each anti-sniper system 2 can share gun position using gun orientation sensors to provide gun barrel angle, and complete calculated planned fire trajectory that can be displayed on the HUD of each anti-sniper system 2 user where anti-sniper system 2 users are clearly identified on the HUD's.
- the anti-sniper targeting and detection system operates by automatically detecting target(s), in advance, calculating the target positions relative to the system, computing the fire control arm angles based on aggregate sensor data and trajectory calculations, and rapidly moving the fire control arm to the target in an assigned shared priority sequence where targets can be viewed, assessed, verified as unfriendly, and fired upon in rapid succession. If the vehicle and targets are moving, the fire control arm and target data can be continually adjusted accordingly.
Abstract
An anti-sniper targeting system in which a spherical omni-directional stereoscopic depth camera, a radar, and a microphone system detect and determine target positions and bearings, detect weapon muzzle flash and glint, track bullet trajectories, and coordinate, track, share, and assign targets. Target bearings and ranges are determined from sound, from infrared heat signatures, from glint, and from radar, and are used to rapidly position a camera-equipped fire control arm onto assigned targets based on calculated target positions and optimal trajectories. The system can apply firing corrections for target range and wind using wind, pressure, and temperature sensors, and can accommodate earth curvature in bullet trajectories over long ranges. It can also operate as an offensive sniper system in which a target remains locked despite movement, such as from a vehicle, using stabilizing gyros and accelerometers, image processing, or other sensor data to adjust for the movement.
Description
- This application claims the benefit of the filing dates, under 35 U.S.C. secs. 119 and 120, of U.S. Patent Applications Nos. 61/626,702; 2010/0238161 A1; 61/575,131; 61/626,701; 61/571,113; and U.S. patent application Ser. No. 12/460,552, particularly the Sep. 30, 2011 filing date of 61/626,702.
- None.
- None.
- This application relates to pre-empting and counteracting sniper attacks: detecting snipers or potential snipers, tracking targets and weapons, and rapidly determining, assigning, coordinating, and transferring target information between anti-sniper systems. The sniper position information can be used by counteracting forces through target information sharing and/or by rapidly positioning a counter-sniper weapon. The counter-sniper weapon robotic arm can rapidly zoom, pan, and tilt a camera (infrared or otherwise, as appropriate) and, based on target bearing, elevation, range, and wind condition calculations, be moved immediately onto the sniper position to counter-fire rapidly and accurately against one or more snipers. Small human adjustments of pan, tilt, and zoom can be made upon human verification of the target in the rapidly zoomed camera view. Pooled together, multiple systems can cooperate to nearly simultaneously coordinate, assign, and communicate target data and counter-fire on automatically or semi-automatically assigned multiple sniper targets, where targets can be chosen programmatically (automatically or semi-automatically) and optimally by relative unit positions. Targets can also be assigned based on terrain occlusions, maximizing the safety of units, using a terrain occlusion (line of sight from target) database or calculation, and/or from system instrumentation of the terrain (such as from three dimensional depth cameras). Snipers can be dealt with in multiple stages, as they come and go: pre-detection, barrel/glint (pre-fire) detection, fire detection, bullet trajectory tracking, and fire return.
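The terrain-occlusion calculation mentioned above can be sketched as a sampled sight-line test against a terrain height model. This is an illustrative sketch only; the 1-D terrain profile, function names, and step count are assumptions, not the patent's implementation:

```python
def line_of_sight(terrain, x0, h0, x1, h1, steps=100):
    """Return True if the straight sight line from (x0, h0) to (x1, h1)
    clears the ground at every sampled point; terrain(x) gives ground
    height at horizontal position x (meters)."""
    for i in range(1, steps):
        f = i / steps
        x = x0 + f * (x1 - x0)        # sample point along the line
        sight_h = h0 + f * (h1 - h0)  # height of the sight line there
        if terrain(x) > sight_h:
            return False              # a ridge blocks the line: occluded
    return True

# A unit position occluded from every known sniper sight line would be a
# candidate safe zone; positions with line of sight get a HUD warning.
```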
- The combination of stereoscopic/spherical/depth (omni-directional) cameras, a spherical/omni-directional microphone system, and a radar system can be used to measure target range. Other techniques to determine target range include optic flow estimation, laser range finding, terrain database information, or any other suitable technique. If a muzzle flash or heated muzzle can be detected optically, then, because the speed of light is much greater than the speed of sound through air, range can be determined by taking the time difference between the start of the optical muzzle flash detection and the start of the acoustic detection and multiplying it by the speed of sound in air, which can be refined using air pressure and temperature sensors if needed. Sensor integration and synthesis can be achieved by weighting sensors by their probable accuracy, precision, and tolerances. Many of these techniques are well known in the art.
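The flash-to-bang range computation described above can be sketched as follows; the temperature correction for the speed of sound is a standard approximation, and the function names are illustrative:

```python
def speed_of_sound(temp_c):
    """Approximate speed of sound in dry air (m/s) at temp_c degrees Celsius."""
    return 331.3 + 0.606 * temp_c

def flash_bang_range(t_flash, t_sound, temp_c=20.0):
    """Range (m) from the time difference between the optically detected
    muzzle flash (which arrives effectively instantly) and the acoustic
    report of the same shot."""
    dt = t_sound - t_flash
    if dt <= 0:
        raise ValueError("acoustic detection must follow the optical detection")
    return dt * speed_of_sound(temp_c)

# Example: flash seen at t = 0.000 s, report heard at t = 1.500 s, 20 C air:
# range is 1.5 * (331.3 + 0.606 * 20) = about 515 m.
```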
- Pre-sniper detection techniques using radar signature reflection of gun barrel are described in U.S. Pat. No. 8,049,659; as well as described in T. CIPARA; “Using Radar Signals to Safeguard Our Troops”; Mar. 15, 2011; George Mason University; Fairfax, Va.; USA. Other pre-sniper detection techniques using detection of glint or reflection from a scope, binocular, or even a human eye lens are described in part in U.S. Pat. App. No. 2008/0136626 and also described in part in H. HASHARON; “SLD500 Sniper Locator CILAS”; 2006; Defense Update International Online Defense Magazine;
Issue 2; Israel; N. SHACHTMAN; "Lasers Stop Snipers Before they Fire"; Apr. 26, 2007; Wired; New York, N.Y.; USA; as well as described in Torrey Pines Logic product brochures for their optical detection products, "Mirage 1200", "Sentinel S30", "Beam 50/60", and "Beam 1000"; San Diego, Calif., where the "Mirage 1200" is mentioned in D. CRANE; "Torrey Pines Logic Mirage-1200 and Myth-350 Handheld Sniper Detection Systems"; Dec. 8, 2008; Defense Review; Miami, Fla.; USA; Y. ASFAW; "Impact of Pose and Glasses on Face Detection Using the Red Eye Effect"; May 2003; CCECE 2003; IEEE; Montreal; Canada. A countermeasure to sniper scopes is described in M. NAIMARK; "How to ZAP a Camera: Using Lasers to Temporarily Neutralize Camera Sensors"; October 2002; USA. - Acoustic detection of gun fire is described in U.S. Pat. Nos. 7,796,470 and 6,178,141, as well as in U.S. Pat. App. No. 2010/0226210 and also described in G. L. DUCKWORTH; "Fixed and wearable acoustic counter-sniper systems for law enforcement"; Nov. 3, 1998; SPIE Proceedings Vol. 3577; Boston, Mass.; USA; M. V. SCANLON; "Networked Acoustic Sensor Array's Performance During 2004 Horizontal Fusion—Warrior's Edge Demonstration"; December 2004; US Army Research Laboratory; USA; J. DUNNIGAN; "Sniper Detectors Arrive"; Aug. 22, 2009; Strategy World; USA; F. SIMONIS; "Nanotechnology: innovation opportunities for tomorrow's defense"; March 2006; TNO Science & Industry; Netherlands; G. SIMON; "Sensor Network-Based Counter-sniper System"; Nov. 3, 2004; SenSys'04; Baltimore, Md.; USA; A. WHITE; "Fighting fire with fire: technology finds a solution to sniper attacks"; June 2009; pg. 52-57; Jane's International Defense Review; Englewood, Colo.; USA; J. KELLER; "Sniper-detection systems to provide perimeter security for Army forces in Afghanistan to come from Raytheon BBN"; Feb. 15, 2011; Military & Aero.
Elect.; USA; T. V. BROOK; "High-tech device helps U.S. troops pinpoint snipers"; Mar. 2, 2011; USA Today; USA; C. HUGHES; "British troops to get iPod-sized 'sniper finders' to take on deadly sharpshooters in Afghanistan"; Mar. 8, 2011; Daily Mirror; UK; T. HORNYAK; "U.S. troops getting wearable gunshot detectors"; Mar. 21, 2011; CNET News; San Francisco, Calif.; USA; A. BARRIE; "Sniper Detectors Coming to America's Heartland"; Dec. 22, 2011; FOX NEWS; USA.
- Barrel flash detection is described in U.S. Pat. Nos. 7,947,954 and 3,699,341, and in U.S. Pat. App. No. 2011/0095187, as well as in A. GOLDBERG; "Infrared Signatures of the Muzzle Flash of a 120 mm Tank Gun and their Implications for the Kinetic Energy Active Protection System (KEAPS)"; October 2001; USA; M. ISAAC et al.; "Infrared Detects Sniper Gunfire"; Oct. 29, 2005; Wired; New York, N.Y.; USA; S. A. MOROZ; "Airborne Deployment of and Recent Improvements to the Viper Counter Sniper System"; 1999; Naval Research Laboratory; Washington, D.C.; USA.
- Combined acoustic and optical fire detection systems are described in RAFAEL; “Anti-Sniper Systems Finding Their Range”; Nov. 3, 2005; Defense Industry Daily; USA; M. C. ERTEM; “An acoustic sensor for the viper infrared sniper detection system”; August 1999; Maryland Advanced Development Laboratory; Greenbelt, Md.; USA.
- The trajectory tracking of bullets fired is described in X. L. ZHANG; “Real-time tracking of bullet trajectory based on chirp transform in a multi-sensor multi-freq radar”; May 10, 2010; Radar Conference, 2010 IEEE; USA; Y. ZHANG; “Real-time acquisition and tracking of sniper bullets using multi-sensor multi-frequency radar techniques”; Aug. 31, 2009; SSP '09. IEEE; USA; BROWN, E. R.; “Ku-band retrodirective radar for ballistic projectile detection and tracking”; May 4, 2009; Radar Conference, 2009 IEEE; Pasadena, Calif.; USA.
- Systems and techniques that are able to return fire against snipers, or avoid fire, are described in U.S. Pat. Nos. 6,357,158; 7,484,451 and U.S. Pat. App. Nos. 2009/0320348, 2009/0292467, 2009/0290019, 2008/0291075, as well as in N. F. EVANS; “British Artillery Fire Control Ballistics & Data”; Apr. 11, 2010; Australia; F. FLINCH et. al.; “External Ballistics”; Jan. 10, 2012; Wikipedia; San Francisco, Calif.; USA.
- A general discussion of various other anti-sniper systems is included in D. CRANE; “Anti-Sniper/Sniper Detection/Gunfire Detection Systems at a Glance”; New and Future Technology; Jul. 19, 2006; Defense Review; Miami, Fla.; USA; P. SARKA; “iRobot and Boston Univ. Photonics Center Unveil Advanced Sniper Detection System for iRobot Packbot”; Oct. 3, 2005; iRobot Corp. Press Release; R. DOUGLAS; “The Objective Force Soldier/Soldier Team—Volume II—The Science and Technology Challenges”; November 2001; Army Science Board SAAL-ASB; Arlington, Va.; USA; C. CALLAN; “Sensors to Support the Soldier”; Feb. 3, 2005; pgs. 41-84; Jason the MITRE Corporation; McLean, Va.; USA; P. A. BUXBAUM; “Pinpointing Sniper Perches”; August, 2010; SOTECH 8.6; pgs. 11-14; KMI Media Group; Rockville, Md.; USA; A. NATIVI; “Counter-sniper Systems Detect Hidden Shooters”; Dec. 22, 2011; Aviation Week—The McGraw-Hill Co.; USA.
- We are unaware of any anti-sniper system that incorporates and integrates all of the methods described in the prior art while providing a seamless response at every stage of sniper interaction, from pre-detection to pre-fire warning, target assignment, fire detection, trajectory tracking, and coordinated fire response, along with neural-fuzzy reinforcement optimization. For an example application of reinforcement learning, see M. MCPARTLAND; "Reinforcement Learning in First Person Shooter Games"; March 2011; IEEE Transactions on Computational Intelligence and AI in Games; Vol. 3, No. 1; USA. This invention fully integrates all of these methods at every stage of sniper interaction, so that the anti-sniper system, through continuous vigilant sensor monitoring, can respond automatically, as if well prepared, to snipe the sniper in advance. It does this by continuously and autonomously monitoring and tracking the target(s), atmospheric conditions such as wind speed, wind direction, temperature, and air pressure, and unit positions, and by incorporating this data in real time with target bearings and planned, computed, optimal counter-sniper bullet trajectories based on ballistics (e.g., bullet/projectile mass, wind speed, distance to target, etc.).
- Disclosed is a rapid and accurate sniper-counteracting force response system that not only allows operators to respond immediately, but can also pre-empt the sniper by identifying sniper targets in advance: detecting movement or the presence of infrared signatures of objects by frame-by-frame image processing, as well as gun barrel radar detection, adjusting for vehicle motion and position, and utilizing existing information about the terrain. With a fast, autonomous, robotically gimbaled, zoomable camera, an operator can quickly scan and verify suspect targets. This can be done as a vehicle progresses through the field of operation, by target locking and tracking, while allowing the operator simply to press a "next target" (or "last target", or "next coordinated assigned target", like a target slide show) activation to quickly evaluate suspect targets and differentiate real from non-real targets. The return-fire weapon and rapid zoom camera can help an operator further evaluate, from a great distance, what the target is holding or doing, and if the target is verified as threatening, the anti-sniper system can fire at the target with great accuracy. Highly robust smooth image stabilizers, gimbals, and laser locking techniques, along with gyroscopes, can help stabilize, fix, and follow the highly zoomed (infrared and/or other) camera onto the target while the vehicle is still in motion, further enabling the operator to verify whether a target is threatening in advance of being sniped, allowing a pre-emptive snipe at a sniper. Anti-sniper systems can share critical data and coordinate actions with each other in real time in a firefight, such as friendly positions, target positions, friendly weapon vectors and trajectories, and friendly as well as target firing times.
- The anti-sniper camera system can also incorporate a multitude of zoomed cameras per target, as well as multiple robotic anti-sniper weapons, so that even more rapid target assessment and response can be made. The anti-sniper system's ultimate objective is to act as a very significant assured deterrent against firing any weapon at the anti-sniper system: to re-assert balance in asymmetrical warfare through mutually assured destruction of equal system capability against any verifiable threat of a gun being fired or pointed. This makes it a tremendous counter-incentive to firing a gun at, or even threatening, any force carrying a fully autonomous (with manual override) integrated anti-sniper system. It greatly reduces the element of chance involved, and it is a powerful deterrent not only to firing a weapon, but even to pointing one.
- FIG. 1 is a block diagram of the overall anti-sniper targeting and detection sharing system.
- FIG. 2 shows prior art calculations of how range can be estimated by combining acoustic sensors (microphones) with optics, taking advantage of the speed of light being much greater than that of sound to determine the range of gun fire.
- FIG. 3 shows the anti-sniper targeting and detection sharing system on top of an armored personnel vehicle, with a spherical stereoscopic camera & microphone system; a gimbaled weapon system with laser tracking and a zoom IR (or other suitable) camera mounted on a pole arm; and wind sensors, a differential global positioning system, a radar, and a glint detector on top of the spherical stereoscopic camera & microphone system.
- FIG. 4 shows a planar geometry of the field of view of a camera, with projection of a target onto the field of view, used to calculate target angles and range.
- FIG. 5 shows a planar geometry of a pair of stereoscopic cameras projected onto the plane of the robotic camera weapon laser arm.
- FIG. 6 shows a three dimensional perspective geometry of a stereoscopic camera pair with the robotic camera weapon laser arm, illustrating the calculations for rapidly and automatically determining the angles and range needed to position the zoomed camera/weapon system onto detected target(s) in rapid succession.
- FIG. 7 is a flow chart of the system process of detecting targets and rapidly and automatically positioning the zoom camera gyro stabilized laser weapon system onto the potential targets.
- FIG. 8 is a coordinated sniper-fire pre-detect, glint/barrel detect, fire detect, trajectory track, and coordinated fire return stage diagram showing how the coordinated systems function in action.
-
FIG. 1 shows a system block diagram of the anti-sniper targeting and detection system 2. A pre-snipe omni-directional sensor scope/eye glint/barrel IR/radar detection system 34 is shown connected to computer system 10 to detect evidence of snipers, scopes, and other optical sensors, as well as gun barrels, if they are present, before a sniper is able to fire. Countermeasures to anti-sniper detection use anti-reflective layers, as well as honeycomb shapes, on scope lenses. Some of these can be overcome by using different radar techniques, such as varying the radar frequency to resonate with the shape of the honeycomb, or sweeping the frequency over the range of possible shapes. - The
anti-sniper system 2 is shown utilizing a spherical or omni-directional high speed stereoscopic IR and/or visible depth camera and stereoscopic or omni-directional microphone system 4, which contains a spherical (omni-directional) high speed stereoscopic infrared (IR, or other appropriate, such as RGB—red, green, blue—ranging, time-of-flight) depth camera system 6, as well as a spherical omni-directional microphone system for left ear orientation 8A and a spherical microphone system for right ear orientation 8B. The spherical (omni-directional) microphone system can not only be used to detect the source bearing (azimuth and elevation) of initial ordnance firing, but can also detect trajectory from the bullet whizzing sound if the initial firing was not acoustically detected, such as when a weapon silencer is used. - The computer or
micro-controller system 10 can have terrain and earth curvature data to use in projectile calculations. The computer can also process target data from the camera/microphone system 4 as well as from other sensors. The sensors can include a Differential Global Positioning System (DGPS) 14; bullet trajectory radar system 32; accelerometers, compass, and gyros 12 used to stabilize the zoom-able gimbaled IR and/or visible camera 16 and weapon 18; and wind direction, air temperature, air pressure, wind speed, or other sensors 20, to calculate source and trajectory to/from target information. Target information is exchanged with other anti-sniper systems 2, for real-time triangulation and target location fixes, through high speed wireless communications 26. Bullet trajectory radar 32 can provide near instantaneous ordnance collision avoidance warning commands by determining the ordnance trajectory path and anti-sniper system 2 unit positions. Microphones 8A can be used to detect bullet impact sounds to verify trajectory tracking performance from trajectory radar 32; the sound recorded from bullets whizzing through the air near the microphones 8A can also be used for this verification. Warning commands such as halt, duck, move left, move right, move forward, and move backward can be annunciated on speakers in or outside computer 10, or displayed on an anti-sniper system 2 HUD display input control 24, for bullets fired, detected, and tracked at long range. -
Other wireless communications 26 data can be sent to and from other remote systems 27, to be relayed or routed, such as through satellites, drones, aircraft, or other vehicles. Processed target data is used to rapidly position the gimbaled weapon system with laser designator 18, which can be mechanically connected to the automatically or manually zoom-able gimbaled IR and/or visible camera 16 through computer 10. Multiple sniper targets can be assigned, shuffled, prioritized, and have their status tracked and shared amongst multiple anti-sniper systems 2 to autonomously coordinate a rapid anti-sniper response, optimally assigning sniper targets to each crew based on unit position, status, and weapons capabilities. The robotic gimbaled weapon system with laser designator 18 and zoom-able gimbaled visible and IR camera 16 can rapidly and automatically swing into the position of highest sniper probability based on prior history/intelligence data, and also has manual over-ride capability by a human operator. It can be made to move at high speed, faster than any human can move, and to fire back dynamically with more accuracy and precision than a human sniper, even while the vehicle is in motion, by using gyros with high speed actuators and an automatically stabilizing shock absorbing mast/boom, where the human decision to fire is made from the zoomed scope view. To further enhance response time, the gimbaled weapon 18 itself can be a high powered laser. The gimbaled weapon 18 can also act purely as a target designator, working in coordination with an aircraft, ship, or other weapon system. -
Computer 10 can display target data, including the zoomed target, on a HUD (Heads Up Display) with input controls and speaker/alarm 24 for user 30. If user 30 determines that a target is real and is a threat, the user can fire at the target using the gimbaled weapon (rifle, automatic weapon, missile, high powered laser, or other weapon) system with laser designator 18, controlled by user weapon fire control 22 via fire control switch 28. The anti-sniper system 2 can work independently of other anti-sniper system 2 units while also being able to join in as a coordinated crew or break off as needed. The sensors of the anti-sniper system 2 can self-check and report whether they are valid, invalid, or failed by marking the sensor data accordingly. The anti-sniper system 2 detection and fire response can incorporate neural-fuzzy reinforcement learning technology to optimize detection and response. A high speed rapid response, such as returning sniper fire when fired upon from a fast passing vehicle, can be incorporated into the anti-sniper system 2. Incorporating an autonomous (or semi-autonomous) zoom camera system as well as a radar can be useful for preventing false alarms that could be triggered in acoustic and fire flash detection systems alone by other events, such as firecrackers being ignited. - Multi
anti-sniper system 2 target assignments can be both independent and dependent, or handled by a ranked order of anti-sniper systems 2 such that one anti-sniper system unit 2 acts as a target assignment controller, which can automatically hand off target assignment control to other anti-sniper units 2 as units are removed from and added to the system. -
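A target assignment controller of this kind could, for example, greedily assign each detected target to the nearest unit with line of sight, spreading targets one per unit where possible. This is an illustrative sketch; the greedy policy, names, and data shapes are assumptions, not the patent's algorithm:

```python
from math import hypot

def assign_targets(units, targets, has_los):
    """units, targets: dicts of id -> (x, y) position in meters.
    has_los(unit_id, target_id) -> True if the unit can see the target.
    Returns a dict target_id -> unit_id."""
    pairs = sorted(
        (hypot(tx - ux, ty - uy), u, t)
        for u, (ux, uy) in units.items()
        for t, (tx, ty) in targets.items()
        if has_los(u, t)
    )
    assignment, busy = {}, set()
    for _, u, t in pairs:                 # nearest visible pairs first,
        if t not in assignment and u not in busy:
            assignment[t] = u             # one target per unit at first
            busy.add(u)
    for _, u, t in pairs:                 # leftover targets go to their
        if t not in assignment:           # nearest visible unit, even busy
            assignment[t] = u
    return assignment
```

The hand-off of the controller role as units join and leave the network would sit above this, re-running the assignment whenever the unit or target sets change.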
FIG. 2 illustrates using fire detection to determine the target range by taking the difference between two measured events: the first non-friendly identified gun fire sound wave of highest magnitude, and the detected non-friendly identified muzzle flash heat signature pixel in the infrared camera. Friendlies and non-friendlies can be clearly identified and displayed on a Heads Up Display (such as on the user 30 HUD 24 of FIG. 1). The gun fire sound has a peak signal 38, shown in the upper graph of microphone fire detection magnitude 50 amongst sound echoes 40 and noise 52, where the gun fire sound signal starts at ts 42. The sound from known friendly fire can be filtered out, based on the known time, duration, and pulse width of friendly fire and the relative friendly firing positions (all wirelessly transferred within the system), thus reducing false alarms and system confusion during a fire fight with many bullets being fired. The lower graph shows the IR camera muzzle flash/heat detection magnitude 53, with the peak detection 44 at time tf 46 shown amongst signal noise 54. The range to the sniper can then be calculated by subtracting the two times and multiplying by the speed of sound in air, as shown at 48. Just as friendly acoustic fire sounds can be filtered, friendly muzzle flashes can be identified and filtered via high speed real-time encrypted position network transfer, where laser communications can be used in case spread spectrum radio frequency jamming is occurring. -
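The friendly-fire filtering described above can be sketched as a time-gating step: a shot reported over the network by a friendly unit at a known time and distance predicts an acoustic arrival time, and detections matching a prediction are discarded. The tolerance and data shapes below are illustrative assumptions:

```python
def filter_hostile(detections, friendly_shots, c_sound=343.0, tol=0.05):
    """detections: list of acoustic detection times (s).
    friendly_shots: list of (fire_time_s, distance_m) pairs reported by
    friendly units over the network. Returns only the detection times not
    explained by a friendly shot within tol seconds."""
    def is_friendly(t):
        return any(abs(t - (t0 + d / c_sound)) <= tol
                   for t0, d in friendly_shots)
    return [t for t in detections if not is_friendly(t)]
```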
FIG. 3 shows the anti-sniper targeting and detection system applied to an armored personnel vehicle 56 on ground surface 62, where a mounting post 58 supports the spherical high speed stereoscopic depth camera and omni-directional microphone system 4 as well as fire control arm 60 with zoom-able gimbaled infrared (or other appropriate) camera 16 and a weapon with laser designator system 18, whereby if a target is an unfriendly tank or similar, a drone or aerial strike can be called in on the target. Mounted on top of the spherical high speed stereoscopic camera and microphone system 4 are accelerometers and gyros 12, wind speed and direction sensors 20, and a differential global positioning system 14, all used to more accurately aim the fire control arm onto target 76 on mountain terrain 64. Next target 78 in the system targeting sequence is where fire control arm 60 can rapidly and automatically move and zoom to. The field of view 68, including sky 66 and mountains 64, of one camera of the spherical stereoscopic camera system 4 is shown with edges 70. Gyros 12 can also be mounted in fire control arm 60 as well as on camera 16, with any means necessary for shock absorption. Zoom camera 16 can also be mounted on an independent robotic arm (not shown) of the fire arm 60 such that the zoomed view is maintained even while firing. -
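One correction the wind sensors support, noted in the STAGE III discussion, is recovering the true wind from the anemometer on a moving vehicle: the sensor reads apparent wind, which is the true wind minus the vehicle's velocity. A minimal 2-D sketch; the east/north vector convention is an assumption:

```python
def true_wind(apparent_wind, vehicle_velocity):
    """Both arguments are (east, north) vectors in m/s. The anemometer's
    apparent wind equals true wind minus vehicle velocity, so adding the
    vehicle velocity back recovers the true wind."""
    ax, ay = apparent_wind
    vx, vy = vehicle_velocity
    return (ax + vx, ay + vy)

# A vehicle driving east at 10 m/s through still air feels a 10 m/s
# headwind from the east; adding its velocity back gives zero true wind.
```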
FIG. 4 shows the surface geometry of one camera 6 with field of view 68, field of view projection edges 70, and target horizontal projection point 80. The angle of the target, θT, can be calculated from the distances shown, given the angle of the field of view (2×θH). -
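For an idealized pinhole camera, the FIG. 4 relationship between a target's horizontal projection point and its bearing can be sketched as follows; the pixel-based parameterization and function name are illustrative assumptions:

```python
from math import atan, tan, radians, degrees

def target_bearing_deg(x_px, width_px, fov_deg):
    """Horizontal bearing (degrees, positive to the right of the optical
    axis) of a target whose projection falls at pixel column x_px, for an
    image width_px pixels wide spanning a full horizontal field of view
    fov_deg (the 2 x thetaH of FIG. 4)."""
    half_w = width_px / 2.0
    f_px = half_w / tan(radians(fov_deg) / 2.0)  # focal length in pixels
    return degrees(atan((x_px - half_w) / f_px))
```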
FIG. 5 shows the surface geometry of a stereoscopic camera pair 6 with the control fire arm and target projection point 80 also projected onto the same plane. Given the distance to the fire control arm, the distance between the stereoscopic cameras, the length between the camera focus and the center point of rotation of the control arm, and the distances provided, the horizontal angle of the control fire arm can be calculated. -
FIG. 6 shows a three dimensional perspective geometry of one spherical stereoscopic camera pair with microphone 4 (that would receive the highest amplitude gun fire sound) mounted on support post 58, and the fire control arm, with target 76 projected onto camera viewing plane 68 via straight line vector 72 to target 76. The return fire trajectory is not shown, but would be a slight arc, due to gravitation on the path to the target. The center of the right camera field of view 82 as well as the center of the left camera field of view 84 is shown on camera viewing plane 68. Target 76 is shown projected at 80 onto a plane perpendicular to plane 68 that intersects the fire control arm, allowing the angles to be computed that rapidly position the fire control arm onto target 76, or other targets in assigned or selected sequence, from the distances provided by the stereoscopic camera as well as the fire control arm rotated position αg. -
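For an idealized parallel camera pair, the stereo geometry of FIGS. 5-6 reduces to triangulation from pixel disparity, after which the arm's pan angle follows from the target's lateral offset and range. A hedged sketch; the idealized model and parameter names are assumptions, not the patent's exact derivation:

```python
from math import atan2, degrees, tan, radians

def stereo_range(disparity_px, baseline_m, width_px, fov_deg):
    """Range (m) to a point seen disparity_px pixels apart by two parallel
    cameras baseline_m apart, each width_px wide with full horizontal
    field of view fov_deg: range = focal_length_px * baseline / disparity."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite range")
    f_px = (width_px / 2.0) / tan(radians(fov_deg) / 2.0)
    return f_px * baseline_m / disparity_px

def arm_pan_deg(lateral_m, range_m, arm_setback_m=0.0):
    """Pan angle (degrees) for a fire control arm mounted arm_setback_m
    behind the camera baseline, aimed at a target lateral_m to the side
    and range_m straight ahead."""
    return degrees(atan2(lateral_m, range_m + arm_setback_m))
```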
FIG. 7 shows a flow chart of the anti-sniper targeting system. The system scans via radar/laser/optics for evidence of human presence/movement(s); barrel, scope, eye, binocular, or sensor glint; and muzzle event(s) in IR images as well as from microphones, in process block 100. If targets are detected at decision block 102, the locations and ranges of the target(s) are computed at process block 104. At process block 104, valid target information as well as other sensor data is shared wirelessly amongst networked anti-sniper systems. Targets are assigned to anti-sniper systems such that all target assignments are distributed optimally, making them easiest and most effective to hit back, for example by closest range and best line of sight bearing to an anti-sniper unit. Zoomed target camera data can also be shared amongst anti-sniper systems. Target assignment can be one to one, many to one, or assigned in a maximally optimal tactical manner. - Wind speed, wind direction, temperature, pressure, and other ballistic factor sensors are measured at
process block 106, distributed amongst units, and estimated by averaging multiple valid sensors to determine any firing compensation required over long ranges. The fire control arm is then moved to a coordinated assigned target vector, with the firing angle adjusted (automatically and/or manually) for aggregated sensor wind speed, direction, temperature, air pressure, and range, the camera zoomed onto the target, and the weapon fired if commanded, at process block 108, where the post-fire status of the target, if hit, is given an assigned probability of being disabled, and this probability is reported to other anti-sniper systems. The system checks for shutdown at decision block 110; if not shutting down, a next-target check occurs at decision block 112, and if there is a next target, the control arm is rapidly positioned onto it. If there are no new targets, further target scanning occurs. -
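The flow chart of FIG. 7 can be summarized as a control loop; all of the callbacks below are placeholders standing in for the sensors, network, and actuators (an illustrative sketch, not the patent's software):

```python
def anti_sniper_loop(scan, locate, share, read_ballistics, aim,
                     fire_authorized, fire, shutdown_requested):
    """One pass per scan cycle: detect (blocks 100/102), locate and share
    (block 104), measure ballistic factors (block 106), aim and optionally
    fire (block 108), then check shutdown (110) and the next target (112)."""
    while not shutdown_requested():
        for target in scan():                 # empty list: keep scanning
            position = locate(target)         # bearing and range
            share(target, position)           # distribute to networked units
            correction = read_ballistics()    # wind, temperature, pressure
            aim(position, correction)         # move the fire control arm
            if fire_authorized(target):       # human verification step
                fire(target)
```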
FIG. 8 shows how the coordinated anti-sniper systems can work in action to detect and suppress sniper fire. This is shown in five stages: STAGE I 200, pre-detect; STAGE II 202, barrel/glint detect; STAGE III 204, fire detect; STAGE IV 206, trajectory track; STAGE V 208, coordinated fire return. Vehicles in motion can fire in response while in motion, using calculations that adjust based on the vehicle's telemetry data. Alternatively, the first vehicle 56A may detect the sniper and then pass out of the sniper's line of sight, while the second vehicle 56B comes into range (line of sight) and is automatically assigned the target because of its position, responding programmatically and near instantly, with near instantaneous camera zoom onto the sniper for verification and firing upon, as the real time target data is wirelessly transferred from 56A to 56B. This data can be passed on to all vehicles and personnel (56A, 56B, 56C, and 56D) in the group so that each passing vehicle and person can fire back, with the crew visually identifying and verifying the target before firing. Although not shown in FIG. 8, upon sniper detection, warning terrain zones can be clearly marked in real time on each unit's HUD (particularly useful for dismounted unit 56D) for anything within line of sight or within projectile range of a detected hostile. This automatic target sharing system with manual override is not limited to vehicles; it can be applied to aircraft, ships, or a combination. - In STAGE I 200 of
FIG. 8, three armored personnel vehicles carry anti-sniper systems 2, where the forward personnel vehicle 56A can be made specialized in IED (Improvised Explosive Device)/land mine detection. Dismounted person 56D, with a miniaturized (trimmed down) soldier-carried anti-sniper system 2, is shown between the armored personnel vehicles. The vehicles and dismounted person 56D are shown travelling from left to right in front of mountainous terrain 64 under sky 66. A tree 67 is shown next to a building 57 with hidden sniper 71D. Occlusion regions that are out of sight of the anti-sniper system 2 sensors are shown as 90A and 90B. These occlusion regions can be calculated and displayed on the anti-sniper HUD using terrain databases and data from depth sensors. Other undetected sniper positions are shown as 71A, 71B, and 71C along mountain ridges 64. - In STAGE II 202 of
FIG. 8, the three armored personnel vehicles and dismounted person 56D are shown moved slightly forward, with occlusion zones 90A and 90B updated accordingly, where armored personnel vehicle 56A detects barrel or glint, or detects barrel radar reflection, of three snipers via non-occluded sensor sights 86, whereby gimbaled weapon system 18 is automatically locked in and zoomed onto detected sniper target 71A via planned calculated counter fire trajectory path 87. Detected sniper targets are shared amongst the armored personnel vehicles and dismounted person 56D, and can also be encrypted and communicated in real-time to other systems in the network (27, through wireless communications 26 of FIG. 1, such as through satellite). - In STAGE III, after sniper units were preemptively detected via IR glint or radar detection of barrel in STAGE II, target data was passed to dismounted
person 56D as a preemptive warning, and the anti-sniper system (2 of FIG. 1) automatically recommends that person 56D take an anti-sniper firing position on top of small hill 64 based on terrain data calculations, indicating optimal positioning directions and annunciating sniper warnings in dismounted person 56D's HUD. Dismounted person 56D scopes out and aims at sniper 71A with line of sight 88, awaiting further command and automatically reporting to the other anti-sniper system (2 of FIG. 1) units that target 71A is acquired by dismounted person 56D. Sniper 71D inside building 57 is spotted by armored personnel vehicle 56A's anti-sniper system (2 of FIG. 1) via glint/barrel detection path 86, and the vehicle weapon is automatically positioned and locked onto planned trajectory path 87. Armored personnel vehicle 56B, shown in front of tree 67, is locked onto detected sniper 71C via glint/barrel sighting 86; its planned-fire arm is rapidly and autonomously rotated into position along the calculated and sensed optimal trajectory 87, based on real wind magnitude and angle acquired from the wind direction and magnitude sensors, which can be autonomously corrected for apparent wind from vehicle motion as well as aggregated amongst sensors. - Friendly unit positions (56A, 56B, 56C, and 56D), weapon angle, weapon target, and weapon or other ordnance firing (to differentiate from enemy fire/explosions, such as by HUD color display indicating whether a firing/explosion was friendly or hostile) are autonomously reported, shared, and tracked in real time, encrypted and wirelessly transferred to avoid friendly fire incidents and to optimize coordination and response through target sharing, where data can be displayed/annunciated on all personnel HUDs. Friendlies are clearly identified and marked in the HUD display. Sniper targets 71B and 71A are also detected by IR glint and/or radar barrel detection by
armored personnel vehicle sensor detection paths 86, where armored personnel vehicle 56B has sniper 71C locked on with automated-positioning fire arm planned trajectory path 87, and where the IR muzzle flash of sniper 71C's weapon indicates that the weapon has fired, further verifying the position if it was not preemptively detected. Snipers can also be preemptively detected, targeted, and tracked (via image processing or otherwise) by IR heat signature, which can be visibly distinguished from terrain. Sniper 71B is shown scoped out on line of sensor sight 86 by armored personnel vehicle 56C, whose weapon is rapidly and automatically positioned, adapted, and optimized based on calculated real wind and vehicle motion, along with trajectory equations using sniper 71B's three-dimensional relative position. The snipers are tracked along sensor sight paths 86. Sniper target 71A, 71B, 71C, and 71D statuses, such as firing trajectories, no longer firing, appears injured, or killed, are reported and distributed as marked probabilities in real time. If no snipers were preemptively detected, they can be otherwise detected at STAGE III by their weapon firing. - In
STAGE IV, sniper 71C's bullet 304 has its trajectory 300 tracked in real time by high-speed bullet-tracking radar (32 of FIG. 1) sight lines 302 on the armored personnel vehicles, using the mean of multiple triangulated anti-sniper unit (2 of FIG. 1) sensor estimates with spurious data and noise thrown out. Tracking the fired bullet confirms that sniper 71C is hostile, whereby a highly coordinated, efficient, semi-autonomous rapid return-fire response occurs at STAGE V: gyro-stabilized robotic weapon zoom views can be rapidly shuffled through for autonomous zoomed visual inspection to verify whether each target is hostile. Targets can be rapidly assigned in real time to optimal units based on position, available weapons capability, threat level, and target type. - At STAGE V the targets were verified hostile, engaged, and destroyed in rapid, near-simultaneous succession, as shown by the X's. If targets were missed or new targets were found, targets can be re-assigned and transferred between units rapidly and autonomously in real time, with target status (Engaged, New, Detected, Tracked, Lost, Partly Destroyed, Destroyed) continually updated, and with the number of rounds, round types, and round source into each target tracked, recorded, and reported, as well as used for computing the probability that a target is disabled. A performance recording and measurement system can be incorporated for post-battle analysis, whereby successful and unsuccessful engagements can be reinforced into the neural-fuzzy reinforcement system, based on parameters such as injury types, numbers, and fatalities, if any, to improve the system through the Monte Carlo process.
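The Stage IV fusion step above, taking the mean of multiple triangulated anti-sniper unit sensor estimates while throwing out spurious data and noise, can be sketched as a simple outlier-rejecting average. This is an illustrative assumption, not the patent's actual algorithm: the function name, the k-sigma rejection rule, and the per-axis treatment are all hypothetical.

```python
from statistics import mean, stdev

def fused_position(estimates, k=2.0):
    """Fuse per-unit triangulated (x, y, z) estimates of one bullet-track
    sample, discarding spurious outliers before averaging.

    A point is dropped if, on any axis, it lies more than k standard
    deviations from that axis mean; the survivors are averaged.
    """
    if len(estimates) < 3:  # too few points to judge outliers
        return tuple(mean(axis) for axis in zip(*estimates))
    axes = list(zip(*estimates))
    means = [mean(a) for a in axes]
    sds = [stdev(a) for a in axes]
    kept = [p for p in estimates
            if all(sd == 0 or abs(c - m) <= k * sd
                   for c, m, sd in zip(p, means, sds))]
    if not kept:  # everything rejected: fall back to the raw mean
        kept = estimates
    return tuple(mean(axis) for axis in zip(*kept))
```

With five consistent unit estimates and one spurious radar return, the spurious point is rejected and the fused track sample stays near the consensus position.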
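The Stage V rule of rapidly assigning targets to optimal units by position, capability, and threat level can be sketched as a greedy matcher. The data layout, field names, and threat-ordering below are hypothetical illustrations, not the patent's specified method.

```python
from math import hypot

def assign_targets(units, targets):
    """Greedily assign each target to the best available unit.

    units: {unit_id: {"pos": (x, y), "has_los": bool, "capable": bool}}
    targets: {target_id: {"pos": (x, y), "threat": float}}
    Targets are handled in descending threat order; each is given to the
    nearest not-yet-busy unit that is capable and has line of sight.
    A target with no eligible unit maps to None and stays shared for
    later hand-off (e.g. when a unit regains line of sight).
    """
    assignments, busy = {}, set()
    for tid, tgt in sorted(targets.items(), key=lambda kv: -kv[1]["threat"]):
        eligible = [(hypot(u["pos"][0] - tgt["pos"][0],
                           u["pos"][1] - tgt["pos"][1]), uid)
                    for uid, u in units.items()
                    if uid not in busy and u["has_los"] and u["capable"]]
        if eligible:
            _, uid = min(eligible)  # nearest eligible unit wins
            assignments[tid] = uid
            busy.add(uid)
        else:
            assignments[tid] = None
    return assignments
```

This also models the hand-off case from earlier in the description: a unit that detected the sniper but has since lost line of sight is simply ineligible, so the target is assigned to a unit that can still see it.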
- To help avoid friendly fire, each
anti-sniper system 2 can share its gun position, using gun orientation sensors to provide gun barrel angle, and its complete calculated planned fire trajectory, which can be displayed on the HUD of each anti-sniper system 2 user, where anti-sniper system 2 users are clearly identified on the HUDs.
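A geometric sketch of how the shared barrel angles and planned fire trajectories could be checked against friendly positions before a shot is released. The 2D simplification, function name, and the 25 m default safety radius are assumptions for illustration only.

```python
from math import cos, radians, sin

def friendly_in_line_of_fire(shooter_xy, bearing_deg, friendlies,
                             max_range, safety_radius=25.0):
    """Return ids of friendly units standing dangerously close to a
    planned 2D fire line, so the HUD can warn before firing.

    Projects each friendly position onto the fire ray from the shooter
    and checks the perpendicular miss distance against safety_radius.
    friendlies: {unit_id: (x, y)} from the shared position reports.
    """
    dx, dy = cos(radians(bearing_deg)), sin(radians(bearing_deg))
    at_risk = []
    for fid, (fx, fy) in friendlies.items():
        rx, ry = fx - shooter_xy[0], fy - shooter_xy[1]
        along = rx * dx + ry * dy          # distance along the fire ray
        if 0.0 <= along <= max_range:      # in front of the muzzle, in range
            miss = abs(rx * dy - ry * dx)  # perpendicular offset from ray
            if miss < safety_radius:
                at_risk.append(fid)
    return at_risk
```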
- 2 anti-sniper targeting and detection system
- 4 spherical high speed stereoscopic IR and/or visible camera and stereoscopic microphone system
- 6 spherical high speed stereoscopic IR and/or visible camera system
- 8A spherical microphone system left ears
- 8B spherical microphone system right ears
- 10 computer system
- 12 accelerometers, compass, gyros/inertial reference (for use when GPS is degraded or unavailable)
- 14 differential or other global positioning system (GPS) navigation sensor, combined with omni-directional RGB-D (red, green, blue, depth) camera, along with glint IR laser detection system, barrel and bullet trajectory tracking radar detector
- 16 zoom-able gimbaled camera
- 18 gimbaled weapon system with laser designator, gyros
- 20 wind direction and speed sensor
- 22 user weapon fire control interface
- 24 user display and input control
- 26 wireless communications system
- 27 other networked systems such as satellites, drones, aircraft, robots
- 28 fire control switch
- 30 user
- 32 bullet trajectory radar
- 34 omni-directional pre-snipe (i.e. laser to scope/eye glint, radar, barrel) sensor detection system
- 38 muzzle sound detection peak
- 40 muzzle sound echoes
- 42 time detected at start of muzzle firing peak
- 44 IR muzzle heat signature
- 46 time detected at start of muzzle heat signature
- 48 range equation using combination of detected acoustic and IR peak times
- 50 magnitude axis of acoustic signal
- 52 ambient microphone noise
- 53 magnitude axis of IR signal
- 54 ambient IR pixel noise
- 56A forward armored personnel vehicle that carries the muzzle event target detection system (2), that can also contain a mine or Improvised Explosive Device (IED) detection system
- 56D dismounted person that carries a miniature or trimmed down anti-sniping system (2)
- 57 building
- 58 system mounting post
- 60 swivel beam to support gimbaled weapons system with laser designator (18) and zoom-able gimbaled IR and/or visible camera (16)
- 62 road
- 64 terrain
- 66 sky
- 67 tree
- 68 camera view rectangle/square
- 70 camera projected corner through 3D space
- 71 target
- 72 true target (T1) vector
- 74 true target (T2) vector
- 76 target (T1) locked & engaged highlighted circle with square
- 78 target (T2) lock highlighted circle
- 80 target projected in plane at camera level or gun plane
- 82 right camera center pixel
- 84 left camera center pixel
- 86 target detected by either glint from IR laser or radar barrel
- 87 robotic fire arm target automatic engagement lock
- 88 manual target engagement lock
- 90 occlusion zones
- 90A left occlusion zone
- 90B right occlusion zone
- 100 Flowchart process block: Scan for evidence of human presence/movement(s), as well as muzzle event(s) in IR images as well as microphones
- 102 Flowchart condition block: Target or muzzle event(s) detected?
- 104 Flowchart process block: Locate Target(s) and compute range(s)
- 106 Flowchart process block: determine wind speed & direction
- 108 Flowchart process block: move firing arm to target vector adjusting for wind speed & direction, range zoom camera to target, fire, adjust, if commanded
- 110 Flowchart condition block: shutdown?
- 112 Flowchart condition block: next target?
- 200 pre-detect stage I (no targets detected)
- 202 Barrel/glint detection stage II (target detected via radar or IR/laser glint of scope/eye: pre-emptive offensive fire opportunity if verified threat, such as through zoomed visual/IR camera)
- 204 Fire detection stage III (fire from target detected via IR muzzle flash, and/or acoustic array)
- 206 Bullet trajectory tracking stage IV
- 208 Coordinated return fire stage V
- 300 bullet trajectory
- 302 bullet trajectory radar reflection
- 304 bullet
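Item 48's range equation, which combines the detected acoustic peak time (item 42) and IR muzzle-flash peak time (item 46), exploits the fact that the flash arrives essentially instantaneously while the muzzle report travels at the speed of sound. A minimal sketch follows; the linear temperature correction and the function name are illustrative assumptions.

```python
def range_from_flash_bang(t_flash, t_sound, temp_c=20.0):
    """Estimate range (m) to a muzzle event from IR-flash and sound
    arrival times (s).

    The IR flash arrives effectively at light speed and the report at
    the speed of sound, so range = (t_sound - t_flash) * c_sound, with
    the speed of sound corrected for air temperature:
    c = 331.3 + 0.606 * T  (m/s, T in degrees C).
    """
    c_sound = 331.3 + 0.606 * temp_c
    dt = t_sound - t_flash
    if dt < 0:
        raise ValueError("muzzle sound cannot arrive before the flash")
    return dt * c_sound
```

A one-second flash-to-bang gap at 20 degrees C corresponds to roughly 343 m.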
- The anti-sniper targeting and detection system operates by automatically detecting target(s) in advance, calculating the target positions relative to the system, computing the fire control arm angles based on aggregate sensor data and trajectory calculations, and rapidly moving the fire control arm to each target in an assigned, shared priority sequence in which targets can be viewed, assessed, verified as unfriendly, and fired upon in rapid succession. If the vehicle and targets are moving, the fire control arm and target data can be continually adjusted accordingly.
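A first-order sketch of the fire control arm angle computation described above. It uses only geometry plus a vacuum-trajectory drop correction of 0.5*asin(g*R/v^2); the wind and vehicle-motion terms that the patent folds into its trajectory calculations are deliberately omitted, and every name here is a hypothetical illustration rather than the patent's method.

```python
from math import asin, atan2, degrees, hypot

def fire_control_angles(rel_target, muzzle_velocity):
    """Compute gimbal azimuth and a first-order elevation for the fire arm.

    rel_target: (east, north, up) of the target relative to the weapon, m.
    Elevation combines the geometric angle to the target with the vacuum
    ballistic drop correction 0.5*asin(g*R/v^2), a simplified stand-in
    for a full trajectory solution.
    """
    g = 9.81
    e, n, u = rel_target
    ground_range = hypot(e, n)
    azimuth = degrees(atan2(e, n))              # clockwise from north
    geometric = degrees(atan2(u, ground_range)) # straight-line angle up
    x = g * ground_range / muzzle_velocity ** 2
    if x > 1.0:
        raise ValueError("target beyond maximum ballistic range")
    drop_correction = 0.5 * degrees(asin(x))
    return azimuth, geometric + drop_correction
```

For a level target 800 m due north and a 900 m/s muzzle velocity, the drop correction raises the arm by roughly a quarter of a degree.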
Claims (1)
1. An anti-sniper targeting and detection system comprising:
a. a camera,
b. a microphone,
c. a robotic weapon mounted with a zoomable camera,
d. a data processing system whereby targets are detected from said camera and microphone, target positions are computed, and gimbal angles are computed to move to targets,
e. a radar system,
f. a transceiver, and
g. a heads up display.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/385,040 US20130192451A1 (en) | 2011-06-20 | 2012-01-30 | Anti-sniper targeting and detection system |
US14/660,661 US9488442B2 (en) | 2011-06-20 | 2015-03-17 | Anti-sniper targeting and detection system |
Applications Claiming Priority (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201161571113P | 2011-06-20 | 2011-06-20 | |
US201161575131P | 2011-08-16 | 2011-08-16 | |
US201161626702P | 2011-09-30 | 2011-09-30 | |
US201161626701P | 2011-09-30 | 2011-09-30 | |
US13/385,040 US20130192451A1 (en) | 2011-06-20 | 2012-01-30 | Anti-sniper targeting and detection system |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/660,661 Continuation-In-Part US9488442B2 (en) | 2011-06-20 | 2015-03-17 | Anti-sniper targeting and detection system |
Publications (1)
Publication Number | Publication Date |
---|---|
US20130192451A1 true US20130192451A1 (en) | 2013-08-01 |
Family
ID=48869129
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/385,040 Abandoned US20130192451A1 (en) | 2011-06-20 | 2012-01-30 | Anti-sniper targeting and detection system |
Country Status (1)
Country | Link |
---|---|
US (1) | US20130192451A1 (en) |
- 2012-01-30 US US13/385,040 patent/US20130192451A1/en not_active Abandoned
Cited By (48)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20150345907A1 (en) * | 2011-06-20 | 2015-12-03 | Real Time Companies | Anti-sniper targeting and detection system |
US9488442B2 (en) * | 2011-06-20 | 2016-11-08 | Real Time Companies, LLC | Anti-sniper targeting and detection system |
US10026165B1 (en) * | 2011-07-05 | 2018-07-17 | Bernard Fryshman | Object image recognition and instant active response |
US20180204320A1 (en) * | 2011-07-05 | 2018-07-19 | Bernard Fryshman | Object image recognition and instant active response |
US20150192667A1 (en) * | 2011-07-21 | 2015-07-09 | James W. Rakeman | Optically Augmented Weapon Locating System and Methods of Use |
US9234963B2 (en) * | 2011-07-21 | 2016-01-12 | Thales-Raytheon Systems Company Llc | Optically augmented weapon locating system and methods of use |
US20140375638A1 (en) * | 2012-07-30 | 2014-12-25 | Mitsubishi Electric Corporation | Map display device |
US20150273341A1 (en) * | 2013-02-11 | 2015-10-01 | University Of Southern California | Optimal patrol strategy for protecting moving targets with multiple mobile resources |
US9931573B2 (en) * | 2013-02-11 | 2018-04-03 | University Of Southern California | Optimal patrol strategy for protecting moving targets with multiple mobile resources |
US20150287224A1 (en) * | 2013-10-01 | 2015-10-08 | Technology Service Corporation | Virtual tracer methods and systems |
US10852428B2 (en) | 2014-02-21 | 2020-12-01 | FLIR Belgium BVBA | 3D scene annotation and enhancement systems and methods |
EP3123097B1 (en) | 2014-03-28 | 2018-05-09 | Safran Electronics & Defense | Armed optoelectronic turret |
US10802141B2 (en) | 2014-05-30 | 2020-10-13 | FLIR Belgium BVBA | Water temperature overlay systems and methods |
FR3023621A1 (en) * | 2014-07-08 | 2016-01-15 | Bertin Technologies Sa | THREAT DETECTION DEVICE |
US10931934B2 (en) * | 2014-09-02 | 2021-02-23 | FLIR Belgium BVBA | Watercraft thermal monitoring systems and methods |
US10191153B2 (en) | 2014-09-02 | 2019-01-29 | Flir Systems, Inc. | Augmented reality sonar imagery systems and methods |
US11181637B2 (en) | 2014-09-02 | 2021-11-23 | FLIR Belgium BVBA | Three dimensional target selection systems and methods |
US20160214534A1 (en) * | 2014-09-02 | 2016-07-28 | FLIR Belgium BVBA | Watercraft thermal monitoring systems and methods |
US10677921B2 (en) | 2014-09-02 | 2020-06-09 | FLIR Belgium BVBA | Casting guidance systems and methods |
US10444349B2 (en) | 2014-09-02 | 2019-10-15 | FLIR Belgium BVBA | Waypoint sharing systems and methods |
DE102014017943A1 (en) * | 2014-12-05 | 2016-06-09 | Thyssenkrupp Ag | System and method for identifying and countering threats, especially in asymmetric threat situations |
WO2016087115A1 (en) | 2014-12-05 | 2016-06-09 | Thyssenkrupp Marine Systems Gmbh | System and a method for locating and combatting threats, in particular in asymmetric threat situations |
FR3030775A1 (en) * | 2014-12-19 | 2016-06-24 | Sagem Defense Securite | OPTRONIC-ACOUSTIC FUSION PROCESS AND DEVICE THEREOF |
US10656650B2 (en) * | 2015-01-09 | 2020-05-19 | Korean Air Lines Co., Ltd. | Method for guiding and controlling drone using information for controlling camera of drone |
US10459069B2 (en) * | 2015-03-12 | 2019-10-29 | Safran Electronics & Defense Sas | Airborne equipment for detecting shootings and assisting piloting |
US20180017662A1 (en) * | 2015-03-12 | 2018-01-18 | Safran Electronics & Defense Sas | Airborne equipment for detecting shootings and assisting piloting |
US9812020B2 (en) * | 2015-08-13 | 2017-11-07 | Hon Hai Precision Industry Co., Ltd. | Electronic device and unmanned aerial vehicle control method |
US20180372451A1 (en) * | 2015-12-16 | 2018-12-27 | Hanwha Land Systems Co., Ltd. | Gunnery control system and gunnery control method using the same |
US10663258B2 (en) * | 2015-12-16 | 2020-05-26 | Hanwha Defense Co., Ltd. | Gunnery control system and gunnery control method using the same |
US20190003803A1 (en) * | 2016-02-03 | 2019-01-03 | Vk Integrated Systems | Firearm electronic system |
US10578403B2 (en) * | 2016-02-03 | 2020-03-03 | VK Integrated Systems, Inc. | Firearm electronic system |
US10890415B2 (en) * | 2016-02-03 | 2021-01-12 | VK Integrated Systems, Inc. | Firearm electronic system |
WO2018013051A1 (en) * | 2016-07-12 | 2018-01-18 | St Electronics (Training & Simulation Systems) Pte. Ltd. | Intelligent tactical engagement trainer |
CN108965789A (en) * | 2017-05-17 | 2018-12-07 | 杭州海康威视数字技术股份有限公司 | A kind of unmanned plane monitoring method and audio/video linkage device |
EP3416408B2 (en) † | 2017-06-13 | 2023-08-09 | Krauss-Maffei Wegmann GmbH & Co. KG | Vehicle with an interior and method for sound transmission into a vehicle interior of a vehicle |
EP3416408B1 (en) | 2017-06-13 | 2020-12-02 | Krauss-Maffei Wegmann GmbH & Co. KG | Vehicle with an interior and method for sound transmission into a vehicle interior of a vehicle |
CN109040654A (en) * | 2018-08-21 | 2018-12-18 | 苏州科达科技股份有限公司 | Recognition methods, device and the storage medium of external capture apparatus |
US11221194B2 (en) * | 2018-10-18 | 2022-01-11 | Bae Systems Information And Electronic Systems Integration Inc. | IMUless flight control system |
WO2020142126A3 (en) * | 2018-10-18 | 2020-08-13 | Bae Systems Information And Electronic Systems Integration Inc. | Imuless flight control system |
US10969484B2 (en) * | 2019-01-18 | 2021-04-06 | United Arab Emirates University | Bullet detection system |
US10997721B2 (en) * | 2019-05-06 | 2021-05-04 | Beth Allison Lopez | Microbe scanning device and methods thereof |
US11927688B2 (en) | 2019-05-18 | 2024-03-12 | Battelle Memorial Institute | Firearm discharge location systems and methods |
DE102019115529A1 (en) * | 2019-06-07 | 2020-12-10 | Rheinmetall Electronics Gmbh | Vehicle with microphone arrangement |
US12141113B2 (en) | 2020-09-17 | 2024-11-12 | James Matthew Underwood | Electronic threat assessment system |
CN113650037A (en) * | 2021-09-29 | 2021-11-16 | 绵阳久强智能装备有限公司 | Photoelectric antagonistic anti-sniper robot and control method |
CN113776388A (en) * | 2021-09-29 | 2021-12-10 | 中国兵器装备集团自动化研究所有限公司 | Method for suppressing follow-up shooting of moving target of weapon |
US12105216B2 (en) | 2021-12-10 | 2024-10-01 | Battelle Memorial Institute | Waveform emission location determination systems and associated methods |
CN114268744A (en) * | 2021-12-16 | 2022-04-01 | 上海研鼎信息技术有限公司 | Camera flicker test system and test method |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20130192451A1 (en) | Anti-sniper targeting and detection system | |
US9488442B2 (en) | Anti-sniper targeting and detection system | |
US11874092B2 (en) | Target analysis and recommendation | |
US7870816B1 (en) | Continuous alignment system for fire control | |
US8833231B1 (en) | Unmanned range-programmable airburst weapon system for automated tracking and prosecution of close-in targets | |
US6621764B1 (en) | Weapon location by acoustic-optic sensor fusion | |
US6215731B1 (en) | Acousto-optic weapon location system and method | |
EP2956733B1 (en) | Firearm aiming system with range finder, and method of acquiring a target | |
US7210392B2 (en) | Autonomous weapon system | |
US6995660B2 (en) | Commander's decision aid for combat ground vehicle integrated defensive aid suites | |
US5822713A (en) | Guided fire control system | |
US10048039B1 (en) | Sighting and launching system configured with smart munitions | |
US20120274922A1 (en) | Lidar methods and apparatus | |
US20090260511A1 (en) | Target acquisition and tracking system | |
US20060283317A1 (en) | Missile protection system for vehicles | |
US20130099096A1 (en) | Flash detection and laser response system | |
US12000674B1 (en) | Handheld integrated targeting system (HITS) | |
KR20230105162A (en) | Unmanned Combat Vehicle and Tarket Detection Method thereof | |
RU2241193C2 (en) | Antiaircraft guided missile system | |
RU2292005C1 (en) | Installation for fire at high-speed low-altitude targets | |
RU25077U1 (en) | MOBILE ANTI-AIR DEFENSE MISSILE COMPLEX | |
Scanlon et al. | Sensor and information fusion for enhanced detection, classification, and localization | |
Young et al. | Acoustic sensors on small robots for the urban environment | |
AU2024200238A1 (en) | All Seeing Eyes Housing System | |
Ke-Rong et al. | Design of an unattended laser-target designator based on multi-sensor |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: REAL TIME COMPANIES, ARIZONA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:VARGA, KENNETH;HIETT, JOHN;SCOTT, STEVEN;REEL/FRAME:027950/0165 Effective date: 20120227 |
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |