US9612326B2 - Methods and apparatus for detection system having fusion of radar and audio data - Google Patents


Info

Publication number
US9612326B2
US9612326B2 (application US14/068,318)
Authority
US
United States
Prior art keywords
weapon
detected
firing event
time
projectile
Prior art date
Legal status
Active, expires
Application number
US14/068,318
Other versions
US20160223662A1
Inventor
Richard S. Herbel
James W. Rakeman
Current Assignee
Raytheon Command and Control Solutions LLC
Original Assignee
Raytheon Command and Control Solutions LLC
Priority date
Filing date
Publication date
Application filed by Raytheon Command and Control Solutions LLC filed Critical Raytheon Command and Control Solutions LLC
Priority to US14/068,318
Assigned to THALES RAYTHEON SYSTEMS COMPANY, LLC. Assignors: RAKEMAN, JAMES W.; HERBEL, RICHARD S.
Assigned to RAYTHEON COMMAND AND CONTROL SOLUTIONS LLC (change of name from THALES-RAYTHEON SYSTEMS COMPANY LLC)
Publication of US20160223662A1
Application granted
Publication of US9612326B2
Legal status: Active; adjusted expiration

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00: Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/86: Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
    • F: MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F41: WEAPONS
    • F41G: WEAPON SIGHTS; AIMING
    • F41G3/00: Aiming or laying means
    • F41G3/14: Indirect aiming means
    • F41G3/147: Indirect aiming means based on detection of a firing weapon
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00: Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/66: Radar-tracking systems; Analogous systems
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S3/00: Direction-finders for determining the direction from which infrasonic, sonic, ultrasonic, or electromagnetic waves, or particle emission, not having a directional significance, are being received
    • G01S3/80: Direction-finders for determining the direction from which infrasonic, sonic, ultrasonic, or electromagnetic waves, or particle emission, not having a directional significance, are being received using ultrasonic, sonic or infrasonic waves
    • G01S3/802: Systems for determining direction or deviation from predetermined direction

Definitions

  • Radar systems transmit electromagnetic radiation and analyze reflected echoes of returned radiation to determine information about the presence, position, and motion of objects within a scanned area.
  • Conventional weapon locating systems include a radar system that can detect and track projectiles, such as artillery projectiles, to determine the location of the fired weapon. This determination can be based on an extrapolation of estimated state vectors, derived by radar tracking of a ballistic target, to a point of intersection. Identified coordinates associated with the intersection point approximate the location of the weapon that launched the projectile.
  • Exemplary embodiments of the present invention provide methods and apparatus to detect the firing location of weapons that fire projectiles with low angle trajectories.
  • audio sensor information is fused with radar data to enhance the ability of the system to determine the location of a projectile firing system, such as a rocket launcher. While exemplary embodiments of the invention are shown and described in conjunction with certain components, configurations, weapons, and the like, it is understood that alternative embodiments within the scope of the present invention will be apparent to one of ordinary skill in the art.
  • a method of locating a weapon comprises: detecting a weapon firing event with an audio sensor system, the detected weapon firing event indicative of a detected firing of the weapon and indicative of a detected time of the weapon firing event; detecting a projectile fired from the weapon with a radar system; calculating a state vector associated with the projectile detection; identifying a location of the weapon by backtracking the state vector to the detected time of the weapon firing event; and communicating the location of the weapon.
  • the method can further include one or more of the following features: generating a common time base for the weapon firing event and for the projectile detection; the audio sensor system comprises an audio sensor; the audio sensor system is directional and provides at least one of an azimuth angle of the detected weapon firing event or an elevation angle of the detected weapon firing event; the step of detecting the weapon firing event with the audio sensor system comprises detecting the weapon firing event by direct path detection of audio generated by the weapon firing event; correlating the weapon firing event detected by the audio sensor system with the detection of the projectile by the radar system to determine if the weapon firing event detected by the audio sensor system corresponds to the same projectile as that detected by the radar system, wherein the correlating comprises: selecting a time difference threshold, and relating the time difference threshold to a difference between the detected time of the weapon firing event detected by the audio sensor system and a time of the detection of the projectile by the radar system; and the correlating comprises: selecting a time difference threshold, and relating the time difference threshold to a difference between a time predicted by the state vector when backtracked to a terrain and the detected time of the weapon firing event detected by the audio sensor system.
  • a weapon locating system comprises: an audio sensor system configured to detect a weapon firing event, the detected weapon firing event indicative of a detected firing of the weapon and indicative of a detected time of the weapon firing event, a radar system configured to detect a projectile fired from the weapon, a processor configured to calculate a state vector associated with the projectile detection and to backtrack the state vector to the detected time of the weapon firing event to identify the location of the weapon, and a communication system configured to communicate the location of the weapon.
  • the system can further include one or more of the following features: the processor is further configured to correlate the weapon firing event detected by the audio sensor system with the detection of the projectile by the radar system to determine if the weapon firing event detected by the audio sensor system corresponds to the same projectile as that detected by the radar system; the processor is further configured to select a time difference threshold, and relate the time difference threshold to a difference between the detected time of the weapon firing event detected by the audio sensor system and a time of the detection of the projectile by the radar system; the processor is further configured to select a time difference threshold, and relate the time difference threshold to a difference between a time predicted by the state vector when backtracked to a terrain and the detected time of the weapon firing event detected by the audio sensor system; and the processor is further configured to select a position difference threshold, and relate the position difference threshold to a difference between a location predicted by the state vector when backtracked to a terrain and a location predicted by the state vector when backtracked to the detected time of the weapon firing event detected by the audio sensor system.
  • an article comprises: at least one computer-readable medium containing non-transitory stored instructions that enable a machine to perform: detecting a weapon firing event with an audio sensor system, the detected weapon firing event indicative of a detected firing of the weapon and indicative of a detected time of the weapon firing event; detecting a projectile fired from the weapon with a radar system; calculating a state vector associated with the projectile detection; identifying a location of the weapon by backtracking the state vector to the detected time of the weapon firing event; and communicating the location of the weapon.
  • the article can further include instructions for correlating the weapon firing event detected by the audio sensor system with the detection of the projectile by the radar system to determine if the weapon firing event detected by the audio sensor system corresponds to the same projectile as that detected by the radar system.
  • FIG. 1 is a schematic diagram illustrating a weapon locating system according to an embodiment of the present invention
  • FIG. 2 is schematic diagram of a weapon locating system according to an embodiment of the present invention
  • FIG. 3 is a diagram illustrating the trajectory of a projectile fired from a weapon
  • FIG. 4 is a flowchart illustrating a method of locating a weapon using an audio augmented weapon locating system having a radar system in combination with an audio sensor system according to an embodiment of the present invention
  • FIG. 5 is a graph of down range error for audio-augmented and non-augmented conventional weapon locating systems at various quadrant elevations.
  • FIG. 6 is a schematic representation of an exemplary weapon locating system combining optical and/or audio sensor data with a radar system
  • FIG. 7 is a schematic representation of an exemplary computer that can perform at least a portion of the processing described herein.
  • FIG. 1 shows an exemplary weapon locating system 100 including a radar system 102 and an audio sensor system 104 that communicate signals to a signal processing system 106 .
  • the signal processing system 106 can transmit information derived from the received signals to a response system 108 .
  • Quadrant elevation is a commonly used artillery term for the angle between the bore axis of the gun or launcher and the local horizontal.
  • data generated by the combination of systems, i.e., an audio augmented weapon locating system, is used to locate the firing weapon.
  • the radar system 102 can be capable of detecting and tracking one or more projectiles fired from a weapon.
  • the radar system 102 can be a phased array radar system, also known as an electronically scanned array (“ESA”), which is a type of radar system that uses multiple antennas to transmit and/or receive radiofrequency (RF) signals at shifted relative phases.
  • the phase shifting thus allows the transmitted and/or received RF energy to be transmitted and/or received as transmit and/or received beams that can be electronically “steered” without the need to physically move components of the radar system.
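As a concrete illustration of this electronic steering (not from the patent; the uniform linear array geometry and function name are assumptions for this sketch), the per-element phase increment needed to steer a beam off boresight can be computed as:

```python
import math

def steering_phases(num_elements, spacing_m, wavelength_m, steer_angle_deg):
    """Per-element phase shifts (radians) steering a uniform linear
    array's beam to steer_angle_deg off boresight."""
    theta = math.radians(steer_angle_deg)
    # Path-length difference between adjacent elements is d*sin(theta);
    # the matching phase increment is 2*pi*d*sin(theta)/lambda.
    dphi = 2 * math.pi * spacing_m * math.sin(theta) / wavelength_m
    return [n * dphi for n in range(num_elements)]
```

Applying these phases to the element feeds shifts the main beam to the commanded angle with no mechanical motion.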
  • phased array radar systems used in conventional non-augmented weapon locating systems include the AN/TPQ-36 and the AN/TPQ-37 Firefinder Weapon Locating Systems manufactured by Raytheon Company of Waltham, Mass.
  • the phased array can be comprised of transmit and/or receive elements disposed within a common assembly. In some other embodiments, the phased array can be comprised of transmit and/or receive antennas that are spatially separated and not disposed within a common assembly.
  • phased array radar systems can be an effective choice for the radar system 102 , other types of radar systems may also be suitable.
  • the radar system 102 can be stationary or mounted on a mobile platform.
  • the radar system 102 includes an antenna system 110 , one or more transmitters 112 , and one or more receivers 114 .
  • the transmit and receive functions can be provided by a combined transmit/receive module. While not shown for clarity, it will be understood that the radar system 102 can also include various components such as controllers, duplexers, oscillators, mixers, amplifiers, synchronizers, modulators, antenna positioning systems, power supply systems, data storage devices, and signal pre-processing equipment.
  • the audio sensor system 104 can comprise any type of sensor system capable of detecting and processing sound information.
  • the audio sensor system 104 can include one or more transducers 116 sensitive to sound information for a given frequency range. It is understood that a series of transducers optimized for particular frequency ranges can be used to cover a desired aggregate frequency range.
  • a signal processing module 118 can process the sensor information, as described more fully below.
  • the sensors may be located on the radar or the sensors may be located to form a network of sensors for determining the azimuth and elevation of a projectile solution. It is understood that the temperature and humidity of the environment can be used to calculate the speed of sound.
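The temperature dependence of the speed of sound noted above can be approximated with a standard formula; this sketch is illustrative and neglects the small humidity correction:

```python
import math

def speed_of_sound(temp_c):
    """Approximate speed of sound in dry air (m/s) from temperature in
    Celsius. Humidity raises the value slightly (well under 1% in
    typical conditions) and is neglected in this sketch."""
    return 331.3 * math.sqrt(1.0 + temp_c / 273.15)
```

At 20 degrees C this gives roughly 343 m/s, the figure commonly used for ranging calculations.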
  • the system can process audio profiles of different types of devices to determine the type of device fired. Such profiles can be stored in a database, for example.
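A minimal sketch of such a profile lookup follows; the database contents and duration ranges below are purely hypothetical, not values from the patent, and a fielded system would store richer spectral templates:

```python
# Hypothetical profile database keyed by weapon type; the duration
# ranges (seconds of sustained sound) are illustrative placeholders.
PROFILES = {
    "gunshot": (0.001, 0.05),
    "mortar":  (0.05, 0.5),
    "rocket":  (0.5, 5.0),
}

def classify_event(duration_s, profiles=PROFILES):
    """Return the weapon types whose stored duration range contains
    the measured sound duration (empty list if none match)."""
    return [name for name, (lo, hi) in profiles.items()
            if lo <= duration_s <= hi]
```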
  • the term “state vector” is used to describe a collection of parameters (i.e., one or more parameters) that correspond to a set of characteristics of a moving projectile.
  • the one or more state parameters within a state vector can include, but are not limited to, a position (in a coordinate system), a time, a speed, a heading (or three dimensional velocity vector), and acceleration in one or more dimensions, of the moving projectile.
  • the term “backtracking” is used to describe a process by which one or more state vectors, each describing one or more parameters associated with a projectile at a respective one or more positions along a trajectory, can be extrapolated backward in time and space to identify a state vector associated with the projectile at an earlier point along the trajectory.
  • the state vector at the earlier time and space can include both an earlier time and a location of the projectile at the earlier time.
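A minimal constant-acceleration sketch of this backtracking (the state-vector layout and function name are assumptions; a fielded tracker would integrate a full ballistic model with drag):

```python
def backtrack(state, t_earlier):
    """Extrapolate a state vector {t, pos, vel, acc} backward to
    t_earlier using constant-acceleration kinematics, returning the
    state vector at the earlier time."""
    t, p, v, a = state["t"], state["pos"], state["vel"], state["acc"]
    dt = t_earlier - t  # negative when stepping back in time
    pos = tuple(pi + vi * dt + 0.5 * ai * dt * dt
                for pi, vi, ai in zip(p, v, a))
    vel = tuple(vi + ai * dt for vi, ai in zip(v, a))
    return {"t": t_earlier, "pos": pos, "vel": vel, "acc": a}
```

As stated above, the backtracked state carries both the earlier time and the projectile's predicted location at that time.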
  • the term “terrain” is used to describe topographical characteristics of the earth's surface. The terrain can be represented by numerical values.
  • electromagnetic radiation is classified by wavelength into radio, microwave, infrared, visible, ultraviolet, X-rays, and gamma rays, in order of decreasing wavelength.
  • the term “light” is used to describe at least electromagnetic radiation having a wavelength in the infrared, visible, or ultraviolet portions of the electromagnetic spectrum.
  • the term “optical” is used herein to describe a system or component (e.g., a sensor) that interacts with or that processes the infrared, visible, or ultraviolet portions of the electromagnetic spectrum.
  • “sound” is used to describe vibrations that travel through the air or another medium.
  • FIG. 2 shows an exemplary system 200 having at least one radar system 202 a -N and at least one audio sensor system 204 a -M.
  • the radar systems 202 and audio sensor systems 204 are coupled to a signal processor 206 that can fuse the radar and audio sensor information to locate a weapon system 20 firing a projectile 22 .
  • the signal processor 206 can generate an output signal indicative of one or more of a time of a detected firing event, an azimuth bearing of the detected firing event, or an elevation angle of the detected firing event.
  • in some embodiments, the audio sensor systems 204 are located proximate or attached to a respective radar system. In an alternative embodiment, the audio sensors are located at locations separated from the radar systems. In one embodiment, audio sensor systems 204 can be launched and/or dropped in an area suspected to contain a weapon system. The audio sensors 204 can include a GPS receiver to determine their locations and report this and other information to the signal processor 206 or another entity. In other embodiments, audio sensors are remote from other components of the audio sensor systems.
  • the signal processing system 206 can include one or more computer processors, a data storage system, an output interface, a network interface, and software for processing the signals received from the audio sensor system(s) 204 and the radar system(s) 202 . Other hardware, firmware, and software can also be incorporated into the signal processing system 206 .
  • the signal processing system 206 can also include a communication system for transmitting, via either wired or wireless connection, data to a response system. The communicated data may include a set of fired weapon location coordinates.
  • the response system 208 can include, for example, a counter fire weapon system capable of returning fire to the location of the fired weapon, a friendly fire detection system capable of determining the location of allied forces, or a threat assessment system for use by peace-keeping or law enforcement agencies to determine a location for follow-up investigation or patrol.
  • the weapon locating system 200 can be used to determine a location from which a weapon is fired.
  • the weapon may fire any type of projectiles including shells, shot, missiles, or rockets.
  • when a weapon fires a projectile from a location 330, the projectile follows a trajectory 332. While the trajectory is shown as a straight line, it will be understood that the trajectory need not be a straight line.
  • a quadrant elevation is the angle between an axis 336 lying in a horizontal plane and the axis of the bore of the weapon fired from the location 330.
  • a firing azimuth is an angle formed between an axis 331, extending between the firing location 330 and the radar system 102 (of the weapon locating system 100), and the axis 336.
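For illustration, the quadrant elevation can be recovered from an estimated launch-velocity vector, taking the bore axis parallel to the initial velocity (an assumption made for this sketch only):

```python
import math

def quadrant_elevation_deg(vel):
    """QE from a launch-velocity vector (vx, vy, vz): the angle between
    the bore axis (taken parallel to the initial velocity) and the
    local horizontal plane, in degrees."""
    vx, vy, vz = vel
    horiz = math.hypot(vx, vy)
    return math.degrees(math.atan2(vz, horiz))
```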
  • Acquisition, i.e. detection and tracking, of the fired projectile by the radar system 102 can occur at a location 338 along the trajectory 332 .
  • the radar system 102 can generate a state vector that describes one or more characteristics of the projectile and of the trajectory of the projectile.
  • the radar system 102 can make other detections at other points along the trajectory and can form other associated state vectors.
  • the radar system 102 and signal processing system 106 can backtrack the resulting one or more state vectors to identify a state vector that intersects the terrain.
  • the intersection can identify the location 330 of the weapon that fired the projectile.
  • the identification of the location 330 is not precise.
  • An error associated with the radar system 102 (without use of the audio sensor system 104, and neglecting radar ranging errors, which are generally small compared to angular errors) can be characterized as an “error ellipse” 340 lying along the line-of-fire, having a down range error component σDOWN 342 and a cross range error component σCROSS 344.
  • the error components 342, 344 of the error ellipse 340 can be calculated from the radar's angular measurement errors and the geometry of the projectile's trajectory.
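The patent's exact error expressions are not reproduced in this extract. As a hedged illustration of why the down range error grows at low quadrant elevation (the behavior shown in FIG. 5), one simple model maps a radar angular error at the acquisition range into a terrain-intersection error that scales roughly as 1/sin(QE); this model and its function name are assumptions, not the patent's formulas:

```python
import math

def downrange_error(range_m, angle_sigma_rad, qe_deg):
    """Illustrative model only: an angular error angle_sigma_rad at
    range range_m maps to a terrain-intersection error that grows as
    the trajectory flattens, roughly by a factor 1/sin(QE)."""
    return range_m * angle_sigma_rad / math.sin(math.radians(qe_deg))
```

The model reproduces the qualitative trend: the flatter the trajectory, the larger the down range error.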
  • FIG. 4 shows an exemplary sequence of steps 450 for determining the location of a weapon firing system by combining radar and audio sensor information in accordance with exemplary embodiments of the invention.
  • conventional backtracking of a state vector associated with a sequence of radar measurements extrapolates the state vector backward in time and space until the backtracked state vector intersects the terrain.
  • the intersection in space can provide, within the intersecting state vector, a prediction in space of a location of a weapon, and also a prediction in time of when the weapon was fired.
  • techniques described below instead backtrack the state vector until the backtracked state vector reaches the firing time identified by the above-described audio sensor system 104.
  • This intersection in time can also provide, within the intersecting state vector, a prediction in space of a location of the weapon and also a prediction in time (known by the audio sensor detection in time) of when the weapon was fired.
  • both the conventional non-augmented weapon locating system and the audio-augmented weapon locating system can provide both a prediction of a location of a weapon and either a prediction of, or knowledge of, a time of firing of the weapon.
  • a weapon firing event occurs, resulting in sound that propagates both directly and indirectly from the weapon to the audio sensor system, for example, to the audio sensor system 104 of FIG. 1 .
  • the sound tends to have a time duration in accordance with the type of weapon fired. For example, if the sound occurs due to a gun firing event, the sound can have a duration on the order of milliseconds.
  • the sound can result from an explosive event, e.g., a gunshot, or a controlled firing event, e.g., a rocket, etc.
  • the beginning of the sound is indicative of a weapon firing event; however, for some types of weapon firing events, it may be desirable to mark the time of the weapon firing event as being somewhat later in time, for example, if the sound is generated by a prolonged rocket blast and the projectile is a rocket that accelerates relatively slowly.
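A simple onset-marking sketch consistent with the description above (threshold crossing plus an optional weapon-dependent delay; the names and the amplitude-thresholding scheme are assumptions):

```python
def firing_time(samples, sample_rate_hz, threshold, offset_s=0.0):
    """Estimate the weapon-firing time as the first sample whose
    amplitude exceeds threshold, plus an optional weapon-dependent
    offset (e.g., for a slowly accelerating rocket).
    Returns None if the threshold is never crossed."""
    for i, s in enumerate(samples):
        if abs(s) >= threshold:
            return i / sample_rate_hz + offset_s
    return None
```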
  • the audio sensor system 104 may use the time duration of the audio event to classify the event as to the sound's source.
  • the time history may be used to discriminate out non-firings or to classify the firing event as to type (e.g., rocket, mortar, artillery, etc.).
  • the radar system can use the classification data to improve ballistic estimator performance through better modeling of the projectile, to adjust the projectile's firing time to reflect the time the projectile actually began to leave the launch platform (for example, rockets may require time to build thrust), and to eliminate possible false radar detections.
  • an audio detection of a weapon firing event occurs when the audio sensor system 104 detects the sound associated with the weapon firing event 452.
  • the time of the detection event 454 can be stored, for example in a memory device associated with the audio sensor system 104 or with the signal processing system 106 .
  • the audio sensor system 104 is directional, in which case, a bearing (i.e., direction) associated with the detection event can also be stored.
  • the audio sensor system 104 may have directional measurement capability in both bearing and elevation.
  • a means of aligning the audio system with the radar is provided. For example, a microphone array can perform beamforming to determine event direction.
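As a minimal stand-in for full array beamforming, a two-microphone time-difference-of-arrival estimate gives a far-field bearing; this is an illustrative sketch with a left/right ambiguity that a real multi-element array would resolve:

```python
import math

def tdoa_azimuth_deg(delay_s, mic_spacing_m, sound_speed_mps=343.0):
    """Far-field bearing from the time-difference-of-arrival between
    two microphones: sin(theta) = c * delay / d."""
    s = sound_speed_mps * delay_s / mic_spacing_m
    s = max(-1.0, min(1.0, s))  # clamp numerical overshoot
    return math.degrees(math.asin(s))
```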
  • a common time base for the radar system 102 and for the audio sensor system 104 is established, for example using a global positioning system (GPS), inter-range instrumentation group (IRIG) time codes, or still another type of common clock.
  • the common clock can be absolute or relative. This synchronization of the times for the systems 102 , 104 will generally occur before the weapon firing event 452 and can be scheduled to occur with regularity so that the systems stay synchronized.
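Conceptually, the synchronization amounts to mapping each sensor's local timestamps onto the shared time base; a trivial sketch (function names and the clock-offset representation are assumptions):

```python
def to_common_time(local_t_s, clock_offset_s):
    """Map a sensor's local timestamp onto the shared time base (e.g.,
    GPS or IRIG) given its measured clock offset. Offsets would be
    re-measured periodically so the systems stay synchronized."""
    return local_t_s - clock_offset_s

def times_agree(radar_t_s, audio_t_s, tolerance_s):
    """True when two common-base timestamps fall within tolerance."""
    return abs(radar_t_s - audio_t_s) <= tolerance_s
```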
  • a radar acquisition event, i.e., a radar detection of a projectile, occurs.
  • state information such as altitude, speed, direction, acceleration, and time, associated with the projectile at the location 338 ( FIG. 3 ) along the trajectory 332 ( FIG. 3 ) is collected.
  • the radar acquisition event, i.e., the radar detection, can occur after the weapon firing event detected by the audio sensor system 104 at block 454.
  • the radar system 102 forms a state vector using the state information associated with the projectile at location 338 .
  • a correlation between the audio detection event 454 and the radar acquisition event 460 is made, if possible, using the stored event time and, in some embodiments, bearing data.
  • the correlation can be made in a number of ways. For example, in some embodiments, the correlation is made by comparing the time of the audio detection event with a time of the radar detection.
  • a time difference threshold can be established based upon the environment (i.e., the application) in which the system is used, and any time difference less than the time difference threshold can be indicative of a correlation. For example, if the system is used to detect locations of missile firing events, the time difference threshold can be relatively large, for example, 5 seconds. For another example, if the system is used to detect locations of close range gun firing events, the time difference threshold can be relatively small, for example, 0.1 seconds. Other time difference thresholds are possible.
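The time-gating test described above can be sketched as follows; the absolute difference is used because sound propagation delay can place the audio detection either before or after the radar acquisition (an assumption of this sketch):

```python
def correlate_by_time(audio_t_s, radar_t_s, threshold_s):
    """Declare an audio firing event and a radar projectile detection
    correlated when their common-time-base timestamps differ by less
    than an application-specific threshold (e.g., ~5 s for missile
    events, ~0.1 s for close-range gunfire)."""
    return abs(radar_t_s - audio_t_s) <= threshold_s
```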
  • backtracking the state vector in steps below provides both a weapon firing location estimate and a firing time estimate.
  • correlation can be established by comparing the time of the weapon firing event detected by the audio sensor system 104 ( FIG. 1 ) with a time of the radar detection of the projectile when the state vector of the projectile is conventionally backtracked to intersect the terrain.
  • as above, a time difference threshold can be established based upon the environment (i.e., the application) in which the system is used, and any time difference less than the time difference threshold can be indicative of a correlation.
  • correlation can be established by comparing results of the conventional backtracking of the radar state vector to intersect the terrain with the backtracking described herein and below that backtracks the radar state vector to a point in time (and resulting space) established by the audio sensor system 104 .
  • both methods generate a prediction of a position from which the weapon was fired.
  • a position difference threshold can be established based upon the environment (i.e., the application) in which the system is used, and any position difference less than the position difference threshold can be indicative of a correlation. Other position difference thresholds are possible.
  • correlation can be established by comparing the azimuth bearing and/or the elevation angle reported by the audio sensor system 104 with the azimuth bearing and/or the elevation angle reported by the radar system 102.
  • An azimuth angle difference threshold and/or an elevation angle difference threshold can be established based upon the environment (i.e., the application) in which the system is used, and any azimuth angle difference and/or elevation angle difference less than the respective threshold can be indicative of a correlation.
  • the azimuth angle difference threshold and/or the elevation angle difference threshold can be relatively large, for example, both 10.0 degrees.
  • the azimuth angle difference threshold and/or the elevation angle difference threshold can be relatively small, for example, both 1.0 degrees.
  • Other angle difference thresholds are possible.
  • any one or more of the above-described techniques can be used to identify a correlation between radar detected events and firing events detected by the audio sensor system 104. Some correlations can be deemed to be primary and others can be deemed to be secondary, in any combination. Other correlation techniques are also possible, including techniques that make use of the directional capability of some audio sensor systems.
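The combination of gating tests can be sketched generically; the field names and the all-gates-must-pass policy below are assumptions (the text above also allows primary/secondary weightings):

```python
def events_correlated(audio_evt, radar_evt, gates):
    """Combine gating tests: every gate whose quantity is present in
    both events must pass. Events are dicts that may carry 't'
    (seconds), 'az' and 'el' (degrees); gates maps those keys to
    difference thresholds."""
    for key, limit in gates.items():
        if key in audio_evt and key in radar_evt:
            if abs(audio_evt[key] - radar_evt[key]) > limit:
                return False
    return True
```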
  • a more accurate weapon firing location can be determined by the audio augmented weapon locating system 100 ( FIG. 1 ) by backtracking the state vector, not to an intersection with the terrain, but instead to an intersection in time with the weapon firing event time identified by the audio sensor system 104 .
  • Extrapolation and/or interpolation techniques can be used to perform the backtracking. Geographic coordinates associated with the backtracked state vector at the weapon firing event time identified by the audio sensor system 104 are calculated to establish a likely location of the weapon that fired the projectile. This calculated location of the weapon is more accurate than that described above using the radar system 102 alone.
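The core audio-augmented step, backtracking the radar state vector to the audio-detected firing time rather than to a terrain intersection, can be sketched with constant-velocity extrapolation (an illustrative simplification; as noted above, real systems would use extrapolation and/or interpolation against a ballistic model):

```python
def weapon_location(state, audio_firing_t_s):
    """Backtrack a radar state vector {t, pos, vel} to the firing time
    reported by the audio sensor system and read off the predicted
    position of the weapon."""
    dt = audio_firing_t_s - state["t"]  # negative: stepping back in time
    return tuple(p + v * dt for p, v in zip(state["pos"], state["vel"]))
```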
  • the coordinates associated with the likely location of the fired weapon are communicated to the response system 108 ( FIG. 1 ).
  • the response system 108 can direct a counter fire weapon capable of returning fire to the location of the fired weapon.
  • the response system 108 can associate the location of the fired weapon with friendly fire from allied forces for purposes of mapping the location of allied forces.
  • the response system 108 can map the location of the fired weapon for use by peace-keeping or law enforcement agencies to determine a geographic area for follow-up investigation or patrol.
  • a graph 580 compares down range error at various quadrant elevations for one example of a non-augmented conventional weapon locating system having only a radar system and one example of an audio augmented weapon locating system having both a radar system and an audio sensor system.
  • Alternative embodiments of audio-augmented and non-augmented conventional weapon locating systems may result in different error responses.
  • the plotted data set 582 shows a relationship between the QE of a firing weapon and the down range error component ( ⁇ DOWN ) associated with the use of a non-augmented conventional weapon locating system having the radar system 102 but not the audio sensor system 104 .
  • the graph 580 shows the general relationship between QE and down range error associated with the non-augmented conventional weapon locating system. As the QE of the firing weapon approaches 0, the error increases exponentially, rendering the non-augmented conventional weapon locating system relatively ineffectual in determining the location of the firing weapon.
  • a plotted data set 584 shows a relationship between the QE of the fired weapon and the down range error component associated with the use of an audio augmented weapon locating system, e.g., 100 of FIG. 1 , having both a radar system and an audio sensor system.
  • the down range comparison graph 580 shows that backtracking the state vector of a projectile, as formed by a non-augmented conventional weapon locating system, to a time associated with a weapon firing event, as detected by an audio sensor system in an audio-augmented weapon locating system, results in improved weapon locating for low QE values of the firing weapon. It is understood that the results presented in FIG. 5 are just one example of audio-augmented and non-augmented conventional weapon locating system down range error values. For alternative embodiments, using different radar and audio sensor systems, the results may vary.
  • FIG. 6 shows an exemplary weapon locating system 600 including a radar system 602 and an optical sensor system 604 , both of which communicate signals to a signal processing system 606 .
  • the optical sensor system 604 and/or an audio sensor system 616 can provide information to the signal processing system 606 .
  • the signal processing system 606 can transmit information derived from the received signals to a response system 608 .
  • the optical sensor system 604 is also referred to as an electro-optical (EO) system herein.
  • the optical sensor system 604 can be any type of optical sensor system capable of detecting and processing light.
  • the optical sensor system 604 can comprise an electro optical (EO) sensor system.
  • FIG. 7 shows an exemplary computer 700 that can perform at least part of the processing described herein.
  • the computer 700 includes a processor 702 , a volatile memory 704 , a non-volatile memory 706 (e.g., a hard disk), an output device 707 and a graphical user interface (GUI) 708 (e.g., a mouse, a keyboard, and a display).
  • the non-volatile memory 706 stores computer instructions 712 , an operating system 716 and data 718 .
  • the computer instructions 712 are executed by the processor 702 out of volatile memory 704 .
  • an article 720 comprises non-transitory computer-readable instructions.
  • Processing may be implemented in hardware, software, or a combination of the two. Processing may be implemented in computer programs executed on programmable computers/machines that each include a processor, a storage medium or other article of manufacture that is readable by the processor (including volatile and non-volatile memory and/or storage elements), at least one input device, and one or more output devices. Program code may be applied to data entered using an input device to perform processing and to generate output information.
  • the system can perform processing, at least in part, via a computer program product, (e.g., in a machine-readable storage device), for execution by, or to control the operation of data processing apparatus (e.g., a programmable processor, a computer, or multiple computers).
  • Each such program may be implemented in a high level procedural or object-oriented programming language to communicate with a computer system.
  • the programs may be implemented in assembly or machine language.
  • the language may be a compiled or an interpreted language and it may be deployed in any form, including as a stand-alone program or as a module, component, subroutine, or other unit suitable for use in a computing environment.
  • a computer program may be deployed to be executed on one computer or on multiple computers at one site or distributed across multiple sites and interconnected by a communication network.
  • a computer program may be stored on a storage medium or device (e.g., CD-ROM, hard disk, or magnetic diskette) that is readable by a general or special purpose programmable computer for configuring and operating the computer when the storage medium or device is read by the computer.
  • Processing may also be implemented as a machine-readable storage medium, configured with a computer program, where upon execution, instructions in the computer program cause the computer to operate.
  • Processing may be performed by one or more programmable processors executing one or more computer programs to perform the functions of the system. All or part of the system may be implemented as special purpose logic circuitry (e.g., an FPGA (field programmable gate array) and/or an ASIC (application-specific integrated circuit)).

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Engineering & Computer Science (AREA)
  • Radar Systems Or Details Thereof (AREA)

Abstract

Methods and apparatus for locating a weapon by fusing audio and radar data. An exemplary embodiment comprises detecting a weapon firing event with an audio sensor system, detecting a projectile fired from the weapon with a radar system, calculating a state vector associated with the projectile detection, identifying a location of the weapon by backtracking the state vector to the detected time of the weapon firing event, and communicating the location of the weapon.

Description

BACKGROUND
Radar systems transmit electromagnetic radiation and analyze reflected echoes of returned radiation to determine information about the presence, position, and motion of objects within a scanned area. Conventional weapon locating systems include a radar system that can detect and track projectiles, such as artillery projectiles, to determine the location of the fired weapon. This determination can be based on an extrapolation of estimated state vectors, derived by radar tracking of a ballistic target, to a point of intersection. Identified coordinates associated with the intersection point approximate the location of the weapon that launched the projectile.
When the elevation angle of the bore of the fired weapon is small relative to the local earth tangent plane, conventional weapon locating systems are generally unable to accurately determine the location of the weapon. Such low angle trajectories produce exaggerated errors in the state vector estimates. As the angle of elevation approaches zero, the intersection point on the terrain becomes indeterminate. At low angle trajectories, weapon location determination is also limited because projectile detection and tracking by radar systems can be limited by impaired lines of sight, radar multipath echoes, and clutter. In addition, the short track life of near-in fire with low angle trajectories creates difficulties in discriminating false targets. When the location of the firing weapon cannot be accurately determined, the ability to return precision counter fire or launch rockets at the firing weapon is impaired.
SUMMARY
Exemplary embodiments of the present invention provide methods and apparatus to detect the firing location of weapons that fire projectiles with low angle trajectories. In exemplary embodiments, audio sensor information is fused with radar data to enhance the ability of the system to determine the location of a projectile firing system, such as a rocket launcher. While exemplary embodiments of the invention are shown and described in conjunction with certain components, configurations, weapons, and the like, it is understood that alternative embodiments within the scope of the present invention will be apparent to one of ordinary skill in the art.
In one aspect of the invention, a method of locating a weapon comprises: detecting a weapon firing event with an audio sensor system, the detected weapon firing event indicative of a detected firing of the weapon and indicative of a detected time of the weapon firing event, detecting a projectile fired from the weapon with a radar system, calculating a state vector associated with the projectile detection, identifying a location of the weapon by backtracking the state vector to the detected time of the weapon firing event, and communicating the location of the weapon.
The method can further include one or more of the following features: generating a common time base for the weapon firing event and for the projectile detection, the audio sensor system comprises an audio sensor, the audio sensor system is directional and provides at least one of an azimuth angle of the detected weapon firing event or an elevation angle of the detected weapon firing event, the step of detecting the weapon firing event with the audio sensor system comprises detecting the weapon firing event by direct path detection of audio generated by the weapon firing event, correlating the weapon firing event detected by the audio sensor system with the detection of the projectile by the radar system to determine if the weapon firing event detected by the audio sensor system corresponds to the same projectile as that detected by the radar system, wherein the correlating comprises: selecting a time difference threshold, and relating the time difference threshold to a difference between the detected time of the weapon firing event detected by the audio sensor system and a time of the detection of the projectile by the radar system, the correlating comprises: selecting a time difference threshold, and relating the time difference threshold to a difference between a time predicted by the state vector when backtracked to a terrain and the detected time of the weapon firing event detected by the audio sensor system, the correlating comprises: selecting a position difference threshold, and relating the position difference threshold to a difference between a location predicted by the state vector when backtracked to a terrain and a location predicted by the state vector when backtracked to the detected time of the weapon firing event detected by the audio sensor system, the correlating comprises: selecting an angle difference threshold, and/or relating the angle difference threshold to a difference between an angle to the projectile identified by the radar system and an angle to the weapon identified by the audio sensor system.
In another aspect of the invention, a weapon locating system comprises: an audio sensor system configured to detect a weapon firing event, the detected weapon firing event indicative of a detected firing of the weapon and indicative of a detected time of the weapon firing event, a radar system configured to detect a projectile fired from the weapon, a processor configured to calculate a state vector associated with the projectile detection and to backtrack the state vector to the detected time of the weapon firing event to identify the location of the weapon, and a communication system configured to communicate the location of the weapon.
The system can further include one or more of the following features: the processor is further configured to correlate the weapon firing event detected by the audio sensor system with the detection of the projectile by the radar system to determine if the weapon firing event detected by the audio sensor system corresponds to the same projectile as that detected by the radar system, the processor is further configured to select a time difference threshold, and relate the time difference threshold to a difference between the detected time of the weapon firing event detected by the audio sensor system and a time of the detection of the projectile by the radar system, the processor is further configured to select a time difference threshold, and relate the time difference threshold to a difference between a time predicted by the state vector when backtracked to a terrain and the detected time of the weapon firing event detected by the audio sensor system, the processor is further configured to select a position difference threshold, and relate the position difference threshold to a difference between a location predicted by the state vector when backtracked to a terrain and a location predicted by the state vector when backtracked to the detected time of the weapon firing event detected by the audio sensor system, and/or the processor is further configured to select an angle difference threshold, and relate the angle difference threshold to a difference between an angle to the projectile identified by the radar system and an angle to the weapon identified by the audio sensor system.
In a further aspect of the invention, an article comprises: at least one computer-readable medium containing non-transitory stored instructions that enable a machine to perform: detecting a weapon firing event with an audio sensor system, the detected weapon firing event indicative of a detected firing of the weapon and indicative of a detected time of the weapon firing event, detecting a projectile fired from the weapon with a radar system, calculating a state vector associated with the projectile detection, identifying a location of the weapon by backtracking the state vector to the detected time of the weapon firing event, and communicating the location of the weapon. The article can further include instructions for correlating the weapon firing event detected by the audio sensor system with the detection of the projectile by the radar system to determine if the weapon firing event detected by the audio sensor system corresponds to the same projectile as that detected by the radar system.
BRIEF DESCRIPTION OF THE DRAWING
The accompanying drawings illustrate embodiments of the devices and methods disclosed herein and together with the description, serve to explain the principles of the present disclosure.
FIG. 1 is a schematic diagram illustrating a weapon locating system according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of a weapon locating system according to an embodiment of the present invention;
FIG. 3 is a diagram illustrating the trajectory of a projectile fired from a weapon;
FIG. 4 is a flowchart illustrating a method of locating a weapon using an audio augmented weapon locating system having a radar system in combination with an audio sensor system according to an embodiment of the present invention;
FIG. 5 is a graph of down range error for audio-augmented and non-augmented conventional weapon locating systems at various quadrant elevations;
FIG. 6 is a schematic representation of an exemplary weapon locating system combining optical and/or audio sensor data with a radar system; and
FIG. 7 is a schematic representation of an exemplary computer that can perform at least a portion of the processing described herein.
DETAILED DESCRIPTION
FIG. 1 shows an exemplary weapon locating system 100 including a radar system 102 and an audio sensor system 104 that communicate signals to a signal processing system 106. The signal processing system 106 can transmit information derived from the received signals to a response system 108.
As described in more detail below, a conventional weapon locating system having only a radar system can experience reduced accuracy when detecting and tracking weapons fired at low quadrant elevations (QE). Quadrant elevation (QE) is a commonly used artillery term for the angle between the elevation of the bore of the gun or launcher and the local horizontal. However, when a conventional weapon locating system is combined with an audio sensor system, data generated by the combination of systems (i.e., an audio augmented weapon locating system) can be processed to more accurately determine the location of the fired weapon, particularly at low quadrant elevations (QEs).
The radar system 102 can be capable of detecting and tracking one or more projectiles fired from a weapon. In some embodiments, the radar system 102 can be a phased array radar system, also known as an electronically scanned array ("ESA"), which is a type of radar system that uses multiple antennas to transmit and/or receive radio frequency (RF) signals at shifted relative phases. The phase shifting allows the RF energy to be formed into transmit and/or receive beams that can be electronically "steered" without the need to physically move components of the radar system. Examples of such a phased array radar system used in a conventional non-augmented weapon locating system include the AN/TPQ-36 and the AN/TPQ-37 Firefinder Weapon Locating Systems manufactured by Raytheon Company of Waltham, Mass.
In some embodiments, the phased array can be comprised of transmit and/or receive elements disposed within a common assembly. In other embodiments, the phased array can be comprised of transmit and/or receive antennas that are spatially separated and not disposed within a common assembly. Although phased array radar systems can be an effective choice for the radar system 102, other types of radar systems may also be suitable. The radar system 102 can be stationary or mounted on a mobile platform.
In one embodiment, the radar system 102 includes an antenna system 110, one or more transmitters 112, and one or more receivers 114. In some embodiments, the transmit and receive functions can be provided by a combined transmit/receive module. While not shown for clarity, it will be understood that the radar system 102 can also include various components such as controllers, duplexers, oscillators, mixers, amplifiers, synchronizers, modulators, antenna positioning systems, power supply systems, data storage devices, and signal pre-processing equipment.
The audio sensor system 104 can comprise any type of sensor system capable of detecting and processing sound information. For example, the audio sensor system 104 can include one or more transducers 116 sensitive to sound information for a given frequency range. It is understood that a series of transducers optimized for particular frequency ranges can be used to cover a desired aggregate frequency range. A signal processing module 118 can process the sensor information, as described more fully below.
A variety of sound sensor types can be used. The sensors may be located on the radar or the sensors may be located to form a network of sensors for determining the azimuth and elevation of a projectile solution. It is understood that the temperature and humidity of the environment can be used to calculate the speed of sound. In one embodiment, the system can process audio profiles of different types of devices to determine the type of device fired. Such profiles can be stored in a database, for example.
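As one illustration of the temperature dependence mentioned above, the speed of sound in dry air can be approximated from temperature alone. This is a standard ideal-gas approximation; the humidity correction the text also mentions is omitted here for brevity.

```python
import math

def speed_of_sound(temp_c):
    """Approximate speed of sound in dry air, in m/s, from air temperature
    in degrees Celsius (ideal-gas approximation; humidity ignored)."""
    return 331.3 * math.sqrt(1.0 + temp_c / 273.15)

# e.g., at 20 C the speed of sound is roughly 343 m/s
```

Multiplying the speed of sound by a measured propagation delay yields the range from a sensor to the firing event, which is one ingredient of a networked azimuth/elevation solution.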
As used herein, the term “state vector” is used to describe a collection of parameters (i.e., one or more parameters) that correspond to set of characteristics of a moving projectile. The one or more state parameters within a state vector can include, but are not limited to, a position (in a coordinate system), a time, a speed, a heading (or three dimensional velocity vector), and acceleration in one or more dimensions, of the moving projectile.
As used herein, the term “backtracking” is used to describe a process by which one or more state vectors, each describing one or more parameters associated with a projectile at a respective one or more positions along a trajectory, can be extrapolated backward in time and space to identify a state vector associated with the projectile at an earlier point along the trajectory. The state vector at the earlier time and space can include both an earlier time and a location of the projectile at the earlier time. As used herein, the term “terrain” is used to describe topographical characteristics of the earth's surface. The terrain can be represented by numerical values.
In general, electromagnetic radiation is classified by wavelength into radio, microwave, infrared, visible, ultraviolet, X-rays, and gamma rays, in order of decreasing wavelength. As used herein, the term "light" is used to describe at least electromagnetic radiation having a wavelength in the infrared, visible, or ultraviolet portions of the electromagnetic spectrum. Similarly, the term "optical" is used herein to describe a system or component (e.g., a sensor) that interacts with or that processes the infrared, visible, or ultraviolet portions of the electromagnetic spectrum. As used herein, "sound" is used to describe vibrations that travel through the air or another medium.
FIG. 2 shows an exemplary system 200 having at least one radar system 202 a-N and at least one audio sensor system 204 a-M. The radar systems 202 and audio sensor systems 204 are coupled to a signal processor 206 that can fuse the radar and audio sensor information to locate a weapon system 20 firing a projectile 22. The signal processor 206 can generate an output signal indicative of one or more of a time of a detected firing event, an azimuth bearing of the detected firing event, or an elevation angle of the detected firing event.
In one embodiment, the audio sensor systems 204 are located proximate or attached to a respective radar system. In an alternative embodiment, the audio sensors are located at positions separated from the radar systems. In one embodiment, audio sensor systems 204 can be launched and/or dropped into an area suspected to contain a weapon system. Each audio sensor system 204 can include a GPS receiver to determine its location and report this and other information to the signal processor 206 or other entity. In other embodiments, audio sensors are remote from other components of the audio sensor systems.
The signal processing system 206 can include one or more computer processors, a data storage system, an output interface, a network interface, and software for processing the signals received from the audio sensor system(s) 204 and the radar system(s) 202. Other hardware, firmware, and software can also be incorporated into the signal processing system 206. The signal processing system 206 can also include a communication system for transmitting, via either wired or wireless connection, data to a response system. The communicated data may include a set of fired weapon location coordinates.
The response system 208 can include, for example, a counter fire weapon system capable of returning fire to the location of the fired weapon, a friendly fire detection system capable of determining the location of allied forces, or a threat assessment system for use by peace-keeping or law enforcement agencies to determine a location for follow-up investigation or patrol.
As described more fully below, the weapon locating system 200 can be used to determine a location from which a weapon is fired. The weapon may fire any type of projectiles including shells, shot, missiles, or rockets.
Referring now to FIG. 3, when a weapon fires a projectile from a location 330, the projectile follows a trajectory 332. While the trajectory is shown to be a straight line, it will be understood that the trajectory need not be a straight line.
A quadrant elevation (QE) is an angle between an axis 336 lying in a horizontal plane and the axis of the bore of the weapon fired from the location 330. A firing azimuth, α, is the angle formed between the axis 336 and an axis 331 extending from the firing location 330 to the radar system 102 (of the weapon locating system 100).
Acquisition, i.e. detection and tracking, of the fired projectile by the radar system 102 can occur at a location 338 along the trajectory 332. Associated with the location 338, the radar system 102 can generate a state vector that describes one or more characteristics of the projectile and of the trajectory of the projectile. The radar system 102 can make other detections at other points along the trajectory and can form other associated state vectors.
Conventionally, without use of the audio sensor system 104, the radar system 102 and signal processing system 106 can backtrack the resulting one or more state vectors to identify a state vector that intersects the terrain. Conventionally, the intersection can identify the location 330 of the weapon that fired the projectile. However, particularly at low QE, the identification of the location 330 is not precise.
An error associated with the radar system 102 (without use of the audio sensor system 104, and neglecting radar ranging errors that are generally small compared to angular errors) can be characterized as an "error ellipse" 340 lying along the line-of-fire having a down range error component σDOWN 342 and a cross range error component σCROSS 344. The error components 342, 344 of the error ellipse 340 can be calculated as follows:
$$\sigma_{DOWN}=\sqrt{\frac{VR\cdot\sigma_{\varepsilon}^{2}+\sigma_{\varepsilon\text{-}bias}^{2}}{\tan^{2}(QE)}+\left(VRR\cdot\sigma_{\eta}^{2}+\sigma_{\eta\text{-}bias}^{2}\right)\sin^{2}(\alpha)}$$

$$\sigma_{CROSS}=\sqrt{\left(VR\cdot\sigma_{\eta}^{2}+\sigma_{\eta\text{-}bias}^{2}\right)\cos^{2}(\alpha)}$$

$$VRR=\frac{1}{N}\left[1+12\left(\frac{N-1}{N+1}\right)\left(\frac{T_{BACK}}{T_{TRACK}}+0.5\right)^{2}\right]$$
    • where:
    • σε=random component (1 sigma) of the estimated target height in meters;
    • σε-bias=bias component of the estimated target height in meters;
    • ση=random component (1 sigma) of the estimated azimuth error in meters;
    • ση-bias=bias component of the estimated azimuth error in meters;
    • α=firing azimuth angle in radians;
    • VR=VRR (in this example a single filter type is used);
    • VRR=Variance reduction from filter smoothing (non-dimensional);
    • TBACK=that portion of the projectile flight time over which backward extrapolation is required, in seconds;
    • TTRACK=the total time the projectile is tracked by the radar, in seconds;
    • N=number of measurements processed by the radar.
As can be seen from the equations above, as QE becomes small and approaches zero (i.e., direct fire), the error in locating the fired weapon using the radar system 102, particularly the down range error component σDOWN 342, becomes greatly exaggerated and approaches infinity. Minimizing the error components improves the accuracy with which the weapon firing location 330 can be determined.
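The growth of the down range error at low QE can be checked numerically. The sketch below implements the σDOWN expression above under the assumption that the squared terms combine in root-sum-square fashion; the parameter names follow the listed definitions, and the sample values are arbitrary.

```python
import math

def down_range_error(qe_rad, alpha_rad, sig_e, sig_e_bias, sig_a, sig_a_bias, vr, vrr):
    """Down range error component (1 sigma, meters) of the error ellipse,
    assuming a root-sum-square combination of the squared terms."""
    elev_term = (vr * sig_e**2 + sig_e_bias**2) / math.tan(qe_rad)**2
    az_term = (vrr * sig_a**2 + sig_a_bias**2) * math.sin(alpha_rad)**2
    return math.sqrt(elev_term + az_term)

# Illustrative (arbitrary) values: 10 m random, 5 m bias errors, VR = VRR = 1
low_qe = down_range_error(math.radians(5.0), 0.5, 10.0, 5.0, 10.0, 5.0, 1.0, 1.0)
high_qe = down_range_error(math.radians(45.0), 0.5, 10.0, 5.0, 10.0, 5.0, 1.0, 1.0)
# low_qe is far larger than high_qe, matching the low-QE breakdown described above
```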
FIG. 4 shows an exemplary sequence of steps 450 for determining the location of a weapon firing system by combining radar and audio sensor information in accordance with exemplary embodiments of the invention.
As further discussed above, conventional backtracking of a state vector associated with a sequence of radar measurements extrapolates the state vector backward until it intersects the terrain. The terrain intersection provides both a prediction in space of the location of a weapon and a prediction in time of when the weapon was fired. In contrast, techniques described below backtrack the state vector to the weapon firing time identified by the above-described audio sensor system 104. This intersection in time likewise provides a prediction in space of the location of the weapon, together with a time of firing that is known from the audio detection rather than predicted.
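The conventional terrain-intersection backtracking can be sketched as follows (flat terrain, constant gravity, and no drag, a deliberate simplification relative to a fielded ballistic estimator; all names are illustrative):

```python
import math

def backtrack_to_terrain(pos, vel, t_radar, h_terrain, g=9.81):
    """Conventional backtracking: solve for the earlier time at which a
    drag-free ballistic path through state (pos, vel) at time t_radar
    intersects flat terrain at height h_terrain (all units meters, m/s,
    seconds). Returns (estimated firing time, x, y) of the launch point."""
    vz, z0 = vel[2], pos[2]
    disc = vz**2 + 2.0 * g * (z0 - h_terrain)  # quadratic discriminant
    dt = (vz - math.sqrt(disc)) / g            # negative root: backward in time
    x = pos[0] + vel[0] * dt
    y = pos[1] + vel[1] * dt
    return (t_radar + dt, x, y)
```

Unlike the audio-augmented method, both the firing time and the firing location here are predictions, and at low QE the shallow intersection geometry makes both predictions unreliable.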
Thus, both the conventional non-augmented weapon locating system and the audio-augmented weapon locating system can provide both a prediction of a location of a weapon and either a prediction of or knowledge of, a time of firing of the weapon.
In step 452, a weapon firing event occurs, resulting in sound that propagates both directly and indirectly from the weapon to the audio sensor system, for example, to the audio sensor system 104 of FIG. 1. The sound tends to have a time duration in accordance with the type of weapon fired. For example, if the sound occurs due to a gun firing event, the sound can have a duration on the order of milliseconds. The sound can result from an explosive event, e.g., a gunshot, or a controlled firing event, e.g., a rocket.
In general, the beginning of the sound is indicative of a weapon firing event; however, for some types of weapon firing events, it may be desirable to mark the time of the weapon firing event as being somewhat later, for example, when the sound is generated by a prolonged rocket blast and the projectile is a rocket that accelerates relatively slowly.
The audio sensor system 104 may use the time duration of the audio event to classify the event as to the sound's source. The time history may be used to discriminate out non-firings or to classify the firing event as to type (e.g., rocket, mortar, artillery, etc.). When classification data is available to the radar system 102, the radar system can use the classification data to improve ballistic estimator performance through better modeling of the projectile, to adjust the projectile's firing time to reflect the time the projectile actually began to leave the launch platform (for example, rockets may require time to build thrust), and to eliminate possible false radar detections.
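A duration-based classifier of the kind described above might look like the following sketch; the duration bands are hypothetical placeholders for illustration, not values taken from the text.

```python
def classify_firing_event(duration_s):
    """Classify an audio firing event by its time duration, in seconds.
    The thresholds below are hypothetical placeholders."""
    if duration_s < 0.05:
        return "gunshot"           # impulsive muzzle blast, milliseconds
    if duration_s < 0.5:
        return "artillery/mortar"
    return "rocket"                # prolonged motor burn while building thrust
```

In practice the full time history (and stored audio profiles, as mentioned earlier) would drive classification, not a single duration value.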
At step 454, an audio detection of a weapon firing event occurs when the audio sensor system 104 detects the sound associated with the weapon firing event 452. At step 456, the time of the detection event 454 can be stored, for example, in a memory device associated with the audio sensor system 104 or with the signal processing system 106. In some embodiments, the audio sensor system 104 is directional, in which case a bearing (i.e., direction) associated with the detection event can also be stored. The audio sensor system 104 may have directional measurement capability in both bearing and elevation. When the audio sensor system 104 can measure direction, a means of aligning the audio system with the radar is provided. For example, a microphone array can perform beamforming to determine event direction.
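One simple stand-in for the beamforming mentioned above is a two-microphone time-difference-of-arrival (TDOA) bearing estimate under a far-field assumption; the function and its parameters are illustrative, not the patented approach.

```python
import math

def bearing_from_tdoa(delta_t, mic_spacing, c=343.0):
    """Estimate the bearing (degrees from array broadside) of a sound
    source from the time difference of arrival delta_t (seconds) between
    two microphones spaced mic_spacing meters apart, assuming a far-field
    plane wave arriving at speed of sound c (m/s)."""
    ratio = max(-1.0, min(1.0, c * delta_t / mic_spacing))  # clamp rounding
    return math.degrees(math.asin(ratio))
```

A larger array with full delay-and-sum beamforming would refine this to a bearing and elevation measurement, as the text notes.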
At step 458, a common time base for the radar system 102 and for the audio sensor system 104 is established, for example using a global positioning system (GPS), inter-range instrumentation group (IRIG) time codes, or still another type of common clock. The common clock can be absolute or relative. This synchronization of the times for the systems 102, 104 will generally occur before the weapon firing event 452 and can be scheduled to occur with regularity so that the systems stay synchronized.
At step 460, a radar acquisition event, i.e., a radar detection of a projectile, occurs in which state information, such as altitude, speed, direction, acceleration, and time, associated with the projectile at the location 338 (FIG. 3) along the trajectory 332 (FIG. 3) is collected. The radar acquisition event, i.e., the radar detection, can occur after the weapon firing event detected by the audio sensor system 104 at step 454.
At step 462, the radar system 102, either alone or together with the signal processing system 106, forms a state vector using the state information associated with the projectile at location 338. At step 464, a correlation between the audio detection event 454 and the radar acquisition event 460 is made, if possible, using the stored event time and, in some embodiments, bearing data.
The correlation can be made in a number of ways. For example, in some embodiments, the correlation is made by comparing the time of the audio detection event with a time of the radar detection. A time difference threshold can be established based upon the environment (i.e., the application) in which the system is used, and any time difference less than the time difference threshold can be indicative of a correlation. For example, if the system is used to detect locations of missile firing events, the time difference threshold can be relatively large, for example, 5 seconds. For another example, if the system is used to detect locations of close range gun firing events, the time difference threshold can be relatively small, for example, 0.1 seconds. Other time difference thresholds are possible.
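The time-difference gate just described amounts to a one-line test; this minimal sketch assumes both timestamps are on the common time base established earlier, and the threshold values are the application-chosen examples from the text.

```python
def events_correlate(t_audio, t_radar, threshold_s):
    """Return True when an audio firing-event detection at t_audio and a
    radar projectile detection at t_radar (both in seconds on the common
    time base) are close enough in time to be treated as the same
    projectile. The threshold is application-chosen, e.g., on the order
    of 5 s for missiles or 0.1 s for close-range gunfire."""
    return abs(t_radar - t_audio) < threshold_s
```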
As described above and further below, backtracking the state vector provides both a weapon firing location estimate and a firing time estimate.
In other embodiments, correlation can be established by comparing the time of the weapon firing event detected by the audio sensor system 104 (FIG. 1) with a time of the radar detection of the projectile when the state vector of the projectile is conventionally backtracked to intersect the terrain. A time difference threshold can be established based upon the environment (i.e., the application) in which the system is used, and any time difference less than the time difference threshold can be indicative of a correlation. For example, if the system is used to detect locations of missile firing events, the time difference threshold can be relatively large, for example, 5 seconds. For another example, if the system is used to detect locations of close range gun firing events, the time difference threshold can be relatively small, for example, 0.1 seconds. Other time difference thresholds are possible.
In other embodiments, correlation can be established by comparing results of the conventional backtracking of the radar state vector to intersect the terrain with the backtracking described herein and below that backtracks the radar state vector to a point in time (and resulting space) established by the audio sensor system 104. As described above, both methods generate a prediction of a position from which the weapon was fired. A position difference threshold can be established based upon the environment (i.e., the application) in which the system is used, and any position difference less than the position difference threshold can be indicative of a correlation. Other position difference thresholds are possible.
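The position-difference test can be sketched as a horizontal distance comparison between the two predicted firing positions. The 150 m threshold below is hypothetical; the patent leaves the value to the application:

```python
import math

def correlates_in_position(terrain_backtrack_xy: tuple,
                           audio_backtrack_xy: tuple,
                           threshold_m: float) -> bool:
    """Compare the firing position from conventional terrain backtracking
    with the position from backtracking to the audio-reported firing time;
    a small separation indicates the two detections belong together."""
    dx = terrain_backtrack_xy[0] - audio_backtrack_xy[0]
    dy = terrain_backtrack_xy[1] - audio_backtrack_xy[1]
    return math.hypot(dx, dy) < threshold_m

# Hypothetical 150 m threshold; the two predictions here differ by ~112 m.
print(correlates_in_position((1000.0, 2000.0), (1100.0, 2050.0), 150.0))  # True
```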
With regard to the above correlation, which backtracks the state vector in both the conventional way and the audio-enhanced way, it will be understood from the discussion above that, for low QEs, the conventional method of estimating the weapon position may have very large errors; thus, the above correlation using a position difference threshold may apply only to QEs above a threshold QE.
In still other embodiments for which the audio sensor system 104 provides directional information, e.g., azimuth bearing and/or elevation angle of a detected firing event, correlation can be established by comparing the azimuth bearing and/or the elevation angle reported by the audio sensor system 104 with the azimuth bearing and/or the elevation angle reported by the radar system 102. An azimuth angle difference threshold and/or an elevation angle difference threshold can be established based upon the environment (i.e., the application) in which the system is used, and any azimuth angle difference and/or elevation angle difference less than the azimuth angle difference threshold and/or the elevation angle difference threshold can be indicative of a correlation. For example, if the system is used to detect locations of missile firing events, the azimuth angle difference threshold and/or the elevation angle difference threshold can be relatively large, for example, both 10.0 degrees. For another example, if the system is used to detect locations of close range gun firing events, the azimuth angle difference threshold and/or the elevation angle difference threshold can be relatively small, for example, both 1.0 degrees. Other angle difference thresholds are possible.
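The angle test can be sketched as below; note that the azimuth comparison must handle the 0/360 degree wrap-around, a detail the patent does not spell out. Function names are hypothetical, and the thresholds are the examples given above:

```python
def angular_difference_deg(a_deg: float, b_deg: float) -> float:
    """Smallest absolute difference between two bearings, handling the
    0/360 degree wrap-around in azimuth."""
    d = abs(a_deg - b_deg) % 360.0
    return min(d, 360.0 - d)

def correlates_in_angle(radar_az: float, radar_el: float,
                        audio_az: float, audio_el: float,
                        az_threshold_deg: float,
                        el_threshold_deg: float) -> bool:
    """Correlate when both the azimuth and elevation reports agree to
    within the application thresholds."""
    return (angular_difference_deg(radar_az, audio_az) < az_threshold_deg and
            abs(radar_el - audio_el) < el_threshold_deg)

# Close-range gun thresholds from the description (1.0 degree each);
# bearings of 359.6 and 0.2 degrees differ by only 0.6 degrees.
print(correlates_in_angle(359.6, 20.0, 0.2, 20.4, 1.0, 1.0))  # True
```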
It should be understood that any one or more of the above-described techniques can be used to identify a correlation between radar detected events and firing events detected by the audio sensor system 104. Some correlations can be deemed to be primary and others can be deemed to be secondary, in any combination. Other correlation techniques are also possible, including techniques that make use of the directional capability of some audio sensor systems.
At step 466, a determination is made as to whether the weapon firing event detected by the audio sensor system at block 454 and the radar acquisition event detected by the radar system 102 at block 460 correlate. If a correlation cannot be made, at step 468 a potential (and possibly less accurate) weapon firing location can be determined by conventionally backtracking the state vector until it intersects the terrain topography. Extrapolation techniques can be used to perform the backtracking.
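The conventional backtracking of step 468 can be sketched as a reverse-time extrapolation under gravity. This is a simplified ballistic model and is not the patent's prescribed implementation: drag is ignored and the terrain is treated as a single flat altitude, whereas a fielded system would consult terrain topography at each step:

```python
G = 9.80665  # gravitational acceleration, m/s^2

def backtrack_to_terrain(pos, vel, terrain_alt_m: float, dt: float = 0.01):
    """Step the radar state vector backwards in time until the trajectory
    reaches the terrain altitude; returns (x, y, backtrack_seconds)."""
    x, y, z = pos
    vx, vy, vz = vel
    t_back = 0.0
    while z > terrain_alt_m:
        # Reverse-time Euler step under constant gravity (z axis up).
        x -= vx * dt
        y -= vy * dt
        z -= vz * dt
        vz += G * dt   # earlier in flight, the upward speed was larger
        t_back += dt
    return x, y, t_back
```

For a projectile acquired about 377 m up and still climbing at about 51 m/s, this sketch backtracks roughly 5 s to ground level; for shallow (low-QE) trajectories, small state-vector errors stretch into the large down-range position errors discussed above.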
If a correlation can be made, at step 470 a more accurate weapon firing location can be determined by the audio augmented weapon locating system 100 (FIG. 1) by backtracking the state vector, not to an intersection with the terrain, but instead to an intersection in time with the weapon firing event time identified by the audio sensor system 104. Extrapolation and/or interpolation techniques can be used to perform the backtracking. Geographic coordinates associated with the backtracked state vector at the weapon firing event time identified by the audio sensor system 104 are calculated to establish a likely location of the weapon that fired the projectile. This calculated location of the weapon is more accurate than that described above using the radar system 102 alone.
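The audio-augmented backtracking of step 470 can likewise be sketched in closed form: with the firing time supplied by the audio sensor system, the state vector is run back over a known time interval rather than to a terrain intersection. The same simplified constant-gravity model is assumed, with drag ignored:

```python
G = 9.80665  # gravitational acceleration, m/s^2

def backtrack_to_firing_time(pos, vel, t_radar_s: float, t_fire_s: float):
    """Backtrack the radar state vector to the firing time reported by
    the audio sensor system (constant-gravity kinematics, drag ignored).

    Componentwise: p0 = p - v*dt + 0.5*a*dt**2, with a = (0, 0, -G).
    """
    dt = t_radar_s - t_fire_s   # time of flight at radar acquisition
    x, y, z = pos
    vx, vy, vz = vel
    x0 = x - vx * dt
    y0 = y - vy * dt
    z0 = z - vz * dt - 0.5 * G * dt * dt
    return x0, y0, z0
```

Because the flight time comes from the audio event rather than from extrapolating until the trajectory meets the terrain, a shallow (low-QE) trajectory no longer inflates the down-range error in the manner described above.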
At step 472, the coordinates associated with the likely location of the fired weapon are communicated to the response system 108 (FIG. 1). The response system 108 can direct a counter fire weapon capable of returning fire to the location of the fired weapon. Alternatively, the response system 108 can associate the location of the fired weapon with friendly fire from allied forces for purposes of mapping the location of allied forces. In still another alternative, the response system 108 can map the location of the fired weapon for use by peace-keeping or law enforcement agencies to determine a geographic area for follow-up investigation or patrol.
Referring now to FIG. 5, a graph 580 compares down range error at various quadrant elevations for one example of a non-augmented conventional weapon locating system having only a radar system and one example of an audio augmented weapon locating system having both a radar system and an audio sensor system. Alternative embodiments of audio-augmented and non-augmented conventional weapon locating systems may result in different error responses. In the illustrative embodiment of FIG. 5, the plotted data set 582 shows a relationship between the QE of a firing weapon and the down range error component (σDOWN) associated with the use of a non-augmented conventional weapon locating system having the radar system 102 but not the audio sensor system 104. The graph 580 shows the general relationship between QE and down range error associated with the non-augmented conventional weapon locating system. As the QE of the firing weapon approaches 0, the error increases exponentially, rendering the non-augmented conventional weapon locating system relatively ineffectual in determining the location of the firing weapon.
A plotted data set 584 shows a relationship between the QE of the fired weapon and the down range error component associated with the use of an audio augmented weapon locating system, e.g., 100 of FIG. 1, having both a radar system and an audio sensor system. The down range comparison graph 580 shows that backtracking the state vector of a projectile, as formed by a non-augmented conventional weapon locating system, to a time associated with a weapon firing event, as detected by an audio sensor system in an audio-augmented weapon locating system, results in improved weapon locating for low QE values of the firing weapon. It is understood that the results presented in FIG. 5 are just one example of audio-augmented and non-augmented conventional weapon locating system down range error values. For alternative embodiments, using different radar and audio sensor systems, the results may vary.
FIG. 6 shows an exemplary weapon locating system 600 including a radar system 602 and an optical sensor system 604, both of which communicate signals to a signal processing system 606. The optical sensor system 604 and/or an audio sensor system 616 can provide information to the signal processing system 606. Optionally, the signal processing system 606 can transmit information derived from the received signals to a response system 608. The optical sensor system 604 is also referred to as an electro-optical (EO) system herein. The optical sensor system 604 can be any type of optical sensor system capable of detecting and processing light. For example, the optical sensor system 604 can comprise an electro-optical (EO) sensor system.
FIG. 7 shows an exemplary computer 700 that can perform at least part of the processing described herein. The computer 700 includes a processor 702, a volatile memory 704, a non-volatile memory 706 (e.g., hard disk), an output device 707 and a graphical user interface (GUI) 708 (e.g., a mouse, a keyboard, a display). The non-volatile memory 706 stores computer instructions 712, an operating system 716 and data 718. In one example, the computer instructions 712 are executed by the processor 702 out of volatile memory 704. In one embodiment, an article 720 comprises non-transitory computer-readable instructions.
Processing may be implemented in hardware, software, or a combination of the two. Processing may be implemented in computer programs executed on programmable computers/machines, each of which includes a processor, a storage medium or other article of manufacture that is readable by the processor (including volatile and non-volatile memory and/or storage elements), at least one input device, and one or more output devices. Program code may be applied to data entered using an input device to perform processing and to generate output information.
The system can perform processing, at least in part, via a computer program product (e.g., in a machine-readable storage device), for execution by, or to control the operation of, data processing apparatus (e.g., a programmable processor, a computer, or multiple computers). Each such program may be implemented in a high level procedural or object-oriented programming language to communicate with a computer system. However, the programs may be implemented in assembly or machine language. The language may be a compiled or an interpreted language and it may be deployed in any form, including as a stand-alone program or as a module, component, subroutine, or other unit suitable for use in a computing environment. A computer program may be deployed to be executed on one computer or on multiple computers at one site or distributed across multiple sites and interconnected by a communication network. A computer program may be stored on a storage medium or device (e.g., CD-ROM, hard disk, or magnetic diskette) that is readable by a general or special purpose programmable computer for configuring and operating the computer when the storage medium or device is read by the computer. Processing may also be implemented as a machine-readable storage medium, configured with a computer program, where upon execution, instructions in the computer program cause the computer to operate.
Processing may be performed by one or more programmable processors executing one or more computer programs to perform the functions of the system. All or part of the system may be implemented as special-purpose logic circuitry (e.g., an FPGA (field programmable gate array) and/or an ASIC (application-specific integrated circuit)).
All references cited herein are hereby incorporated herein by reference in their entirety. Having described preferred embodiments, which serve to illustrate the various concepts, structures, and techniques that are the subject of this patent, it will now become apparent to those of ordinary skill in the art that other embodiments incorporating these concepts, structures and techniques may be used. Accordingly, it is submitted that the scope of the patent should not be limited to the described embodiments but rather should be limited only by the spirit and scope of the following claims.

Claims (12)

What is claimed is:
1. A method of locating a weapon, comprising:
detecting a weapon firing event with an audio sensor system, the detected weapon firing event indicative of a detected firing of the weapon and indicative of a detected time of the weapon firing event;
detecting a projectile fired from the weapon with a radar system;
employing a processor configured for:
correlating the weapon firing event detected by the audio sensor system with the detection of the projectile by the radar system to determine if the weapon firing event detected by the audio sensor system corresponds to the same projectile as that detected by the radar system, wherein the correlating comprises:
selecting a time difference threshold; and
relating the time difference threshold to a difference between the detected time of the weapon firing event detected by the audio sensor system and a time of the detection of the projectile by the radar system;
calculating a state vector associated with the projectile detection;
identifying a location of the weapon by backtracking the state vector to the detected time of the weapon firing event; and
communicating the location of the weapon.
2. The method of claim 1, further comprising generating a common time base for the weapon firing event and for the projectile detection.
3. The method of claim 1, wherein the audio sensor system comprises an audio sensor.
4. The method of claim 1, wherein the audio sensor system is directional and provides at least one of an azimuth angle of the detected weapon firing event or an elevation angle of the detected weapon firing event.
5. The method of claim 1, wherein the step of detecting the weapon firing event with the audio sensor system comprises detecting the weapon firing event by direct path detection of audio generated by the weapon firing event.
6. A method of locating a weapon, comprising:
detecting a weapon firing event with an audio sensor system, the detected weapon firing event indicative of a detected firing of the weapon and indicative of a detected time of the weapon firing event;
detecting a projectile fired from the weapon with a radar system;
employing a processor configured for:
correlating the weapon firing event detected by the audio sensor system with the detection of the projectile by the radar system to determine if the weapon firing event detected by the audio sensor system corresponds to the same projectile as that detected by the radar system, wherein the correlating comprises:
selecting a time difference threshold; and
relating the time difference threshold to a difference between a time predicted by the state vector when backtracked to a terrain and the detected time of the weapon firing event detected by the audio sensor system;
calculating a state vector associated with the projectile detection;
identifying a location of the weapon by backtracking the state vector to the detected time of the weapon firing event; and
communicating the location of the weapon.
7. A method of locating a weapon, comprising:
detecting a weapon firing event with an audio sensor system, the detected weapon firing event indicative of a detected firing of the weapon and indicative of a detected time of the weapon firing event;
detecting a projectile fired from the weapon with a radar system;
employing a processor configured for:
correlating the weapon firing event detected by the audio sensor system with the detection of the projectile by the radar system to determine if the weapon firing event detected by the audio sensor system corresponds to the same projectile as that detected by the radar system, wherein the correlating comprises:
selecting a position difference threshold; and
relating the position difference threshold to a difference between a location predicted by the state vector when backtracked to a terrain and a location predicted by the state vector when backtracked to the detected time of the weapon firing event detected by the audio sensor system;
calculating a state vector associated with the projectile detection;
identifying a location of the weapon by backtracking the state vector to the detected time of the weapon firing event; and
communicating the location of the weapon.
8. A method of locating a weapon, comprising:
detecting a weapon firing event with an audio sensor system, the detected weapon firing event indicative of a detected firing of the weapon and indicative of a detected time of the weapon firing event;
detecting a projectile fired from the weapon with a radar system;
employing a processor configured for:
correlating the weapon firing event detected by the audio sensor system with the detection of the projectile by the radar system to determine if the weapon firing event detected by the audio sensor system corresponds to the same projectile as that detected by the radar system, wherein the correlating comprises:
selecting an angle difference threshold; and
relating the angle difference threshold to a difference between an angle to the projectile identified by the radar system and an angle to the weapon identified by the audio sensor system;
calculating a state vector associated with the projectile detection;
identifying a location of the weapon by backtracking the state vector to the detected time of the weapon firing event; and
communicating the location of the weapon.
9. A weapon locating system, comprising:
an audio sensor system configured to detect a weapon firing event, the detected weapon firing event indicative of a detected firing of the weapon and indicative of a detected time of the weapon firing event;
a radar system configured to detect a projectile fired from the weapon;
a processor configured to:
correlate the weapon firing event detected by the audio sensor system with the detection of the projectile by the radar system to determine if the weapon firing event detected by the audio sensor system corresponds to the same projectile as that detected by the radar system, wherein the correlating comprises:
selecting a time difference threshold; and
relating the time difference threshold to a difference between the detected time of the weapon firing event detected by the audio sensor system and a time of the detection of the projectile by the radar system; and
calculate a state vector associated with the projectile detection and to backtrack the state vector to the detected time of the weapon firing event to identify the location of the weapon; and
a communication system configured to communicate the location of the weapon.
10. An article, comprising:
at least one computer-readable medium containing non-transitory stored instructions that enable a machine to perform:
detecting a weapon firing event with an audio sensor system, the detected weapon firing event indicative of a detected firing of the weapon and indicative of a detected time of the weapon firing event;
detecting a projectile fired from the weapon with a radar system;
correlating the weapon firing event detected by the audio sensor system with the detection of the projectile by the radar system to determine if the weapon firing event detected by the audio sensor system corresponds to the same projectile as that detected by the radar system, wherein the correlating comprises:
selecting a time difference threshold; and
relating the time difference threshold to a difference between the detected time of the weapon firing event detected by the audio sensor system and a time of the detection of the projectile by the radar system;
calculating a state vector associated with the projectile detection;
identifying a location of the weapon by backtracking the state vector to the detected time of the weapon firing event; and
communicating the location of the weapon.
11. The method of claim 1, further including receiving data from an optical sensor system to identify the location of the weapon.
12. The system of claim 9, further including an optical sensor system coupled to the radar system.
US14/068,318 2013-10-31 2013-10-31 Methods and apparatus for detection system having fusion of radar and audio data Active 2035-11-14 US9612326B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US14/068,318 US9612326B2 (en) 2013-10-31 2013-10-31 Methods and apparatus for detection system having fusion of radar and audio data


Publications (2)

Publication Number Publication Date
US20160223662A1 US20160223662A1 (en) 2016-08-04
US9612326B2 true US9612326B2 (en) 2017-04-04

Family

ID=56554111

Family Applications (1)

Application Number Title Priority Date Filing Date
US14/068,318 Active 2035-11-14 US9612326B2 (en) 2013-10-31 2013-10-31 Methods and apparatus for detection system having fusion of radar and audio data

Country Status (1)

Country Link
US (1) US9612326B2 (en)


Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9612326B2 (en) * 2013-10-31 2017-04-04 Raytheon Command And Control Solutions Llc Methods and apparatus for detection system having fusion of radar and audio data
KR20170064349A (en) * 2015-12-01 2017-06-09 삼성전자주식회사 Electronic device and method for implementing of service thereof
CN107554470B (en) * 2016-06-30 2021-11-19 罗伯特·博世有限公司 Apparatus and method for handling vehicle emergency status
US10572809B1 (en) * 2016-09-01 2020-02-25 Northrop Grumman Systems Corporation Multi-int maritime threat detection
EP3591427B1 (en) 2018-07-05 2023-06-14 HENSOLDT Sensors GmbH Missile alerter and a method for issuing a warning about a missile
CN114818836B (en) * 2022-06-29 2022-09-20 电科疆泰(深圳)科技发展有限公司 Shooting counting method and device, electronic equipment and storage medium

Citations (38)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3699341A (en) * 1968-09-23 1972-10-17 Gen Electric Muzzle flash detector
US4825216A (en) 1985-12-04 1989-04-25 Hughes Aircraft Company High efficiency optical limited scan antenna
US5703321A (en) * 1994-11-08 1997-12-30 Daimler-Benz Aerospace Ag Device for locating artillery and sniper positions
USH1916H (en) 1997-06-27 2000-11-07 The United States Of America As Represented By The Secretary Of The Navy Hostile weapon locator system
US6215731B1 (en) * 1997-04-30 2001-04-10 Thomas Smith Acousto-optic weapon location system and method
WO2001065197A1 (en) * 2000-02-25 2001-09-07 Tda Armements S.A.S. Device for protecting a field zone against enemy threats
US6621764B1 (en) * 1997-04-30 2003-09-16 Thomas Smith Weapon location by acoustic-optic sensor fusion
US6823621B2 (en) 2002-11-26 2004-11-30 Bradley L. Gotfried Intelligent weapon
US6839025B1 (en) * 2002-06-03 2005-01-04 Ken Reigle Precision direction finding sensing systems and methods
US6903676B1 (en) * 2004-09-10 2005-06-07 The United States Of America As Represented By The Secretary Of The Navy Integrated radar, optical surveillance, and sighting system
US20060028373A1 (en) 2004-08-06 2006-02-09 Time Domain Corporation System and method for active protection of a resource
US20060038678A1 (en) * 2002-06-10 2006-02-23 Shahar Avneri Security system and method
US7151478B1 (en) 2005-02-07 2006-12-19 Raytheon Company Pseudo-orthogonal waveforms radar system, quadratic polyphase waveforms radar, and methods for locating targets
US7239976B2 (en) 2005-08-24 2007-07-03 American Gnc Corporation Method and system for automatic pointing stabilization and aiming control device
US7239975B2 (en) 2005-04-02 2007-07-03 American Gnc Corporation Method and system for automatic stabilization and pointing control of a device
US7277046B2 (en) 2005-07-08 2007-10-02 Raytheon Company Single transmit multi-receiver modulation radar, multi-modulation receiver and method
US7394724B1 (en) * 2005-08-09 2008-07-01 Uzes Charles A System for detecting, tracking, and reconstructing signals in spectrally competitive environments
US20080191926A1 (en) * 2006-01-18 2008-08-14 Rafael - Armament Development Authority Ltd. Threat Detection System
US20080291075A1 (en) * 2007-05-25 2008-11-27 John Rapanotti Vehicle-network defensive aids suite
US20090009378A1 (en) * 2005-12-15 2009-01-08 Israel Aerospace Industries Ltd. System and Method of Analyzing Radar Information
US7532542B2 (en) 2004-01-20 2009-05-12 Shotspotter, Inc. System and method for improving the efficiency of an acoustic sensor
US20090260511A1 (en) 2005-07-18 2009-10-22 Trex Enterprises Corp. Target acquisition and tracking system
US20090295624A1 (en) * 2004-07-02 2009-12-03 Fredrik Tuxen Method and apparatus for determining a deviation between an actual direction of a launched projectile and a predetermined direction
US20110127328A1 (en) * 2008-10-23 2011-06-02 Warren Michael C Dual Band Threat Warning System
US8050141B1 (en) * 2008-01-15 2011-11-01 The United States Of America As Represented By The Secretary Of The Navy Direction finder for incoming gunfire
US8149156B1 (en) * 2008-05-20 2012-04-03 Mustang Technology Group, L.P. System and method for estimating location of projectile source or shooter location
US20120182837A1 (en) * 2007-05-24 2012-07-19 Calhoun Robert B Systems and methods of locating weapon fire incidents using measurements/data from acoustic, optical, seismic, and/or other sensors
US20120211562A1 (en) * 2007-06-08 2012-08-23 Raytheon Company Methods and apparatus for intercepting a projectile
US20130021194A1 (en) * 2010-03-31 2013-01-24 Qinetiq Limited System for the detection of incoming munitions
WO2013055422A2 (en) * 2011-07-21 2013-04-18 Raytheon Company Optically augmented weapon locating system and methods of use
US20130293406A1 (en) * 2012-05-03 2013-11-07 Lockheed Martin Corporation Preemptive signature control for vehicle survivability planning
US20140086454A1 (en) * 2012-09-24 2014-03-27 Marc C. Bauer Electro-optical radar augmentation system and method
US20140269199A1 (en) * 2013-03-14 2014-09-18 Supervene LLC System and method for detecting and responding to indoor shooters
EP2793043A1 (en) * 2013-04-18 2014-10-22 Airbus Defence and Space GmbH Determination of weapon locations and projectile trajectories by using automatic and hybrid processing of acoustic and electromagnetic detections
US20160223660A1 (en) * 2010-03-17 2016-08-04 George Mason Intellectual Properties, Inc. Cavity Length Determination Apparatus
US20160223662A1 (en) * 2013-10-31 2016-08-04 Richard S. Herbel Methods and Apparatus for Detection System Having Fusion of Radar and Audio Data
US20160291117A1 (en) * 2011-09-23 2016-10-06 Bitwave Pte Ltd Hostile fire detection for an airborne platform
US9494687B2 (en) * 2013-06-21 2016-11-15 Rosemount Aerospace Inc Seeker having scanning-snapshot FPA

Patent Citations (38)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3699341A (en) * 1968-09-23 1972-10-17 Gen Electric Muzzle flash detector
US4825216A (en) 1985-12-04 1989-04-25 Hughes Aircraft Company High efficiency optical limited scan antenna
US5703321A (en) * 1994-11-08 1997-12-30 Daimler-Benz Aerospace Ag Device for locating artillery and sniper positions
US6215731B1 (en) * 1997-04-30 2001-04-10 Thomas Smith Acousto-optic weapon location system and method
US6621764B1 (en) * 1997-04-30 2003-09-16 Thomas Smith Weapon location by acoustic-optic sensor fusion
USH1916H (en) 1997-06-27 2000-11-07 The United States Of America As Represented By The Secretary Of The Navy Hostile weapon locator system
WO2001065197A1 (en) * 2000-02-25 2001-09-07 Tda Armements S.A.S. Device for protecting a field zone against enemy threats
US6839025B1 (en) * 2002-06-03 2005-01-04 Ken Reigle Precision direction finding sensing systems and methods
US20060038678A1 (en) * 2002-06-10 2006-02-23 Shahar Avneri Security system and method
US6823621B2 (en) 2002-11-26 2004-11-30 Bradley L. Gotfried Intelligent weapon
US7532542B2 (en) 2004-01-20 2009-05-12 Shotspotter, Inc. System and method for improving the efficiency of an acoustic sensor
US20090295624A1 (en) * 2004-07-02 2009-12-03 Fredrik Tuxen Method and apparatus for determining a deviation between an actual direction of a launched projectile and a predetermined direction
US20060028373A1 (en) 2004-08-06 2006-02-09 Time Domain Corporation System and method for active protection of a resource
US6903676B1 (en) * 2004-09-10 2005-06-07 The United States Of America As Represented By The Secretary Of The Navy Integrated radar, optical surveillance, and sighting system
US7151478B1 (en) 2005-02-07 2006-12-19 Raytheon Company Pseudo-orthogonal waveforms radar system, quadratic polyphase waveforms radar, and methods for locating targets
US7239975B2 (en) 2005-04-02 2007-07-03 American Gnc Corporation Method and system for automatic stabilization and pointing control of a device
US7277046B2 (en) 2005-07-08 2007-10-02 Raytheon Company Single transmit multi-receiver modulation radar, multi-modulation receiver and method
US20090260511A1 (en) 2005-07-18 2009-10-22 Trex Enterprises Corp. Target acquisition and tracking system
US7394724B1 (en) * 2005-08-09 2008-07-01 Uzes Charles A System for detecting, tracking, and reconstructing signals in spectrally competitive environments
US7239976B2 (en) 2005-08-24 2007-07-03 American Gnc Corporation Method and system for automatic pointing stabilization and aiming control device
US20090009378A1 (en) * 2005-12-15 2009-01-08 Israel Aerospace Industries Ltd. System and Method of Analyzing Radar Information
US20080191926A1 (en) * 2006-01-18 2008-08-14 Rafael - Armament Development Authority Ltd. Threat Detection System
US20120182837A1 (en) * 2007-05-24 2012-07-19 Calhoun Robert B Systems and methods of locating weapon fire incidents using measurements/data from acoustic, optical, seismic, and/or other sensors
US20080291075A1 (en) * 2007-05-25 2008-11-27 John Rapanotti Vehicle-network defensive aids suite
US20120211562A1 (en) * 2007-06-08 2012-08-23 Raytheon Company Methods and apparatus for intercepting a projectile
US8050141B1 (en) * 2008-01-15 2011-11-01 The United States Of America As Represented By The Secretary Of The Navy Direction finder for incoming gunfire
US8149156B1 (en) * 2008-05-20 2012-04-03 Mustang Technology Group, L.P. System and method for estimating location of projectile source or shooter location
US20110127328A1 (en) * 2008-10-23 2011-06-02 Warren Michael C Dual Band Threat Warning System
US20160223660A1 (en) * 2010-03-17 2016-08-04 George Mason Intellectual Properties, Inc. Cavity Length Determination Apparatus
US20130021194A1 (en) * 2010-03-31 2013-01-24 Qinetiq Limited System for the detection of incoming munitions
WO2013055422A2 (en) * 2011-07-21 2013-04-18 Raytheon Company Optically augmented weapon locating system and methods of use
US20160291117A1 (en) * 2011-09-23 2016-10-06 Bitwave Pte Ltd Hostile fire detection for an airborne platform
US20130293406A1 (en) * 2012-05-03 2013-11-07 Lockheed Martin Corporation Preemptive signature control for vehicle survivability planning
US20140086454A1 (en) * 2012-09-24 2014-03-27 Marc C. Bauer Electro-optical radar augmentation system and method
US20140269199A1 (en) * 2013-03-14 2014-09-18 Supervene LLC System and method for detecting and responding to indoor shooters
EP2793043A1 (en) * 2013-04-18 2014-10-22 Airbus Defence and Space GmbH Determination of weapon locations and projectile trajectories by using automatic and hybrid processing of acoustic and electromagnetic detections
US9494687B2 (en) * 2013-06-21 2016-11-15 Rosemount Aerospace Inc. Seeker having scanning-snapshot FPA
US20160223662A1 (en) * 2013-10-31 2016-08-04 Richard S. Herbel Methods and Apparatus for Detection System Having Fusion of Radar and Audio Data

Non-Patent Citations (6)

* Cited by examiner, † Cited by third party
Title
"Boomerang: Beating The Ambush & Saving Lives", posted Sep. 1, 2007 by Patrick Durkin and filed under Tactical Weapons, Tactical Weapons Sep. 2007, 4 pages, www.tactical-life.com.
Notification of Transmittal of the International Search Report and the Written Opinion of the International Searching Authority, or the Declaration, PCT/US2012/047011, Date of Mailing Apr. 26, 2013, 4 pages.
P. Thumwarin, N. Wakayaphattaramanus, T. Matsuura and K. Yakoompai, "Audio forensics from gunshot for firearm identification," Information and Communication Technology, Electronic and Electrical Engineering (JICTEE), 2014 4th Joint International Conference on, Chiang Rai, 2014, pp. 1-4. *
U.S. Appl. No. 13/550,892, filed Jul. 17, 2012, 80 pages.
U.S. Appl. No. 13/550,892, filed Jul. 17, 2012, Rakeman.
Written Opinion of the International Searching Authority, PCT/US2012/047011, Date of Mailing Apr. 26, 2013, 7 pages.

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20220365198A1 (en) * 2021-05-17 2022-11-17 Jeremey M. Davis Systems and methods for radar detection having intelligent acoustic activation
US12025691B2 (en) * 2021-05-17 2024-07-02 Jeremey M. Davis Systems and methods for radar detection having intelligent acoustic activation

Also Published As

Publication number Publication date
US20160223662A1 (en) 2016-08-04

Similar Documents

Publication Publication Date Title
US9612326B2 (en) Methods and apparatus for detection system having fusion of radar and audio data
US9234963B2 (en) Optically augmented weapon locating system and methods of use
US7764217B2 (en) Surface RF emitter passive ranging accuracy confirmation algorithm
US4315609A (en) Target locating and missile guidance system
Seo et al. Effect of spoofing on unmanned aerial vehicle using counterfeited GPS signal
US6281841B1 (en) Direction determining apparatus
KR101248045B1 (en) Apparatus for identifying threat target considering maneuvering patterns
US8598501B2 (en) GPS independent guidance sensor system for gun-launched projectiles
US10191150B2 (en) High precision radar to track aerial targets
US20150301169A1 (en) A method and a device for determining the trajectory of a bullet emitted by a shotgun and for locating a shot position
US11199380B1 (en) Radio frequency / orthogonal interferometry projectile flight navigation
CN105467366A (en) Mobile platform cooperative locating device and mobile platform cooperative locating system
de Celis et al. Spot‐Centroid Determination Algorithms in Semiactive Laser Photodiodes for Artillery Applications
KR101750498B1 (en) Guidance system and method for guided weapon using inertial navigation
Carniglia et al. Investigation of sensor bias and signal quality on target tracking with multiple radars
US11385024B1 (en) Orthogonal interferometry artillery guidance and navigation
RU2722903C1 (en) Method of identifying a target using a radio fuse of a missile with a homing head
Janczak et al. Measurement fusion using maximum‐likelihood estimation of ballistic trajectories
RU2325306C1 (en) Method of data computing system operation of missile and device for its implementation
US8513580B1 (en) Targeting augmentation for short-range munitions
RU2484419C1 (en) Method to control characteristics of effective field of high-explosive warhead of missile and device for its realisation
RU2454678C1 (en) Coherent-pulse radar
RU2332634C1 (en) Method of functioning of information computation system of missile and device therefor
US11378676B2 (en) Methods and systems for detecting and/or tracking a projectile
RU2292523C2 (en) Mode of functioning of data-processing systems of rocket and arrangement for its execution

Legal Events

Date Code Title Description
AS Assignment

Owner name: THALES RAYTHEON SYSTEMS COMPANY, LLC, CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:HERBEL, RICHARD S.;RAKEMAN, JAMES W.;SIGNING DATES FROM 20131021 TO 20131025;REEL/FRAME:031533/0898

AS Assignment

Owner name: RAYTHEON COMMAND AND CONTROL SOLUTIONS LLC, CALIFORNIA

Free format text: CHANGE OF NAME;ASSIGNOR:THALES-RAYTHEON SYSTEMS COMPANY LLC;REEL/FRAME:039255/0062

Effective date: 20160629

STCF Information on status: patent grant

Free format text: PATENTED CASE

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 4