US20060005447A1 - Processor aided firing of small arms - Google Patents
- Publication number
- US20060005447A1 (application Ser. No. 10/938,321)
- Authority
- US
- United States
- Prior art keywords
- target
- barrel
- weapon
- recited
- processor
- Prior art date
- Legal status (an assumption, not a legal conclusion): Abandoned
Classifications
-
- F—MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
- F41—WEAPONS
- F41G—WEAPON SIGHTS; AIMING
- F41G3/00—Aiming or laying means
- F41G3/14—Indirect aiming means
- F41G3/16—Sighting devices adapted for indirect laying of fire
- F41G3/165—Sighting devices adapted for indirect laying of fire using a TV-monitor
-
- F—MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
- F41—WEAPONS
- F41G—WEAPON SIGHTS; AIMING
- F41G3/00—Aiming or laying means
- F41G3/06—Aiming or laying means with rangefinder
-
- F—MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
- F41—WEAPONS
- F41G—WEAPON SIGHTS; AIMING
- F41G3/00—Aiming or laying means
- F41G3/12—Aiming or laying means with means for compensating for muzzle velocity or powder temperature with means for compensating for gun vibrations
Definitions
- This invention relates to firearms, and in particular to aiming, triggering, and fire control thereof.
- In Boutet et al., gyro-lasers are used to determine the weapon position and to initiate firing commands when the weapon points at a theoretical target center constructed by motion averaging.
- This invention combines the traditional aspects of firearm design with the capabilities of modern microelectronics. Instead of relying on the shooter to pull the trigger at exactly the correct instant to hit a target, a high-speed processor, in conjunction with a video system and motion detectors, accomplishes this function.
- The implications, especially in warfare, are quite profound: with a modest amount of training, any soldier could be converted into a sharpshooter, yielding several benefits: increased lethality, increased survivability, battlefield dominance, and more efficient use of ammunition.
- Another object of the invention is to provide a system and method for reducing reliance on a shooter's ability to hold a gun steady while simultaneously pulling the trigger, by providing an electronic processor and triggering system to launch a projectile, such as a bullet, at the proper time.
- Another object of the invention is to provide a system and method for moving or gyrating a firearm barrel, for capturing a plurality of images of a target area, for processing image data associated with the plurality of images on a frame-by-frame basis, for predicting a position of the barrel and the target and for firing the firearm to hit the target.
- Another object of the invention is to provide a system and method for allowing the weapon or gun to be held by the shooter and to "free float," thereby reducing or eliminating the need for a supporting platform to act as a base to move the weapon or as a reference stage for directional feedback information supplied to a firing control system.
- Another object of the invention is to reduce or eliminate the need for targets to be specific wavelength emitters and to permit targets to be selected from anyplace in the field of view of the imaging system.
- Still another object is to reduce or eliminate the necessity to define a “home” position as is required by some systems of the past and to provide means for defining a home position using at least one motion sensor.
- Still another object is to provide a system and method for enabling target selection by selecting a point or object within an image captured by a digital camera associated with the weapon.
- Still another aspect of the invention is to provide a weapon comprising a firearm having a barrel and a user interface, a barrel oscillator for oscillating the barrel in a predetermined pattern, an image capture device mounted on the firearm for capturing a plurality of image frames of a target and generating image data in response thereto, at least one motion sensor mounted on the firearm for sensing a motion of the barrel and generating motion data in response thereto, and a processor coupled to the user interface, the image capture device and the at least one motion sensor, the processor enabling a user to select a target and in response thereto, causing the image capture device to capture the plurality of images and generate the image data which is used along with the motion data to determine a predicted target location and coverage point where the barrel covers the target upon which the processor may energize the firearm to fire a projectile.
- Yet another aspect of the invention comprises a weapon comprising a firearm comprising a barrel, an imager mounted to the barrel for capturing an image of a target area, a user interface for displaying the image, the user interface comprising a trigger for selecting a target within the image area, and a processor coupled to the user interface and the imager for determining a future target location of the target and for automatically firing the firearm when the barrel is positioned in a firing position such that a projectile discharged from the firearm will hit the target selected by the user.
- Still another aspect of the invention comprises a weapon comprising: a firearm comprising a barrel; a gyrator mounted on the barrel for gyrating the barrel in a consistent motion; an imager mounted to the firearm for capturing a plurality of images of an area; a user interface for displaying at least one of the plurality of images, the user interface comprising a trigger for selecting a target within the at least one of the plurality of images; and a processor coupled to the user interface, the imager and the gyrator, the processor receiving image data corresponding to one or more of the plurality of images captured and causing the firearm to automatically discharge a projectile when the barrel is positioned in a firing position such that the projectile will hit the target selected by the user.
- Yet another aspect of the invention comprises a method for increasing accuracy of hitting a target with a firearm, the method comprising the steps of: capturing a plurality of images of a target area including the target; processing the plurality of images to predict an optimum firing condition; and discharging the firearm when the optimum firing condition is achieved.
- Still another aspect of the invention comprises a firing system that automatically launches a projectile, comprising: a barreled firearm; an electronic digital camera that supplies the electronic processor with rapid, repetitive digital frame information; motion sensors that supply the electronic processor with angular rate information; an electronic display, which is able to display the image data from the digital camera and display cross hairs for target identification; a computer mouse, which interacts with the electronic processor and is able to position the cross hairs to identify the desired target; an electronic processor, which receives data from the electronic digital camera, motion sensors, and computer mouse, transmits images to the electronic display, executes barrel prediction algorithms while analyzing the motion generated by human drift and the mechanically forced motion from the barrel gyrator, and finally transmits a fire signal to the trigger mechanism; a trigger mechanism, which implements the projectile launch signal generated by the electronic processor; and a barrel gyrator, which forces an orbital motion on the firearm, which is analyzed by the electronic processor.
- Still another aspect of the invention comprises an automatic firing system for use with a firearm, comprising: an image capture device mounted on the firearm for capturing a plurality of images of an area in front of a muzzle end of the firearm, and a processor coupled to the image capture device for processing data associated with the plurality of images and for determining an optimum firing time to discharge a bullet from the firearm in order to hit a target selected by a user.
- Yet another aspect of the invention comprises a firing system that automatically launches a projectile, comprising: a barreled firearm; an electronic digital camera that supplies the electronic processor with rapid, repetitive digital frame information; an electronic display, which is able to display the image data from the digital camera and display cross hairs for target identification; a computer mouse, which interacts with the electronic processor and is able to position the cross hairs to identify the desired target; an electronic processor, which receives data from the electronic digital camera and computer mouse, transmits the data to the electronic display, runs barrel prediction algorithms while analyzing the motion generated by human drift, and transmits a fire signal to the trigger mechanism; and a trigger mechanism, which implements projectile launch by a signal from the electronic processor.
- FIG. 1 is a schematic diagram of one embodiment of the invention;
- FIG. 2 is another view of the embodiment shown in FIG. 1 during operation;
- FIG. 3 is a perspective fragmentary view of a barrel gyrator according to one embodiment of the invention;
- FIG. 4 is a system processing schematic according to one embodiment of the invention;
- FIG. 5 is a view of an image of an area that includes a target;
- FIG. 6 is a view of a technique for determining various vectors when an embodiment is used with an angularly moving gun and a moving target;
- FIG. 7 is a view of a plurality of image data pixels associated with a gyration of a barrel;
- FIG. 8 is a view of a technique for determining a drift sinusoid function associated with various image data points;
- FIG. 9 is a perspective view of a plurality of vectors used to determine a target and barrel position as viewed from a camera mounted on the firearm;
- FIG. 10 is a schematic diagram of the system operation in accordance with one embodiment of the invention; and
- FIG. 11 is a schematic diagram illustrating increasing system performance with use of a digital camera, barrel gyrator, and motion sensors in various combinations.
- Referring to FIG. 1, a general schematic diagram of the weapon system 100 in accordance with one embodiment of the invention is shown.
- The system comprises a firearm, gun, or pistol (weapon 10); a digital camera 1 and a barrel gyrator 6 are operatively mounted to the weapon 10.
- The barrel gyrator 6 induces a consistent gyration, wobble or movement of the barrel 10 a in a predictable manner, such that the motion of the barrel 10 a of weapon 10 can be controlled and predicted in the manner described herein.
- The digital camera 1 provides digital image data to frame capture electronics 8 , which generate a plurality of frames of data and subsequently feed the data to processor 4 .
- The weapon 10 further comprises one or a plurality of inertial motion sensors 5 coupled to a computer or processor 4 that sense the position and motion of the weapon 10 , and particularly of the barrel 10 a , and generate corresponding motion data.
- the weapon motion data is fed to processor 4 as illustrated. The use of the motion data will be described later herein.
- the weapon 10 further comprises a firing trigger 9 , which in the preferred embodiment is an electronic trigger.
- an order to fire is delivered from processor 4 which generates the order to fire signal in response to the signals received from inertial motion sensors 5 and frame capture electronics 8 and a user interface 2 which enables a user to initiate and authorize the firing process.
- the processor 4 analyzes input data collected from the user interface 2 , digital camera 1 , and sensors 5 to make a firing decision.
- Image frame data is received by the processor 4 and undergoes a variety of processing operations to identify and track targets.
- The shooter commands the user interface 2 to display a single captured frame 52 of an image on the display screen 3 as a still image. From the user interface controls, the user can then electronically move a cursor 51 using a miniature track ball 53 ( FIG. 5 ) over the image to precisely designate a target 50 . For instance, if the target is a soda can on a log as in FIG. 5 , the user can place a box-shaped cursor 51 around the soda can to designate the target. The center of the box cursor has a cross hair to represent the desired target hit location.
- the invention may be used with a stationary target or a moving target.
- a user may selectively operate the system 100 either with or without gyration.
- In the case of stationary targets, even though the target is fixed, the weapon 10 is usually moving if the user is shooting offhand. As a result, as the gun moves, the target will be located at different locations in successive image frames. Because the camera 1 captures a different image in each video frame, tracking the target reveals the pointing direction of the weapon 10 relative to the target, which can be used to predict future positions of the weapon 10 .
- NGC: Normalized Greyscale Correlation
- ROI: region of interest
- the target's center location in the new frame is calculated and recorded.
- When conventional pyramiding NGC is used instead of the basic form of NGC, the search time is reduced through the use of a hierarchical searching scheme that is known in the prior art.
- the performance of Pyramiding NGC is such that the target location can be found in each frame in real time, even when operating at frame rates of 240 frames per second and using an Intel® Pentium 4 class processor, available from Intel Corporation.
- These and other prior art image processing methods can identify and track the location of the target in each video frame and may be used by the invention. This processing can also include comparison of past and present frames that are acquired.
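As an illustration of the basic (non-pyramided) NGC step described above, the following sketch scores a candidate patch against the target template and exhaustively searches a frame. This is a simplified stand-in, not the patent's implementation: the function names are ours, and a real system would restrict the search to an ROI and use the coarse-to-fine pyramid to meet frame-rate budgets.

```python
import numpy as np

def ngc_score(patch, template):
    """Normalized greyscale correlation of two equal-sized arrays (range -1..1)."""
    p = patch - patch.mean()
    t = template - template.mean()
    denom = np.sqrt((p * p).sum() * (t * t).sum())
    return float((p * t).sum() / denom) if denom else 0.0

def find_target(frame, template):
    """Exhaustive NGC search; pyramiding would run this coarse-to-fine instead."""
    th, tw = template.shape
    best, best_xy = -2.0, (0, 0)
    for y in range(frame.shape[0] - th + 1):
        for x in range(frame.shape[1] - tw + 1):
            s = ngc_score(frame[y:y + th, x:x + tw], template)
            if s > best:
                best, best_xy = s, (x, y)
    return best_xy, best
```

A perfect match scores 1.0; the (x, y) location of the best score is the target's center offset recorded for that frame.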
- Rate sensors 5 are used to provide high bandwidth tracking of the gun that complements the video tracking that is performed. This allows the system 100 to reduce the processing demands needed for gun and target tracking, provides more effective tracking of moving targets, and reduces the amount of forward looking prediction needed of the gun's pointing direction. For instance, in the case of stationary targets, the rate sensors 5 can be used to provide gun pointing direction in between image frame times, thus allowing for a slower video frame rate and gun pointing direction predictions that only need to be a few milliseconds in the future.
- the video tracking is only needed to update the position of the target relative to the gun every 33 to 1000 milliseconds so the random walk error of the rate sensors doesn't grow too large.
- Without rate sensors, frame rates of 120 frames per second and higher are typical, and predictions of gun pointing direction must be made further into the future to account for image acquisition and processing delays before the X,Y target location information is available for calculation.
- the ability to predict the future positions of the weapon 10 in the invention is important due to ignition and bore time delays that occur after the decision to fire has been made.
- To hit a stationary target, the gun must be pointing at it at the time the bullet exits the muzzle or end of the weapon 10 (ignoring gravity and other environmental effects). If the gun is in motion and changing its pointing direction, it is not usually sufficient to make a firing decision based on current pointing data, because the gun will have changed its pointing direction between the trigger and muzzle exit events. The time between these events can be several milliseconds. As data is being acquired and processed, the invention continuously predicts the weapon's 10 pointing direction at the moment muzzle exit would occur if a round were fired at that instant.
- Forward looking predictions of weapon position and target location are made in a variety of ways.
- Two effective techniques used in the invention are a rate change prediction technique that uses the rate of change calculated from the most recent tracking data points and a curve fitting technique that utilizes several image data points from past and current frames of data tracking information. These techniques are described later herein.
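The two prediction techniques named above can be sketched roughly as follows. These are minimal stand-ins under our own naming, not the patent's exact algorithms: a rate-of-change extrapolation from the two most recent samples, and a least-squares polynomial fit over several tracking points.

```python
import numpy as np

def predict_rate_change(angles, dt, lead_time):
    """Extrapolate from the rate of change of the two most recent samples."""
    rate = (angles[-1] - angles[-2]) / dt
    return angles[-1] + rate * lead_time

def predict_curve_fit(times, angles, lead_time, order=2):
    """Fit a low-order polynomial to recent tracking points, evaluate ahead."""
    coeffs = np.polyfit(times, angles, order)
    return float(np.polyval(coeffs, times[-1] + lead_time))
```

The curve fit tolerates noisy samples better; the rate-change form is cheaper and suits short lead times between trigger and muzzle exit.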
- the user interface 2 in the preferred embodiment includes a small LCD or heads up display 3 to display targeting and system information.
- the user interface 2 or display 3 may be pivoted or rotated in one of a plurality of directions.
- the display 3 can be incorporated directly into the sight scope (not shown) for aiming the weapon 10 .
- the user interface 2 can have any combination of control buttons, input controls (including using the conventional pull trigger) to aid the user in selecting targets 25 , authorizing firing 26 , and configuring parameters within the system 27 .
- a miniature trackball 53 ( FIG. 5 ) is used to position the cursor.
- Rate sensors 5 on the weapon 10 also allow the cursor to be controlled by movement of the gun itself.
- the user interface 2 button and input controls are linked to the image display 3 so commands and system 100 configuration settings can be inputted by the user.
- the user can authorize the system 100 to fire by pressing a button 57 on the user interface 2 or pulling the gun's modified mechanical trigger 10 b ( FIG. 1 ) that is configured to only authorize a firing event.
- the user interface 2 also allows threshold accuracy settings 28 ( FIG. 4 ) to be inputted so the user can specify the Minute of Angle (MOA) accuracy desired for the shot.
- One MOA corresponds to approximately 1 inch of target size at a range distance of 100 yards.
- the threshold accuracy settings 28 are used by the firing algorithm described later herein to determine if the predicted pointing direction of the weapon 10 is over the target 50 at any moment in time.
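The MOA threshold comparison can be illustrated with a small sketch; the conversion and function names are our assumptions, not taken from the patent, and a real implementation would evaluate the predicted miss at the measured range.

```python
import math

def moa_to_rad(moa):
    """Convert minutes of angle to radians (1 MOA = 1/60 degree)."""
    return math.radians(moa / 60.0)

def within_accuracy(predicted_error_rad, threshold_moa):
    """Fire only when the predicted angular miss fits the user's MOA setting."""
    return abs(predicted_error_rad) <= moa_to_rad(threshold_moa)
```

At 100 yards (3600 inches), 1 MOA subtends about 1.047 inches, which shooters round to 1 inch.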
- the digital camera 1 is mounted on the stock 10 a by conventional means, such as a mechanical mount.
- the digital camera 1 is aligned generally on top of and over the barrel 10 a as illustrated in FIG. 2 so that images of an area including a target, such as target 50 in FIG. 5 , can be captured.
- digital camera 1 may also capture a plurality of images of the target 50 as the gun barrel 10 a is gyrated by gyrator 6 as described herein.
- the camera 1 is used to electronically photograph the view looking down the barrel and toward the target 50 .
- a charge-coupled device (CCD) video camera may be used.
- the signal from the digital camera 1 is fed to an image capture or frame capture electronics 8 mentioned earlier herein.
- the frame capture electronics 8 are provided on a circuit board (not shown) on which the processor 4 may also be mounted, and the board may be stored in or mounted on the rifle or mounted in a separate container such as a processor container 28 ( FIG. 2 ) which is worn by the user or affixed to a user's garment, such as a belt.
- the processor container 28 may further comprise a power source, such as a battery, for energizing or providing power to the system 100 .
- the image or frame capture electronics 8 record one or a plurality of frames of data, each consisting of thousands of pixels, at rates of hundreds of frames per second.
- the data is fed to the processor 4 which, in the preferred embodiment of the invention, uses a relatively powerful processor system, such as a 2.4 GHz Intel® Pentium® 4 computer with 512 MB of memory.
- the image capture electronics 8 can be a Matrox Meteor II frame grabber board, available from Matrox Electronic Systems Ltd. of 1055 St. Regis Blvd., Dorval, QC H9P 2T4, Canada, installed into, for example, one of the computer's PCI slots.
- the board has a bus-mastering mode that can perform data transfers directly into host computer memory without requiring continuous host intervention.
- the processor 4 comprises suitable memory (not shown) for receiving each frame of data for subsequent processing by the processor 4 .
- System 100 comprises the image display 3 ( FIGS. 1 and 2 ) that displays the image captured by the camera 1 and which is transmitted via the processor 4 .
- Various types of displays can be used for 3 ; however the leading choices are the Kaiser Electro-Optics, Inc.'s ProviewTM SL35 monocular display, Kopin CyberDisplayTM, available from Kopin Corporation 125 North Drive, Westboro, Mass. 01581, and the Sharp Microelectronics 3.5-inch Transflective TFT-LCD display, available from Sharp Microelectronics.
- the barrel gyrator 6 causes the muzzle or end of barrel 10 a to orbit in an elliptical fashion.
- the miniaturized track ball mouse 53 ( FIG. 5 ) is used as part of the user interface 2 to allow the shooter to select the target from a still image presented on the video display 3 .
- Inertial motion sensors 5 are mounted to the weapon 10 and provide angular rate data to the processor 4 as the weapon 10 moves about.
- The purpose of the gyrator 6 is to cause the aim point of the weapon 10 to oscillate or gyrate in a predictable manner, sweeping out more target area in a given time than when the weapon 10 is held stationary.
- When a gyrating motion is added to the slow sweeping motion caused by the shooter moving the gun, for example to follow the target 50 , a large swath of the target area is covered in one stroke. If a motion-inducing mechanism such as the gyrator 6 were not present, the shooter would have to manually align the barrel with the target 50 .
- a disadvantage is that depending on the bullet launch tolerance or desired accuracy limits (referred to in block 48 in FIG. 4 ), this could require an indefinite length of time for the shooter to accurately align the barrel with the target 50 and it may never occur at all during a manual operation.
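A simplified kinematic picture of the combined motion can be sketched as a circular orbit superimposed on a constant drift. This is an illustrative assumption (the patent notes the actual orbit is elliptical and the drift is the shooter's natural wander), and the function name is ours.

```python
import math

def muzzle_aim_point(t, drift_rate, orbit_amp, orbit_hz):
    """Aim point at time t: slow shooter drift plus the gyrator's fast orbit."""
    phase = 2.0 * math.pi * orbit_hz * t
    x = drift_rate * t + orbit_amp * math.cos(phase)
    y = orbit_amp * math.sin(phase)
    return x, y
```

Because the orbit frequency is much higher than the drift rate, each orbit sweeps the aim point across a band of the target area rather than a single line.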
- a circular or elliptical motion of the barrel 10 a is achieved by altering the center of mass of the weapon 10 .
- this is achieved by providing a gyrator 6 having a weight 19 secured thereto with a screw 21 .
- the weight 19 is secured to a cylinder 16 that is rotatably mounted on bearings 22 which are positioned on the barrel 10 a .
- the cylinder 16 is rotatably driven by a motor 17 that is coupled to the ball bearing 22 via a drive linkage, including a pulley 20 and belt or O-ring 18 .
- Ideally, the barrel 10 a would rotate in a circular fashion, but it has been found that the barrel 10 a actually orbits in an elliptical fashion because of the differences in the moments of inertia of the weapon along different axes, such as axis A illustrated in FIGS. 2 and 3 .
- the weight 19 is mounted to the cylinder 16 , which is aluminum in the embodiment being described.
- the cylinder 16 is mounted on the pair of low profile ball bearings 22 a at the muzzle end of the weapon barrel 10 a .
- the cylinder 16 comprises a groove 29 a that accommodates and receives the belt or O-ring 18 which is attached to the pulley 20 on the small electric motor 17 .
- the cylinder 16 further comprises a pair of diametrically opposed tapped holes used to attach the weight 19 , such as a lead weight, with the screw 21 as shown. It should be understood that the weight 19 is removable by unscrewing screw 21 so that different mass weights can be fastened to the cylinder 16 , which allows the amplitude of the barrel's 10 a circular motion to be adjusted.
- the voltage applied to the motor 17 controls the angular frequency of gyration which is independent of the angle of barrel gyration.
- the angle of gyration is related to the ratio of the orbital radius to an overall length of the gun or weapon 10 , as well as the weapon and gyrator masses.
- the voltage applied to the motor 17 controls the angular frequency, which is in the 300-800 RPM range in the embodiment being described.
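The stated relation between orbital radius, gun length, and gyration angle can be sketched as a simple geometric approximation. This ignores the weapon and gyrator mass terms the text mentions, so treat it as illustrative only; the function names are ours.

```python
import math

def gyration_half_angle(orbit_radius_m, gun_length_m):
    """Approximate muzzle sweep half-angle from the orbit-radius/length ratio."""
    return math.atan2(orbit_radius_m, gun_length_m)

def orbit_period_s(rpm):
    """One muzzle orbit per motor revolution: period in seconds."""
    return 60.0 / rpm
```

For example, a 1 cm orbit radius on a 1 m weapon gives roughly a 10 milliradian half-angle, and 600 RPM gives one orbit every 0.1 s.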
- the shooter points the weapon 10 at the target 50 ( FIG. 5 ) and a button, such as trigger button 57 on user interface 2 is pressed to freeze the image captured by digital camera 1 ( FIGS. 1 and 2 ).
- the captured image is displayed on the image display 3 .
- the shooter only needs to provide an approximate aim toward the target 50 , such that the camera 1 can capture the image of the target 50 .
- the shooter views the frozen image on the display 3 and positions cross-hairs 51 which are controlled by the miniature mouse 53 ( FIG. 5 ) on the user-interface 2 or display 3 over the desired target 50 .
- the weapon 10 is still pointed approximately at the target 50 .
- the user then activates an auto-fire mode by pulling a pseudo trigger 10 b ( FIG. 1 ).
- the processing computer supplies digital I/O signals 32 for system controls. Inputs from the user interface are received by the processing computer 4 to perform such actions as freezing a video frame, selecting targets 25 , and authorizing firing 26 . To fire, an authorization signal is given to the firing trigger 9 in the form of a binary representation of the time interval between the next camera shutter, available as a digital output from the digital camera 1 , and bullet launch time. Numerous digital implementations of an interval timer can be realized by those conversant with the art. The only requirement is that the clock driving the interval timer be sufficiently fast that a single tick of the clock is negligible in comparison to the overall system timing. A clock frequency of 10 MHz is sufficient to fulfill this requirement.
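The tick-count encoding described above might be computed as follows; the register width and helper names are our assumptions, not taken from the patent.

```python
def delay_to_ticks(delay_s, clock_hz=10_000_000):
    """Binary tick count for the firing interval timer (100 ns/tick at 10 MHz)."""
    return round(delay_s * clock_hz)

def ticks_to_binary(ticks, width=24):
    """Fixed-width binary representation handed to the trigger electronics."""
    return format(ticks, f'0{width}b')
```

At 10 MHz, a 5 ms shutter-to-launch interval becomes 50,000 ticks, and one tick (100 ns) is indeed negligible against millisecond-scale system timing.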
- the pseudo trigger 10 b may comprise an electronic as opposed to a traditional mechanical trigger involving a moving firing pin.
- the electronic firing trigger 9 may comprise the same mechanism resident on the Remington® Model 700 Etronix rifle, available from Remington Arms Company, Inc., which is advertised as having an ignition time of 0.27 microseconds.
- the pseudo trigger 10 b is coupled to the processor 4 which, in turn, generates a signal for energizing the gyrator 6 and, more particularly, motor 17 .
- the energized motor 17 rotatably drives pulley 20 and O-ring 18 to rotatably drive the cylinder 16 to wobble the barrel 10 a .
- this causes the aiming direction of the muzzle end of barrel 10 a to sweep out an area encompassing the target 50 at the target distance as the shooter slowly sweeps the weapon or gun 10 over the target 50 .
- the speed of the barrel 10 a rotation is much faster than the motion caused by the shooter's natural drift, so at some point of an over-target sweep, the barrel 10 a points directly or nearly directly at the target 50 .
- successive image frames are captured by digital camera 1 and processed in accordance with the algorithm described herein. For each frame captured, the target 50 is located within it, and a series of previous frames is analyzed to predict the anticipated time at which the barrel 10 a will point directly at the target 50 .
- the shot is automatically discharged by means of the processor 4 energizing firing trigger 9 ( FIG. 1 ).
- the projectile or bullet is launched and hits the target 50 in response to the electronic firing of the firing mechanism 9 .
- a plurality of rate sensors 5 are mounted on weapon 10 ( FIG. 1 ). Solid state rate sensor technology is used in the preferred embodiment of the invention to track the precise angular motion of the gun barrel 10 a . The angular rate of change around the X, Y, and Z axes is conveyed to the processor 4 , as shown in FIG. 1 .
- rate sensors 5 allow for frequent pointing direction sampling with minimal processing demands.
- rate sensors 5 offer a straightforward method of separating the angular motion of the gun/video system from the actual motion of the target 50 , such as the movement of the tank shown in FIG. 6 . This is particularly important for firing accuracy at moving targets, since target trajectory and ballistic "lead" need to be calculated for a correct ballistic solution.
- Rate sensors 5 are mounted on weapon 10 and coupled to processor 4 .
- the rate sensors 5 are small, low cost “gyro on a chip” devices available in a coin size form factor that can be packaged to meet military specification environmental requirements.
- Example rate sensors that can be used in the invention are the QRS11 series manufactured by Systron Donner Inertial Division of BEI Technologies, Inc. By integrating the rate sensors signal over time, a precise angle of barrel 10 a can be calculated. By incorporating two rate sensors into the weapon 10 (one for azimuth angle measurement and the other for elevation) a precise measure of pointing direction can be achieved. If desired, a third rate sensor (not shown) can also be used to monitor rotation about the gun barrel axis.
- the rate sensors 5 have a random walk noise that can cause the pointing direction measurement to drift over time. If rate sensors 5 alone are used for tracking the gun's pointing direction 39 , the random walk error becomes significant and impacts the accuracy of the ballistic solution after a few seconds. To counter this effect and achieve high accuracy pointing direction information, the invention in one preferred embodiment frequently updates the pointing direction reference from the rate sensors 5 with pointing direction information determined by processing camera video frames 40 of data. The target location is found in each video frame where the field of view (FOV) of the camera 1 subtends the target. This location is then used to update the angular position of the weapon 10 relative to the target. This updating process prevents the gun pointing direction error with respect to the target from growing too large over time.
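The dead-reckoning-plus-video-fix scheme just described can be sketched as a small tracker. The class and method names are our own, and this single-axis model stands in for the two- or three-axis tracking the patent describes.

```python
class PointingTracker:
    """Integrates rate-sensor samples; periodic video fixes cancel random walk."""

    def __init__(self, angle_rad=0.0):
        self.angle = angle_rad

    def rate_update(self, rate_rad_s, dt_s):
        # Dead reckoning: accumulate angular rate over the sample interval.
        self.angle += rate_rad_s * dt_s
        return self.angle

    def video_fix(self, angle_from_frame_rad):
        # Re-anchor to the target location found in the latest video frame,
        # discarding whatever random-walk error has accumulated.
        self.angle = angle_from_frame_rad
        return self.angle
```

Between video frames the tracker runs on cheap rate updates; each frame-derived fix resets the accumulated drift, so the video only needs to arrive every few hundred milliseconds.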
- Rate sensors 5 lower the video processing demands of the invention by allowing the system to operate at much lower frame rates compared to using video alone to determine muzzle pointing direction.
- Conventional video frame rates of 30 frames per second or lower can typically be used with rate sensors as long as the target trajectory can be resolved for the given frame rate.
- Faster moving targets demand higher frame rates in order to adequately sample the target trajectory.
- the rate sensors 5 incorporated into the invention allow the frame rate requirements to depend mostly on the motion of the target as opposed to that of the gun.
- the invention's video tracking technology can update the target location relative to the local image coordinates on every frame time using target pattern matching algorithms 38 , 40 applied to the video frames.
- FIG. 6 shows a graphical example of how target motion relative to a two dimensional inertial reference frame can be calculated in a moving gun and moving target scenario.
- the target position can be located by image processing techniques relative to the local image coordinate system.
- the rate sensor 5 data can be used to track the motion of this image coordinate system relative to an inertial coordinate reference. Since the random walk error of the rate sensor 5 is small in the frame times involved, rate sensor information spanning several frame times can be used for this tracking.
- the coordinates of the target relative to the inertial frame of reference can be calculated.
- When this method is used in combination with range data to the target, it allows for a relatively simple way to calculate the velocity and acceleration vectors of the target moving relative to the inertial reference frame.
- distance information is derived from target tracking within each video frame and time information is derived from the inverse of the video frame rate.
- the target's motion vectors can be used to predict the target's new location in between frame times, thus allowing for targeting of high speed targets that are moving at significant displacement rates.
- a weapon's 10 lead can be calculated by determining the time of flight to the target based on range and published bullet velocity and then multiplying this time by the target velocity component perpendicular to the line of sight.
- a decision to fire 41 can be made when the pointing direction of the gun is within the ballistic solution necessary to hit the moving target at its predicted future position.
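The lead calculation and fire decision described above reduce to a few lines. This is an illustrative sketch under the stated straight-line assumptions; the function names, units, and tolerance value are not from the patent.

```python
def lead_m(range_m, bullet_mps, target_vperp_mps):
    """Lead distance: time of flight times the target velocity component
    perpendicular to the line of sight."""
    tof = range_m / bullet_mps          # time of flight, seconds
    return tof * target_vperp_mps       # lead at the target plane, meters

def within_solution(aim_offset_m, required_lead_m, tol_m=0.15):
    """Fire decision: the barrel's offset matches the required lead within
    an assumed tolerance."""
    return abs(aim_offset_m - required_lead_m) <= tol_m
```

For example, a 300 m target crossing at 3 m/s with a 900 m/s bullet needs a 1 m lead.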
- This general technique for handling the motion of the target relative to the gun platform does not rely on processing fixed point background references and as a result, the technique can work on both uniform and non-uniform background scenes.
- where a weapon's 10 motion is expected to have translation velocity components relative to an inertial frame of reference (e.g. a gun in a moving vehicle), accelerometers 5 can be incorporated into the invention to detect changes in translation motion.
- Translational information can be incorporated into the calculated ballistic solution to further improve the overall firing accuracy of the system.
- FIG. 9 illustrates a video frame 83 captured by camera 1 .
- a relative angular position of the target 81 with respect to the gun's 10 pointing direction 80 at the time of frame exposure can be determined, since the location vector 82 , which extends from the target's 50 original position 87 to the actual location 85 where the weapon 10 is pointing, is known.
- the gun's pointing direction 80 is known relative to the pointing direction of the digital camera 1 and is shown in FIG. 9 as being the same vector that ends at the center of the field of view of the image frame.
- Past and present location information can be used to help predict the future location of the target 50 and this prediction is used in the ballistic solution 43 which determines if the processor 4 can fire upon the target at a particular moment in time 41 .
- a variety of different image processing algorithms can be incorporated into the invention for identifying and tracking targets in the video scene. These include the use of raster, vector, and temporal based imaging algorithms in any combination.
- An algorithm incorporated into one embodiment of the invention for use in identifying and locating non-changing, fixed targets or reference points within an image frame is a hierarchical search normalized greyscale correlation, also referred to as pyramiding normalized greyscale correlation.
- This technique is known and is taught, for example, in Matrox Imaging, ActiveMIL version 7 User Guide, Jun. 6, 2002, Matrox Electronic Systems Ltd., pg. 183, which is incorporated herein by reference and made a part hereof.
- This highly efficient and precise raster based algorithm is effective for locating stationary targets or reference points in situations where there is no relative rotating or scaling occurring within the video scene.
- this technique scans a reference region of interest (ROI) over the entire image, outputting the correlation match at each index step across the image.
- the index at which the highest correlation exists above a certain threshold is the location of the target in the acquired frame. If all correlation results are below the threshold set for the image, then the frame is deemed to not contain the target.
- the correlation coefficient is calculated for each location (x,y) in the image as

\gamma(x,y) = \frac{\sum_{s,t}\,[f(s,t)-\bar{f}\,][w(s-x,\,t-y)-\bar{w}\,]}{\left\{\sum_{s,t}\,[f(s,t)-\bar{f}\,]^{2}\;\sum_{s,t}\,[w(s-x,\,t-y)-\bar{w}\,]^{2}\right\}^{1/2}}

where \bar{w} is the average value of the pixels in the ROI defined by w (this only needs to be computed once), f is the region within the frame image coinciding with the current location of w, \bar{f} is the average value within f, and the summations over the indexes s and t run only over the coordinates that are common to both f and w.
- a pyramid scheme is used where a small, lower resolution image of the frame is initially processed. Then, this correlation information is used to limit the pattern search to selected regions of the next higher resolution version of the frame image. This process repeats itself so that the best correlation coefficient is quickly found for the ROI model that is being matched in the image. If the correlation coefficient is below a threshold value, then the image is marked as not having the target contained within it.
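A two-level version of this pyramid NGC search can be sketched as below. This is an illustrative reconstruction, not the patent's code: it matches at half resolution by simple decimation, then refines only a small neighborhood at full resolution, and the threshold value is an assumption.

```python
import numpy as np

def ncc_at(frame, w, x, y):
    """Normalized greyscale correlation of ROI model w at offset (x, y)."""
    f = frame[y:y + w.shape[0], x:x + w.shape[1]].astype(float)
    fm, wm = f - f.mean(), w - w.mean()
    denom = np.sqrt((fm * fm).sum() * (wm * wm).sum())
    return (fm * wm).sum() / denom if denom else 0.0

def search(frame, w, xs, ys):
    """Exhaustive NCC search over the given offset ranges."""
    best = (-1.0, None)
    for y in ys:
        for x in xs:
            r = ncc_at(frame, w, x, y)
            if r > best[0]:
                best = (r, (x, y))
    return best

def pyramid_ngc(frame, w, threshold=0.8):
    # coarse pass on 2x-decimated copies of the frame and the ROI model
    fr2, w2 = frame[::2, ::2], w[::2, ::2]
    _, (cx, cy) = search(fr2, w2,
                         range(fr2.shape[1] - w2.shape[1] + 1),
                         range(fr2.shape[0] - w2.shape[0] + 1))
    # fine pass restricted to a neighborhood of the coarse hit
    xs = range(max(0, 2 * cx - 2), min(frame.shape[1] - w.shape[1], 2 * cx + 2) + 1)
    ys = range(max(0, 2 * cy - 2), min(frame.shape[0] - w.shape[0], 2 * cy + 2) + 1)
    r, loc = search(frame, w, xs, ys)
    return loc if r >= threshold else None  # below threshold: no target in frame
```

Restricting the fine pass to a few candidate offsets is what produces the millisecond-scale timings quoted below for modest frame and ROI sizes.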
- This pyramid scheme allows NGC processing to only take 3-4 milliseconds using a ROI size of 100 ⁇ 100 pixels and an image size of 640 ⁇ 480.
- the NGC time is about half this (~1.5 to 2 ms).
- NGC is a very powerful pixel level pattern matching technique that can discern the location of virtually any static ROI model from an image frame. NGC proves to be a well performing algorithm to meet this criterion. In future phases of this development, NGC in combination with other image processing algorithms (e.g. geometric pattern matching) can be researched to extend the system beyond static targets to the moving target domain.
- vector based algorithms can be more flexible than raster based algorithms.
- a variety of image processing technologies currently known in the art can be incorporated into preferred embodiments of the invention depending on the particular targeting demands. These technologies can involve identifying and tracking spatial and/or temporal properties of the target within the video frames.
- An example of one such method is to use geometric pattern matching algorithms to match edges of a target against a database of target templates.
- This “model” based approach can be used to track the target (such as the tank shown in FIG. 6 ) as it moves across a scene.
- Geometric techniques tend to be more robust in handling the scaling and rotation effects of moving platforms, but generally demand more CPU processing compared to raster based correlation techniques. It should be understood that the incorporation of rate sensors into the invention reduces the overall processing demands, so geometric pattern matching and other powerful image processing techniques can be used to accurately track the target given a variety of scene conditions.
- the analysis is considered in two fashions: passive mode, when the gyrator 6 is off, and active mode, when the gyrator 6 is on. Each case produces a characteristic sequence of barrel pointing.
- the barrel wander from human drift behaves much like Brownian motion on a long time scale of the order of seconds.
- the motion is, for all intents and purposes, random and therefore non-predictable.
- the location difference between successive data points is not that great and lends itself to predictive techniques.
- One firing method is to wait until the barrel 10 a is pointed at the target, within a certain tolerance, and then launch the bullet.
- This can be viewed as a non-forecasting approach and gives performance within a few MOA.
- the preferred rate prediction technique uses the rate of change of the X and Y components of motion to predict future location coordinates.
- the two main concepts of this approach are 1) actual X, Y location prediction based on the rate of change, and 2) the values of the first and second derivatives used as constraints when near the target.
- the complete calculation uses data from the 3 previous frames to make a prediction on whether to fire the rifle 1.5 to 2.5 frames in the future.
- the X, Y coordinate location is defined in the image plane of the target where 0, 0 is the target center.
- vectors 60 and 61 are determined. These vectors 60 and 61 correspond to the location of the target relative to the center of the image frame. The pointing direction of gun/camera 1 relative to the rate sensor's 5 inertial frame of reference 67 is then determined to provide a calculation of vectors 62 and 63 in FIG. 6 . Next, vectors 64 and 65 are calculated, indicating the position of the target relative to the inertial frame of reference 67 . Processor 4 then determines a resultant displacement vector 66 of the target during time delta t relative to the inertial frame 67 . The velocity vector 66 for the target can then be calculated from this information.
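The FIG. 6 decomposition above can be sketched numerically. This is an illustrative sketch, not the patent's implementation: the degrees-per-pixel scale factor and the small-angle addition of image offset to gun direction are assumptions.

```python
import numpy as np

DEG_PER_PIXEL = 0.01  # assumed camera angular scale factor

def target_inertial_deg(gun_dir_deg, target_px_offset):
    """Image-frame target offset (vectors 60/61) plus the gun's inertial
    pointing direction (vectors 62/63) gives the target's inertial angular
    position (vectors 64/65), under a small-angle approximation."""
    return np.asarray(gun_dir_deg, float) + DEG_PER_PIXEL * np.asarray(target_px_offset, float)

def target_velocity_dps(pos0_deg, pos1_deg, dt_s):
    """Displacement (vector 66) over delta t gives the angular velocity."""
    return (np.asarray(pos1_deg, float) - np.asarray(pos0_deg, float)) / dt_s
```

Differencing two such inertial positions one frame time apart yields the target's angular velocity independent of gun motion.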
- first derivatives are calculated for each X and Y component.
- a similar formula is used to calculate the Y component.
- a second derivative term can also be added to this prediction, but requires more investigation on whether it is actually helpful since it is based on slightly older frame data.
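The rate prediction above can be sketched as follows. The exact formulas here are illustrative: coordinates are pixels in the target image plane with (0, 0) at the target center, and only the first-derivative term is applied, as the text suggests.

```python
def predict(p3, p2, p1, lead_frames=2.0):
    """p3, p2, p1: (x, y) aim points from 3 successive frames, oldest first.
    Returns the predicted aim point 1.5-2.5 frame times ahead."""
    vx, vy = p1[0] - p2[0], p1[1] - p2[1]  # first derivatives, px/frame
    # a second-derivative term could be added here, but as noted above it is
    # based on slightly older frame data and needs more investigation
    return (p1[0] + vx * lead_frames, p1[1] + vy * lead_frames)
```

An aim point closing on the target at 2 px/frame from (2, 0) is predicted to cross to (-2, 0) two frames later.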
- As the pointing direction of the rifle 10 changes, it sweeps out a path in the image plane of the target. This path will approach the target center and then move away as it sweeps by. Because there is a variance between predicted motion and actual motion, it is best to make firing decisions using data that is representative of the motion moving toward the target and not away from it. Such a decision can be made by using constraint conditions based upon the pointing angle distance from the target and the rates of change of this distance.
- the distance from the target center defined in the plane of the target is calculated from the Pythagorean Theorem, using the X and Y components. After calculating the distance from the target in each of the 3 previous frames, the first and second derivatives of the distance from the target center can be obtained for the most recent frame in the set of three. The distance from target and the first and second derivatives of this distance allow for the setup of powerful constraint conditions on when to fire the rifle.
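The constraint conditions above can be sketched as follows. The threshold value and the exact closing-motion test are illustrative assumptions, not the patent's parameters.

```python
import math

def fire_constraints(pts, dist_tol=5.0):
    """pts: 3 (x, y) aim points, oldest first, with the target center at (0, 0).
    Fire only while close enough and still closing on the target."""
    d = [math.hypot(x, y) for x, y in pts]   # Pythagorean distance per frame
    d_dot = d[2] - d[1]                      # first derivative of distance
    d_ddot = d[2] - 2 * d[1] + d[0]          # second derivative of distance
    closing = d_dot < 0 or (d_dot == 0 and d_ddot <= 0)
    return d[2] <= dist_tol and closing
```

The first derivative rejects data taken while the barrel sweeps away from the target; the second derivative breaks ties at the turnaround point.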
- the barrel 10 a motion is the combination of the Brownian-like motion arising from human drift and the elliptical motion from the gyrator and is illustrated in FIG. 7 .
- Data points on this figure are made by taking the recent frames of image data and finding, by pattern matching or other means, the point where the previously selected target is located. Points at this stage are given in pixel units as X and Y coordinates in a Cartesian coordinate plane.
- the unequal lengths of the major and minor axes arise from the asymmetrical moment of inertia of the weapon.
- the vertical and horizontal components each trace out a sinusoid that wanders about due to human drift.
- the polynomial factor estimates the motion caused by human drift, and the sine function models the gyration motion.
- the X (horizontal) raw data from FIG. 7 is shown in FIG. 8 .
- the following steps are taken. The positional data is decomposed into X (horizontal) and Y (vertical) functions of time. A cubic spline 71 ( FIG. 8 ) is fit to each component, from which the extrema (min and max) and inflection points of the sinusoid are extracted by processor 4 . The min and max difference is used to seed the amplitude of the sine coefficient, B 0 . The time difference between successive inflection points (or extrema) is then used to seed the angular frequency coefficient, B 1 . The last inflection point is used to seed the phase, B 2 , or the phase is left to float freely. A polynomial fit is then performed on the set of inflection points.
- the coefficients of An and Bn are determined by processor 4 and thus the expected barrel 10 a position can be evaluated by the drifting sinusoid, f(t) 72 , in the X coordinate.
- the Y coordinate is fit in a similar manner.
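The seeding steps for the drifting-sinusoid model f(t) = poly(t) + B0·sin(B1·t + B2) can be sketched as below. This is a simplified illustration, not the patent's code: it seeds the amplitude from the min/max spread and the angular frequency from the spacing of successive extrema (which are about half a period apart), leaving the full least-squares refinement aside.

```python
import numpy as np

def seed_sinusoid(t, x):
    """Seed B0 (amplitude) and B1 (angular frequency) from sampled data."""
    b0 = (x.max() - x.min()) / 2.0                      # amplitude seed
    ext = [i for i in range(1, len(x) - 1)
           if (x[i] - x[i - 1]) * (x[i + 1] - x[i]) < 0]  # local extrema indices
    half_period = float(np.mean(np.diff(t[ext])))       # extrema are ~T/2 apart
    b1 = np.pi / half_period                            # angular frequency seed
    return b0, b1
```

On a slowly drifting sinusoid the seeds land close to the true amplitude and frequency, giving the subsequent fit a good starting point.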
- Target Identification algorithms operate on each frame of video data to identify and track the target in real time.
- Rate sensors 5 are used to separate out gun motion from target motion, so the anticipated trajectory of the target can be calculated by processor 4 .
- This processing is done by the processor or ballistic computer 4 . The location of the image in the X-Y plane of the field of view can be readily obtained by this method.
- the invention can calculate the “lead” necessary for hitting a moving target by determining the target's trajectory using the image based tracking techniques described earlier relative to FIG. 6 and by determining the target's range.
- Range determination 36 can be done using a variety of techniques and/or sensors to determine a range or distance to the target, such as target 50 in FIG. 9 .
- a range sensor 34 ( FIG. 4 ) may also be optionally incorporated directly into the invention to determine range.
- Its output is fed into the ballistic computer processor 4 and used to calculate the ballistic solution.
- the ballistic solution 43 is used, along with other factors such as windage, gravity drop, ammunition, gun type, environmental effects by processor 4 , to determine the angular position the gun needs to be in before the processor 4 energizes the firing trigger 9 to fire the weapon 10 .
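Two of the correction terms named above, gravity drop and windage, can be sketched with first-order physics. These are illustrative formulas only; a real ballistic solution uses drag tables and environmental data rather than the constant-velocity and linear-drift assumptions made here.

```python
G = 9.81  # gravitational acceleration, m/s^2

def gravity_drop_m(range_m, bullet_mps):
    """Vertical drop over the flight, assuming constant bullet speed."""
    tof = range_m / bullet_mps
    return 0.5 * G * tof * tof

def wind_drift_m(range_m, bullet_mps, crosswind_mps):
    """Crude first-order crosswind deflection (real solutions use drag data)."""
    return crosswind_mps * (range_m / bullet_mps)
```

At 300 m with a 900 m/s bullet the drop is roughly half a meter, which the processor folds into the required barrel elevation.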
- the invention can use a variety of image processing techniques to track a moving target in successive video frames.
- Geometric Pattern matching is one example of such a technique that can account for both scaling and rotation of the moving target.
- the choice of lens and camera 1 can depend on a variety of factors, including the particular tracking requirements of the target scene. For example, for fast moving targets, an imaging system with larger aperture optics, a wider field of view lens, high frame rate video, and a higher resolution imager will often be necessary to simultaneously lead, track, and resolve the target.
- the video camera 1 used in the invention has an electronic shutter capability to adjust the exposure time of each video frame. This is particularly important for both static and moving targets, since in either case the gun and/or the target can be moving. A short exposure time is often needed to avoid blurring of the target image in the video frame.
- Typical shutter settings of 1/500 to 1/5000 of a second are common depending on the motion, lighting, and the f# of the lens.
- successive image frames are captured by camera 1 and processed by frame capture electronics 8 .
- the target is located, and a series of previous frames are analyzed to predict the anticipated time at which the barrel will point directly at the target in the manner described relative to FIGS. 6-9 .
- the system can work without the barrel gyrator. This decreases the system performance and forces a barrel/target alignment to fire. In the case where a target is static, and has a stationary background, it is possible to operate the system without motion detectors. In this scenario, the optical identification part of this system is sufficient to fire accurately.
- the digital camera 1 is typically a conventional visible light imaging device, but thermal Infrared and night vision imaging devices can also be used with the invention.
- Automatic Target Identification is an additional embodiment of the device that can allow the user to select targets that are identified by computer algorithms. Once identified using predefined criteria, they are presented to the user as possible targets on the image display 3 .
- the system 100 can work without a rotating muzzle gyrator 6 in cases where there is ample time to direct the weapon directly at the target.
- the above steps 1-3 are performed: a sighting is made, an image captured, and crosshairs placed on the target. Then the barrel 10 a is aimed at the target and slowly wandered about. When the barrel 10 a is pointed directly at the target, less any corrections, the processor 4 energizes firing trigger 9 to cause the round to be automatically fired.
- This invention augments a shooter's ability to pull the trigger at the exact instant to hit a target by substituting an electronic means.
- the ramifications are profound, in that it may enable the shooter to increase the chance he will hit his target virtually every time.
- A system flow processing diagram of the preferred embodiment is shown in FIG. 10 .
- Digital video 90 is acquired of the target scene by the digital camera 1 and is then transferred to the computer processor 4 .
- a single frame of the video can be captured and displayed on the image display screen 3 for target selection (block 92 ).
- the user selects 92 the target by moving a cursor to define the target with a region of interest box or similar marking method.
- an authorization to shoot 93 is made by the user by depressing a button or similar input control on the user interface.
- the computer processor 4 uses the target's unique image properties to find the target in subsequent video frames as they are acquired by the digital camera 1 .
- the identification and location are done using pyramiding Normalized Greyscale Correlation, Geometric Pattern Matching, image differencing, and other image processing techniques currently known in the art.
- the target location is found in the frame 96 . This location is used to determine the pointing direction of the target relative to the current position of the gun.
- the current gun pointing direction can be determined by integrating the inertial motion rate sensor signals to determine pointing direction at any instant of time relative to an inertial frame of reference 95 .
- the inertial motion rate sensors can give measurement of gun pointing direction independent of imaging data and can allow for high frequency sampling of gun pointing direction that can exceed the frequency of target tracking using video frame rate measurements.
- Prediction of gun pointing direction can be made using inertial motion sensor data 95 and in the case of a stationary targets by also using the target tracking data 96 .
- inertial motion sensor data is the primary data source for gun pointing direction information due to the low processing demands, high bandwidth performance, and other system benefits of this method of acquisition.
- the algorithm used to predict future gun pointing direction depends on whether or not the barrel is gyrating (decision block 97 ).
- a rate of change method 98 is used in the case of no barrel gyration and a drifting sinusoid method 99 is used in the case when barrel gyration is occurring.
- the prediction of gun pointing direction typically needs to be several to many milliseconds in the future to account for data acquisition time, processing time, trigger time, ignition time, and round bore time between when the pointing data was acquired and the time when muzzle exit could occur.
- the system then calculates (block 103 ) range to target and proceeds to the ballistics calculation 104 to determine if the gun is currently at the correct aim point to hit the target.
- both the range to target 101 at time of bullet impact and the gun lead 102 required at round muzzle exit to hit the target need to be predicted. Since the true target trajectory needs to be known for these calculations, gun pointing information obtained from the inertial rate sensors is used to isolate target motion in the video tracking information from that of weapon 10 motion. Once the target trajectory is known, predicted range and lead can be calculated by processor 4 in the course of calculating the general ballistic solution 104 .
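Predicting range and lead at impact is circular: the time of flight depends on where the target will be, which depends on the time of flight. A fixed-point iteration resolves this; the sketch below assumes straight-line target motion and constant bullet speed, with illustrative names throughout.

```python
import math

def predict_intercept(target_pos, target_vel, bullet_mps, iters=10):
    """target_pos, target_vel: (x, y) in meters and m/s relative to the gun.
    Returns the predicted impact point and time of flight."""
    tof = math.hypot(*target_pos) / bullet_mps      # initial guess: range now
    for _ in range(iters):
        fx = target_pos[0] + target_vel[0] * tof    # target position at impact
        fy = target_pos[1] + target_vel[1] * tof
        tof = math.hypot(fx, fy) / bullet_mps       # refine time of flight
    return (fx, fy), tof
```

The iteration converges in a few steps for typical small-arms geometries, since each refinement changes the time of flight only slightly.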
- the system 100 evaluates whether the weapon 10 is aimed correctly to hit the target 107 .
- the ballistic solution can incorporate other sensor data 105 and gun/ammunition performance 106 data. If the weapon's 10 aim point is within the accuracy threshold configured in the system by the user, a countdown timer 108 will be set to fire 109 at precisely the time indicated by the ballistic calculations at block 104 .
- the firing is by electrical means, via firing trigger 9 ( FIG. 1 ) to allow for minimal ignition delays.
- the system 100 loops back to tracking gun and target motion to collect new gun pointing data and/or image tracking data.
- FIG. 11 diagrams the increasing system performance with the exercise of more system components.
- the essential group of components is the digital camera 1 and the basic system items; user interface 2 , image display 3 , processor 4 , and frame capture electronics 8 .
- each digital image frame is analyzed allowing a determination of the barrel direction and a prediction of when the weapon will point directly at the target.
- the shooter manually sweeps the gun about until the gun aligns with the target. This enables the hitting of stationary targets with great (beyond human) accuracy without the need of a rigid weapon support.
- the weapon is mechanically oscillated which causes the sweeping of the aim point to cover much more area in a given amount of time.
- a “spotlight” is being swept about around the target area. This causes the weapon to find its target much faster, and can be used with stationary and moving targets.
- the instantaneous barrel motion can be detected and integrated over time to provide a barrel pointing direction.
- This information can be used to frequently determine barrel position independently of image data. As a result, the accuracy of the system is enhanced.
- the information can be used to efficiently target both stationary and moving targets.
- GPS (Global Positioning System)
- digital compass (e.g., Honeywell HMR 3000 Digital Compass Module)
- WLAN radio (e.g., Wireless Fidelity)
- the position of each squad member is shared with every other member by the WLAN.
- an alarm (not shown) can alert the soldier or, when incorporated into the Processor Aided Firing of Small Arms system, inhibit the firing of the weapons.
Abstract
A digital processor aiming and firing system generates a trigger signal with electronic timing exactness, resulting in shooting accuracy unobtainable by humans. To achieve this, a view down the barrel sight is captured by a digital video camera and analyzed on a frame-by-frame basis by an electronic processor equipped with image identification software. Motion detectors attached to the weapon are used to interpolate the barrel position between frames. A motion history of the barrel position relative to the target is calculated and an extrapolation of the future position is made. When the anticipated barrel direction impinges on the target, corrected for motion and ballistic effects, the processor signals the launch of the projectile.
Description
- This application claims the benefit of U.S. provisional application Ser. No. 60/502,693 filed Sep. 12, 2003, which is incorporated herein by reference and made a part hereof.
- The U.S. Government has a paid-up license in this invention and the right in limited circumstances to require the patent owner to license others on reasonable terms as provided for by the terms of DMB07-03-D-B009 awarded by the Department of Defense.
- 1. Field of Invention
- This invention applies to firearms, aiming and triggering, and fire control thereof.
- 2. Description of Related Art
- The ability to aim a firearm and hit a desired target is a challenge that has existed since the advent of modern warfare. With the aiming of conventional firearms, the shooter must align the barrel with the target, or lead the target, while simultaneously pulling the trigger. This act takes considerable practice to master and separates sharpshooters from neophytes.
- In the past, numerous innovations have been made in the art of fire control and targeting. Many of the inventions pertain to a weapon that is attached to a stable mounting or quasi-stable mounting such as a moving vehicle. In U.S. Pat. No. 4,787,291, Frohock, Jr., teaches how a system of gyros, servo motors, and a tracking sub-system can be used to pivot a gun to track and fire upon its target. In a similar fashion, in U.S. Pat. No. 4,004,729, Rawicz, et al.; U.S. Pat. No. 3,840,794, Clement et al.; U.S. Pat. No. 3,766,826, Salomonsson; U.S. Pat. No. 5,686,690, Lougheed, et. al., and U.S. Pat. No. 3,575,085, McAdam, Jr., all present tracking systems where the barrel is moved about on larger platforms.
- In U.S. Pat. No. 5,966,859, Samuels presents a scheme for automatic firing based on an “electromagnetic signature”, e.g. a thermal detector that distinguishes human and animal targets from their surroundings. In a like fashion, U.S. Pat. No. 5,544,439, Grember, et al., teaches that an infrared detector can be integrated into a weapon and used to activate the trigger. In both of these approaches, there is a limitation in that the target must emit specific and distinctive wavelengths, which must be distinguishable from the target background.
- In U.S. Pat. No. 3,659,494, Philbrick, et al. presents a targeting system where the “dancing” motion of a target observed while viewing is attenuated. A pair of gyros (not to be confused with the gyrator in the present invention) are used to drive deflection circuitry and shift the image that is focused on a photosensitive detector. The principle is extended to targeting where a signal indicating target alignment with a “home” position is exploited to activate a trigger.
- In U.S. Pat. No. 5,392,688, Boutet, et al., gyro-lasers are used to determine the weapon position and initiate firing commands when the weapon points a theoretical target center constructed by motion averaging.
- It is, therefore, an object of the invention to provide an improved system and method for automated firing.
- This invention combines the traditional aspects of firearm design with the capabilities of modern microelectronics. Instead of relying on the shooter to pull the trigger at exactly the correct instant to hit a target, a high-speed processor, in conjunction with a video system and motion detectors, accomplishes this function. The implications, especially in warfare, are quite profound: with a modest amount of training, any soldier could be converted into a sharpshooter, with several resulting benefits: increased lethality, increased survivability, battlefield dominance and more efficient use of ammunition.
- It is an object of the invention to provide an improved system and method that increases the probability of hitting a desired target by reducing reliance on a human to correctly time pulling a trigger.
- Another object of the invention is to provide a system and method for facilitating reducing reliance on a shooter's ability to hold a gun steady while simultaneously pulling the trigger by providing an electronic processor and triggering system to launch a projectile, such as a bullet at the proper time.
- Another object of the invention is to provide a system and method for moving or gyrating a firearm barrel, for capturing a plurality of images of a target area, for processing image data associated with the plurality of images on a frame-by-frame basis, for predicting a position of the barrel and the target and for firing the firearm to hit the target.
- Another object of the invention is to provide a system and method for allowing the weapon or gun to be held by the shooter and to “free float,” thereby reducing or eliminating the need for a supporting platform to act as a base to move the weapon or as a reference stage for directional feedback information supplied to a firing control system.
- Another object of the invention is to reduce or eliminate the need for targets to be specific wavelength emitters and to permit targets to be selected from anyplace in the field of view of the imaging system.
- Still another object is to reduce or eliminate the necessity to define a “home” position as is required by some systems of the past and to provide means for defining a home position using at least one motion sensor.
- Still another object is to provide a system and method for enabling target selection by selecting a point or object within an image captured by a digital camera associated with the weapon.
- Still another aspect of the invention is to provide a weapon comprising a firearm having a barrel and a user interface, a barrel oscillator for oscillating the barrel in a predetermined pattern, an image capture device mounted on the firearm for capturing a plurality of image frames of a target and generating image data in response thereto, at least one motion sensor mounted on the firearm for sensing a motion of the barrel and generating motion data in response thereto, and a processor coupled to the user interface, the image capture device and the at least one motion sensor, the processor enabling a user to select a target and in response thereto, causing the image capture device to capture the plurality of images and generate the image data which is used along with the motion data to determine a predicted target location and coverage point where the barrel covers the target upon which the processor may energize the firearm to fire a projectile.
- Yet another aspect of the invention comprises a weapon comprising a firearm comprising a barrel, an imager mounted to the barrel for capturing an image of a target area, a user interface for displaying the image, the user interface comprising a trigger for selecting a target within the image area, and a processor coupled to the user interface and the imager for determining a future target location of the target and for automatically firing the firearm when the barrel is positioned in a firing position such that a projectile discharged from the firearm will hit the target selected by the user.
- Still another aspect of the invention comprises a weapon comprising: a firearm comprising a barrel, a gyrator mounted on the barrel for gyrating the barrel in a consistent motion, an imager mounted to the firearm for capturing a plurality of images of an area, a user interface for displaying at least one of the plurality of images, the user interface comprising a trigger for selecting a target within the at least one of the plurality of images, and a processor coupled to the user interface, the imager and the gyrator, the processor receiving image data corresponding to one or more of the plurality of images captured and causing the firearm to automatically discharge a projectile from the firearm when the barrel is positioned in a firing position such that the projectile will hit the target selected by the user.
- Yet another aspect of the invention comprises a method for increasing accuracy of hitting a target with a firearm, the method comprising the steps of: capturing a plurality of images of a target area including the target, processing the plurality of images to predict an optimum firing condition, and discharging the firearm when the optimum firing condition is achieved.
- Still another aspect of the invention comprises a firing system that automatically launches the projectile comprising of: a barreled firearm, an electronic digital camera that supplies the electronic processor with rapid, digital, repetitive frame information, motion sensors that supply the electronic processor with angular rate information, an electronic display, which is able to display the image data from the digital camera and display cross hairs for target identification, a computer mouse, which interacts with the electronic processor and is able to position the cross hairs to identify the desired target, an electronic processor, that receives data from the electronic digital camera, motion sensors, computer mouse, and transmits images to the electronic display, executes barrel prediction algorithms while analyzing the motion generated by human drift and mechanically forced motion from the barrel gyrator and finally transmits a fire signal to the trigger mechanism, a trigger mechanism, which implements the projectile launch signal generated by the electronic processor, a barrel gyrator, which forces an orbital motion on the firearm, which is analyzed by the electronic processor.
- Still another aspect of the invention comprises an automatic firing system for use with a firearm, comprising: an image capture device mounted on the firearm for capturing a plurality of images of an area in front of a muzzle end of the firearm, and a processor coupled to the image capture device for processing data associated with the plurality of images and for determining an optimum firing time to discharge a bullet from the firearm in order to hit a target selected by a user.
- Yet another aspect of the invention comprises a firing system that automatically launches a projectile, comprising: a barreled firearm, an electronic digital camera, which supplies the electronic processor with rapid, digital, repetitive frame information, an electronic display, which is able to display the image data from the digital camera and display cross hairs for target identification, a computer mouse, which interacts with the electronic processor and is able to position the cross hairs to identify the desired target, an electronic processor, which receives data from the electronic digital camera and computer mouse, transmits the data to the electronic display, runs barrel prediction algorithms while analyzing the motion generated by human drift, and transmits a fire signal to the trigger mechanism, and a trigger mechanism, which implements projectile launch by a signal from the electronic processor.
- Other advantages of the present invention include:
-
- 1. provides means for hitting the desired target virtually every time;
- 2. decreases the time needed to shoot at a target;
- 3. in warfare, saves ammunition by eliminating the need to spray bullets;
- 4. prevents shooting a wrong, intervening target when an extraneous object occludes the line of sight to the desired target;
- 5. provides means whereby a typical soldier would immediately have the capabilities of a sharpshooter;
- 6. provides means whereby the lethality of a soldier would be greatly enhanced;
- 7. provides means whereby the survivability of a soldier would be vastly increased;
- 8. provides a psychological deterrent to the enemy; and
- 9. enables users to select non-lethal target areas on, for example, an enemy soldier or criminal.
- Other objects and advantages of the invention will be apparent from the following description and the accompanying drawings.
-
FIG. 1 is a schematic diagram of one embodiment of the invention; -
FIG. 2 is another view of the embodiment shown in FIG. 1 during operation; -
FIG. 3 is a perspective fragmentary view of a barrel gyrator according to one embodiment of the invention; -
FIG. 4 is a system processing schematic according to one embodiment of the invention; -
FIG. 5 is a view of an image of an area that includes a target; -
FIG. 6 is a view of a technique for determining various vectors when an embodiment is used with an angularly moving gun and a moving target; -
FIG. 7 is a view of a plurality of image data pixels associated with a gyration of a barrel; -
FIG. 8 is a view of a technique for determining a drift sinusoid function associated with various image data points; -
FIG. 9 is a perspective view of a plurality of vectors used to determine a target and barrel position as viewed from a camera mounted on the firearm; -
FIG. 10 is a schematic diagram of the system operation in accordance with one embodiment of the invention; and -
FIG. 11 is a schematic diagram illustrating increasing system performance with the use of a digital camera, barrel gyrator, and motion sensors in various combinations. - Referring now to
FIG. 1 , a general schematic diagram of the weapon system 100 in accordance with one embodiment of the invention is shown. The system comprises a firearm, gun, or pistol (weapon 10); a digital camera 1 and a barrel gyrator 6 are operatively mounted to the weapon 10. In the embodiment being described, and as will be discussed later herein relative to FIG. 3 , the barrel gyrator 6 induces a gyration, wobble, or consistent motion of the barrel 10a in a predictable manner such that the barrel 10a of weapon 10 can be controlled and its motion predicted in the manner described herein. - In general, the
digital camera 1 provides digital image data to frame capture electronics 8, which generates a plurality of frames of data and subsequently feeds the data to processor 4. In one embodiment, the weapon 10 further comprises one or a plurality of inertial motion sensors 5 coupled to a computer or processor 4 that sense the position and motion of the weapon, and particularly the barrel 10a, for generating motion data relative to the motion of the weapon 10 and particularly the barrel 10a. The weapon motion data is fed to processor 4 as illustrated. The use of the motion data will be described later herein. - The
weapon 10 further comprises a firing trigger 9, which in the preferred embodiment is an electronic trigger. In the embodiment being described, an order to fire is delivered from processor 4, which generates the order-to-fire signal in response to the signals received from inertial motion sensors 5 and frame capture electronics 8, and a user interface 2 enables a user to initiate and authorize the firing process. - In the preferred embodiment, the
processor 4 analyzes input data collected from the user interface 2, digital camera 1, and sensors 5 to make a firing decision. Image frame data is received by the processor 4 and undergoes a variety of processing operations to identify and track targets. - When selecting a target, such as
target 50 in FIG. 2 , the shooter commands the user interface 2 to display a single captured frame 52 of an image on the display screen 3 as a still image. From the user interface controls, the user can then electronically move a cursor 51 using a miniature trackball 53 ( FIG. 5 ) over the image to precisely designate a target 50. For instance, if the target is a soda can on a log as in FIG. 5 , the user can place a box-shaped cursor 51 around the soda can to designate the target. The center of the box cursor would have a cross hair to represent the desired target hit location. - It should be understood that the invention may be used with a stationary target or a moving target. Also, a user may selectively operate the
system 100 either with or without gyration. In the case of stationary targets, even though the target is fixed, the weapon 10 is usually moving if the user is shooting offhand. As a result, the target will be located at different locations in successive image frames due to the gun's motion. Because the camera 1 captures a different image in each video frame, tracking the target yields the pointing direction of the weapon 10 relative to the target, which can be used to predict future positions of the weapon 10. - Once the
cursor 51 is positioned on and designates the target in the displayed still image, algorithms are run in the processor 4 to identify unique characteristics of the target so the target can be identified and located in video frames as they are acquired by the digital camera 1. An example of an algorithm that is used in the invention is Normalized Greyscale Correlation (NGC). In the basic form of NGC, the boxed area around the target 50 is used to define a region of interest (ROI) raster. When NGC searches for the target in a new, subsequent image frame, it measures the correlation between an equal-sized region of the new image and the reference target raster 51. NGC systematically scans the new image and measures the correlation value at each index in the row-column image scan. If a correlation match above a threshold value is found between the reference and a particular sub-image of a new video frame, the target's center location in the new frame is calculated and recorded. If conventional Pyramiding NGC is used instead of the basic form of NGC, the search time is reduced through the use of a hierarchical searching scheme that is known prior art. The performance of Pyramiding NGC is such that the target location can be found in each frame in real time, even when operating at frame rates of 240 frames per second and using an Intel® Pentium 4 class processor, available from Intel Corporation. These and other prior art image processing methods can identify and track the location of the target in each video frame and may be used by the invention. This processing can also include comparison of past and present frames that are acquired. - Because tracking the weapon's 10 pointing direction is an important component of the invention,
rate sensors 5 are used to provide high-bandwidth tracking of the gun that complements the video tracking that is performed. This allows the system 100 to reduce the processing demands needed for gun and target tracking, provides more effective tracking of moving targets, and reduces the amount of forward-looking prediction needed of the gun's pointing direction. For instance, in the case of stationary targets, the rate sensors 5 can be used to provide gun pointing direction in between image frame times, thus allowing for a slower video frame rate and gun pointing direction predictions that only need to be a few milliseconds in the future. Because the target is stationary, the video tracking is only needed to update the position of the target relative to the gun every 33 to 1000 milliseconds so the random walk error of the rate sensors doesn't grow too large. Without rate sensors, frame rates of 120 frames per second and higher are typical, and predictions of gun pointing direction must be made further into the future to account for image acquisition and processing delays before the X,Y target location information is available for calculation. - The ability to predict the future positions of the
weapon 10 in the invention is important due to ignition and bore time delays that occur after the decision to fire has been made. To hit a stationary target, the gun must be pointing at it at the time the bullet exits the muzzle or end of the weapon 10 (ignoring gravity and other environmental effects). If the gun is in motion and changing its pointing direction, it is not usually sufficient to make a firing decision based on current pointing data because the gun will have changed its pointing direction between the trigger and muzzle exit events. The time between these events can be several milliseconds. As data is being acquired and processed, the invention continuously predicts the weapon's 10 pointing direction at which muzzle exit would occur if a round were to be fired at that moment. This allows the system to accurately trigger a bullet launch once the predicted pointing direction of the weapon 10 crosses over the target, assuming user authorization to fire has been made. In the general case, if the pointing direction of the weapon 10 satisfies the calculated ballistic solution for the predicted location of the target and the computer has authorization to shoot, the processor 4 will send a digital output signal to the firing electronics to initiate the firing trigger 9. - Forward-looking predictions of weapon position and target location are made in a variety of ways. Two effective techniques used in the invention are a rate change prediction technique that uses the rate of change calculated from the most recent tracking data points and a curve fitting technique that utilizes several image data points from past and current frames of data tracking information. These techniques are described later herein.
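As a sketch of the two forward-looking prediction techniques just named, the following Python fragment extrapolates one pointing-direction coordinate both ways. The sample spacing, units, and function names are illustrative assumptions, not details taken from the specification.

```python
import numpy as np

# Hedged sketch of the two prediction techniques described above.
# Times are in seconds and aim-point coordinates in arbitrary image
# units; the quadratic model order is an assumption, not a disclosed
# parameter.

def predict_rate_change(t, x, lead_time):
    """Rate-change technique: extrapolate from the rate of change of
    the two most recent tracking data points."""
    rate = (x[-1] - x[-2]) / (t[-1] - t[-2])
    return x[-1] + rate * lead_time

def predict_curve_fit(t, x, lead_time, order=2):
    """Curve-fitting technique: fit a low-order polynomial to several
    past data points and evaluate it at the anticipated muzzle-exit
    time (current time plus ignition and bore delays)."""
    coeffs = np.polyfit(t, x, order)
    return float(np.polyval(coeffs, t[-1] + lead_time))
```

On smooth tracking data the two estimates agree; the curve fit tolerates noisy samples better, while the rate-change form needs only the last two points.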
- Referring to
FIGS. 1 and 5 , the user interface 2 in the preferred embodiment includes a small LCD or heads-up display 3 to display targeting and system information. Although not shown, the user interface 2 or display 3 may be pivoted or rotated in one of a plurality of directions. The display 3 can be incorporated directly into the sight scope (not shown) for aiming the weapon 10. The user interface 2 can have any combination of control buttons and input controls (including the conventional pull trigger) to aid the user in selecting targets 25, authorizing firing 26, and configuring parameters within the system 27. In the embodiment being described, a miniature trackball 53 ( FIG. 5 ) is used to guide cursor 51 over the target 50 for firing selection. Rate sensors 5 on the weapon 10 also allow the possibility of the cursor being controlled by movement of the gun itself. - The
user interface 2 button and input controls are linked to the image display 3 so commands and system 100 configuration settings can be inputted by the user. The user can authorize the system 100 to fire by pressing a button 57 on the user interface 2 or pulling the gun's modified mechanical trigger 10b ( FIG. 1 ) that is configured to only authorize a firing event. The user interface 2 also allows threshold accuracy settings 28 ( FIG. 4 ) to be inputted so the user can specify the Minute of Angle (MOA) accuracy desired for the shot. One MOA corresponds to approximately 1 inch of target size at a range distance of 100 yards. The threshold accuracy settings 28 are used by the firing algorithm described later herein to determine if the predicted pointing direction of the weapon 10 is over the target 50 at any moment in time. - Referring now to
FIGS. 2 and 3 , further details of the system 100 are shown. As illustrated in FIG. 3 , the digital camera 1 is mounted on the stock by conventional means, such as a mechanical mount. The digital camera 1 is aligned generally on top of and over the barrel 10a as illustrated in FIG. 2 so that images of an area including a target, such as target 50 in FIG. 5 , can be captured. In this regard, and during normal operation, digital camera 1 may also capture a plurality of images of the target 50 as the gun barrel 10a is gyrated by gyrator 6 as described herein. Thus, it should be appreciated that the camera 1 is used to electronically photograph the view looking down the barrel and toward the target 50. In one embodiment of the invention, a charge-coupled device (CCD) video camera may be used. - The signal from the
digital camera 1 is fed to the image capture or frame capture electronics 8 mentioned earlier herein. In the embodiment being described, the frame capture electronics 8 are provided on a circuit board (not shown) on which the processor 4 may also be mounted, and the board may be stored in or mounted on the rifle or mounted in a separate container such as a processor container 28 ( FIG. 2 ) which is worn by the user or affixed to a user's garment, such as a belt. The processor container 28 may further comprise a power source, such as a battery, for energizing or providing power to the system 100. - The image or
frame capture electronics 8 records at least one or a plurality of frames of data consisting of thousands of pixels at rates of hundreds of times per second. The data is fed to the processor 4 which, in the preferred embodiment of the invention, uses a relatively powerful processor system, such as a 2.4 GHz Intel® Pentium® 4 computer with 512 MB of memory. The image capture electronics 8 can be a Matrox Meteor II frame grabber board, available from Matrox Electronic Systems Ltd. of 1055 St. Regis Blvd., Dorval, QC H9P 2T4, Canada, installed into, for example, one of the computer's PCI slots. The board has a bus-mastering mode that can perform data transfers directly into host computer memory without requiring continuous host intervention. In the embodiment being described, the processor 4 comprises suitable memory (not shown) for receiving each frame of data for subsequent processing by the processor 4. -
System 100 comprises the image display 3 ( FIGS. 1 and 2 ) that displays the image captured by the camera 1 and transmitted via the processor 4. Various types of displays can be used for display 3; the leading choices, however, are Kaiser Electro-Optics, Inc.'s Proview™ SL35 monocular display, the Kopin CyberDisplay™, available from Kopin Corporation, 125 North Drive, Westboro, Mass. 01581, and the Sharp Microelectronics 3.5-inch Transflective TFT-LCD display, available from Sharp Microelectronics. The barrel gyrator 6 causes the muzzle or end of barrel 10a to orbit in an elliptical fashion. By altering the weapon's 10 center of mass, the aim of the weapon 10 is caused to sweep out a large oval at the target range. The miniaturized trackball mouse 53 ( FIG. 5 ) is used as part of the user interface 2 to allow the shooter to select the target from a still image presented on the video display 3. Inertial motion sensors 5 are mounted to the weapon 10 and provide angular rate data to the processor 4 as the weapon 10 moves about. - Referring to
FIGS. 2 and 3 , the barrel gyrator 6 will now be described. In general, the purpose of the gyrator 6 is to cause the aim point of the weapon 10 to oscillate or gyrate in a predictable manner so as to sweep out more target area in a given time than when the weapon 10 is held stationary. When a gyrating motion is added to the slow sweeping motion caused by the shooter moving the gun, for example, to follow the target 50, a large swath of the target area is covered in one stroke. If a motion-inducing mechanism such as the gyrator 6 were not present, the shooter would have to manually align the barrel with the target 50. A disadvantage, as mentioned earlier herein relative to the description of related art, is that depending on the bullet launch tolerance or desired accuracy limits (referred to in block 48 in FIG. 4 ), this could require an indefinite length of time for the shooter to accurately align the barrel with the target 50, and it may never occur at all during a manual operation. - Thus, one feature of the embodiment being described is that for aiming prediction purposes, the motion of the
barrel 10a is controlled, repetitive, and lends itself to the mathematical analysis and application of algorithms described herein. Accordingly, a circular or elliptical motion of the barrel 10a is achieved by altering the center of mass of the weapon 10. In the embodiment being described, this is achieved by providing a gyrator 6 having a weight 19 secured thereto with a screw 21. Note that the weight 19 is secured to a cylinder 16 that is rotatably mounted on bearings 22 which are positioned on the barrel 10a. The cylinder 16 is rotatably driven by a motor 17 that is coupled to the ball bearings 22 via a drive linkage, including a pulley 20 and belt or O-ring 18. Preferably, the barrel 10a should rotate in a circular fashion, but it has been found that the barrel 10a actually orbits in an elliptical fashion because of the difference in the moments of inertia of the weapon along different axes, such as axis A illustrated in FIGS. 2 and 3 . In the embodiment being described, the weight 19 is mounted to the cylinder 16, which is aluminum in this embodiment. The cylinder 16 is mounted on the pair of low-profile ball bearings 22 at the muzzle end of the weapon barrel 10a. The cylinder 16 comprises a groove 29a that accommodates and receives the belt or O-ring 18, which is attached to the pulley 20 on the small electric motor 17. The cylinder 16 further comprises a pair of diametrically opposed tapped holes used to attach the weight 19, such as a lead weight, with the screw 21 as shown. It should be understood that the weight 19 is removable by unscrewing screw 21 so that different mass weights can be fastened to the cylinder 16, which allows the amplitude of the barrel's 10a circular motion to be adjustable. - It should be understood that the voltage applied to the
motor 17 controls the angular frequency of gyration, which is independent of the angle of barrel gyration. The angle of gyration is related to the ratio of the orbital radius to the overall length of the gun or weapon 10, as well as the weapon and gyrator masses. As mentioned, the voltage applied to the motor 17 controls the angular frequency, which operates in the 300-800 RPM range in the embodiment being described. - During operation, the shooter points the
weapon 10 at the target 50 ( FIG. 5 ) and a button, such as trigger button 57 on user interface 2, is pressed to freeze the image captured by digital camera 1 ( FIGS. 1 and 2 ). The captured image is displayed on the image display 3. At this point, the shooter only needs to provide an approximate aim toward the target 50, such that the camera 1 can capture the image of the target 50. The shooter views the frozen image on the display 3 and positions cross hairs 51, which are controlled by the miniature mouse 53 ( FIG. 5 ) on the user interface 2 or display 3, over the desired target 50. The weapon 10 is still pointed approximately at the target 50. The user then activates an auto-fire mode by pulling a pseudo trigger 10b ( FIG. 1 ). - The processing computer supplies digital I/O signals 32 for system controls. Inputs from the user interface are received by the
processing computer 4 to perform such actions as freezing a video frame, selecting targets 25, and making authorizations for firing 26. To fire, an authorization signal is given to the firing trigger 9 in the form of a binary representation of the time interval between the next camera shutter, available as a digital output from the digital camera 1, and bullet launch time. Numerous digital implementations of an interval timer can be implemented by those conversant with the art. The only requirement is that the clock driving the interval timer be sufficiently fast so that one single tick of the clock is negligible in comparison to the overall system timing. A clock frequency of 10 MHz is sufficient to fulfill this requirement. In the embodiment being described, the pseudo trigger 10b may comprise an electronic trigger as opposed to a traditional mechanical trigger involving a moving firing pin. For example, the electronic firing trigger 9 may comprise the same mechanism resident on the Remington® Model 700 EtronX rifle, which is advertised as having an ignition time of 0.27 microseconds, available from Remington Arms Company, Inc. - In any event, the
pseudo trigger 10b is coupled to the processor 4 which, in turn, generates a signal for energizing the gyrator 6 and, more particularly, motor 17. In response, the energized motor 17 rotatably drives pulley 20 and O-ring 18 to rotatably drive the cylinder 16 and wobble the barrel 10a. As mentioned earlier herein, this causes the aiming direction of the muzzle end of barrel 10a to sweep out an area encompassing the target 50 at the target distance as the shooter slowly sweeps the weapon or gun 10 over the target 50. The speed of the barrel's 10a rotation is much faster than the motion caused by the shooter's natural drift, so at some point of an over-target sweep, the barrel 10a points directly or nearly directly at the target 50. Without intervention on the shooter's part, successive image frames are captured by digital camera 1 and processed in accordance with the algorithm described herein. For each frame captured, the target 50 is located, and a series of previous frames is analyzed to predict the anticipated time at which the barrel 10a will point directly at the target 50. At the exact instant when the barrel 10a aligns with the target 50, less the time delay that the firing trigger 9 needs to actuate and electronically fire the weapon 10, and with further compensation for effects such as gravity drop and environmental conditions (e.g., wind, rain, etcetera), the shot is automatically discharged by means of the processor 4 energizing firing trigger 9 ( FIG. 1 ). The projectile or bullet is launched and hits the target 50 in response to the electronic firing of the firing mechanism 9. - A plurality of
rate sensors 5 are mounted on weapon 10 ( FIG. 1 ). Solid-state rate sensor technology is used in the preferred embodiment of the invention to track the precise angular motion of the gun barrel 10a. The angular rate of change around the X, Y, and Z axes is conveyed to the processor 4, as shown in FIG. 1 . - Although motion of the
gun barrel 10a can also be tracked by using video imaging data 31, rate sensors 5 allow for frequent pointing direction sampling with minimal processing demands. In addition, rate sensors 5 offer a straightforward method of separating angular motion of the gun/video system from actual motion of the target 50, such as the movement of the tank shown in FIG. 6 . This is particularly important for firing accuracy at moving targets since target trajectory and ballistic “lead” need to be calculated for a correct ballistic solution. -
Rate sensors 5 are mounted on weapon 10 and coupled to processor 4. The rate sensors 5 are small, low-cost “gyro on a chip” devices available in a coin-size form factor that can be packaged to meet military-specification environmental requirements. Example rate sensors that can be used in the invention are the QRS11 series manufactured by the Systron Donner Inertial Division of BEI Technologies, Inc. By integrating the rate sensor signals over time, a precise angle of the barrel 10a can be calculated. By incorporating two rate sensors into the weapon 10 (one for azimuth angle measurement and the other for elevation), a precise measure of pointing direction can be achieved. If desired, a third rate sensor (not shown) can also be used to monitor rotation about the gun barrel axis. - The
rate sensors 5 have a random walk noise that can cause the pointing direction measurement to drift over time. If rate sensors 5 alone are used for tracking the gun's pointing direction 39, the random walk error becomes significant over time and eventually impacts the accuracy of the ballistic solution after a few seconds. To counter this effect and achieve high-accuracy pointing direction information, the invention in one preferred embodiment frequently updates the pointing direction reference from the rate sensors 5 with pointing direction information determined by processing camera video frames 40 of data. The target location is found in each video frame where the FOV of the camera 1 subtends the target. This location is then used to update the angular position of the weapon 10 relative to the target. This updating process keeps the gun pointing direction error with respect to the target from growing too large over time. -
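The fusion scheme described above, integrating rate-sensor output between frames and resetting the accumulated random-walk drift whenever a video frame locates the target, can be sketched as follows. The sampling period, units, and function name are assumptions for illustration only.

```python
# Hypothetical sketch of fusing rate-sensor and video data as described
# above: angular rate is integrated tick by tick, and a camera-derived
# angle, when available, replaces the drifting integrated estimate.

def fuse_pointing(rate_samples, dt, video_fixes):
    """rate_samples: angular rate (rad/s) at each sensor tick.
    dt: sensor sampling period (s).
    video_fixes: dict mapping tick index -> camera-derived angle (rad).
    Returns the estimated pointing angle after each tick."""
    angle = 0.0
    history = []
    for i, rate in enumerate(rate_samples):
        angle += rate * dt          # integrate angular rate to angle
        if i in video_fixes:
            angle = video_fixes[i]  # video frame cancels the drift
        history.append(angle)
    return history
```

In a real system the video fix would itself carry noise and latency, so a filtered blend (e.g. a complementary or Kalman filter) would replace the hard reset shown here.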
Rate sensors 5 lower the video processing demands of the invention by allowing the system to operate at much lower frame rates compared to using video alone to determine muzzle pointing direction. Conventional video frame rates of 30 frames per second or lower can typically be used with rate sensors as long as the target trajectory can be resolved at the given frame rate. Faster moving targets demand higher frame rates in order to adequately sample the target trajectory. The rate sensors 5 incorporated into the invention allow the frame rate requirements to depend mostly on the motion of the target as opposed to that of the gun. - Consider the case of shooting at stationary targets with a moving gun. The invention's video tracking technology can update the target location relative to the local image coordinates on every frame time using target
pattern matching algorithms. - Another case is the challenging task of firing accurately with an angularly moving gun and a moving target. By using the
rate sensors 5 to keep track of the gun's pointing direction, the rate sensor information can be used to efficiently isolate a target's relative motion from the pointing direction changes of the weapon 10. FIG. 6 shows a graphical example of how target motion relative to a two-dimensional inertial reference frame can be calculated in a moving gun and moving target scenario. - At every frame time, the target position can be located by image processing techniques relative to the local image coordinate system. The
rate sensor 5 data can be used to track the motion of this image coordinate system relative to an inertial coordinate reference. Since the random walk error of the rate sensor 5 is small over the frame times involved, rate sensor information spanning several frame times can be used for this tracking. By combining the image coordinates of the target with the rate sensor's 5 coordinates for the image frame, the coordinates of the target relative to the inertial frame of reference can be calculated. - When this method is used in combination with range data to the target, it allows for a relatively simple way to calculate the velocity and acceleration vectors of the target moving relative to the inertial reference frame. In these calculations, distance information is derived from target tracking within each video frame and time information is derived from the inverse of the video frame rate. The target's motion vectors can be used to predict the target's new location in between frame times, thus allowing for targeting of high-speed targets that are moving at significant displacement rates. For instance, a weapon's 10 lead can be calculated by determining the time of flight to the target based on range and published bullet velocity and then multiplying this time by the velocity component perpendicular to the target line of sight. A decision to fire 41 can be made when the pointing direction of the gun is within the ballistic solution necessary to hit the moving target at its predicted future position. This general technique for handling the motion of the target relative to the gun platform does not rely on processing fixed-point background references and, as a result, the technique can work on both uniform and non-uniform background scenes.
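The lead calculation outlined above (time of flight from range and published bullet velocity, multiplied by the target velocity component perpendicular to the line of sight) can be sketched as below. Units (meters, meters per second, radians) and the helper names are assumptions, not values from the specification.

```python
import math

# Illustrative sketch of the inertial-frame bookkeeping and lead
# calculation described above. All units and names are assumptions.

def target_inertial_offset(image_offset, gun_angle):
    """Target angle in the inertial frame: the camera-frame offset of
    the target plus the rate-sensor-tracked orientation of the image
    coordinate system (both in radians)."""
    return gun_angle + image_offset

def lead_angle(range_m, bullet_speed_mps, v_perp_mps):
    """Time of flight = range / published bullet velocity; lead
    distance = perpendicular target velocity * time of flight; the
    result is converted to the angular lead at the given range."""
    tof = range_m / bullet_speed_mps
    lead_distance = v_perp_mps * tof
    return math.atan2(lead_distance, range_m)
```

For example, at an assumed 300 m range with a 900 m/s bullet and a 3 m/s crossing target, the lead distance works out to 1 m, i.e. roughly 3.3 milliradians of lead.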
- If a weapon's 10 motion is expected to have translation velocity components relative to an inertial frame of reference (e.g. a gun in a moving vehicle), a variety of methods exist (
accelerometers 5, global positioning system references 33, odometer readings 33, etc.) to calculate the magnitude and direction of the translational motion. In an additional embodiment of the invention, accelerometers 5 can be incorporated into the invention to detect changes in translational motion. Translational information can be incorporated into the calculated ballistic solution to further improve the overall firing accuracy of the system. - The ability to separate gun angular motion from fast relative target motion in a processor-aided firing system opens a new realm of firing accuracy when both target and gun are independently moving. This is a significant capability since military guns are typically used from dynamically moving platforms (e.g., soldiers, land vehicles, helicopters, airplanes, and boats) and targets are often moving.
-
Image target identification 38 and tracking 40 is a core technology within the invention that is an important component in supporting processor-aided firing. FIG. 9 illustrates a video frame 83 captured by camera 1. By determining the X-Y location of the target in each video frame 83, a relative angular position of the target 81 with respect to the gun's 10 pointing direction 80 at the time of frame exposure can be determined, since the location vector 82, which is a vector between the target's 50 original position 87 and the actual location 85 where the weapon 10 is pointing, is known. The gun's pointing direction 80 is known relative to the pointing direction of the digital camera 1 and is shown in FIG. 9 as being the same vector that ends at the center of the field of view of the image frame. Past and present location information can be used to help predict the future location of the target 50, and this prediction is used in the ballistic solution 43, which determines if the processor 4 can fire upon the target at a particular moment in time 41. A variety of different image processing algorithms can be incorporated into the invention for identifying and tracking targets in the video scene. This includes the use of raster, vector, and temporal based imaging algorithms in any combination. - An algorithm incorporated into one embodiment of the invention for use in identifying and locating non-changing, fixed targets or reference points within an image frame is a hierarchical search normalized greyscale correlation, also referred to as pyramiding normalized greyscale correlation. This technique is known and is taught, for example, in Matrox Imaging,
ActiveMIL version 7 User Guide, Jun. 6, 2002, Matrox Electronic Systems Ltd., pg. 183, which is incorporated herein by reference and made a part hereof. This highly efficient and precise raster based algorithm is effective for locating stationary targets or reference points in situations where there is no relative rotation or scaling occurring within the video scene. - In its most simplified form, this technique scans a reference region of interest over the entire image and outputs the correlation match at each index step across the image. The index at which the highest correlation exists above a certain threshold is the location of the target in the acquired frame. If all correlation results are below the threshold set for the image, then the frame is deemed to not contain the target.
- Normalized greyscale correlation (NGC) is defined by the following equation:

r(x,y)=Σs Σt [w(s,t)−w̄][f(x+s,y+t)−f̄]/{Σs Σt [w(s,t)−w̄]² Σs Σt [f(x+s,y+t)−f̄]²}^½

where x=0,1,2, . . . ,M−1, y=0,1,2, . . . ,N−1, w̄ is the average value of the pixels in the ROI defined by w (this only needs to be computed once), f is the region within the frame image coinciding with the current location of w, and f̄ is the average value within f. The summations over the indexes s and t run only over the coordinates that are common to both f and w. The correlation coefficient is calculated for each location (x,y) in the image. - To bring the NGC calculation time into the millisecond domain, a pyramid scheme is used in which a small, lower resolution version of the frame image is processed first. This correlation information is then used to limit the pattern search to selected regions of the next higher resolution version of the frame image. The process repeats so that the best correlation coefficient is quickly found for the ROI model being matched in the image. If the correlation coefficient is below a threshold value, the image is marked as not having the target contained within it.
- This pyramid scheme allows NGC processing to take only 3-4 milliseconds using an ROI size of 100×100 pixels and an image size of 640×480. For the image size of 640×198 that occurs when the camera is in 240 fps mode, the NGC time is about half this (~1.5 to 2 ms).
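As an illustration of this coarse-to-fine search, the following Python sketch implements NGC directly with NumPy. The `ngc`/`find_target` names, the 4:1 decimation, the refinement window size, and the 0.8 threshold are illustrative assumptions, not details taken from the embodiment.

```python
import numpy as np

def ngc(frame, roi):
    """Normalized greyscale correlation of template `roi` at every valid
    top-left position (y, x) in `frame`."""
    M, N = roi.shape
    w = roi - roi.mean()                      # zero-mean template, computed once
    wss = np.sqrt((w ** 2).sum())
    H, W = frame.shape
    out = np.full((H - M + 1, W - N + 1), -1.0)
    for y in range(H - M + 1):
        for x in range(W - N + 1):
            f = frame[y:y + M, x:x + N]       # region coinciding with w
            fz = f - f.mean()
            denom = wss * np.sqrt((fz ** 2).sum())
            if denom > 0:
                out[y, x] = (w * fz).sum() / denom
    return out

def find_target(frame, roi, threshold=0.8):
    """Coarse-to-fine ('pyramiding') NGC search: correlate a decimated frame
    first, then re-run full-resolution NGC only near the coarse peak."""
    coarse = ngc(frame[::4, ::4], roi[::4, ::4])
    cy, cx = np.unravel_index(coarse.argmax(), coarse.shape)
    M, N = roi.shape
    y0, x0 = max(cy * 4 - 8, 0), max(cx * 4 - 8, 0)
    win = frame[y0:y0 + M + 16, x0:x0 + N + 16]   # small refinement window
    fine = ngc(win, roi)
    fy, fx = np.unravel_index(fine.argmax(), fine.shape)
    if fine[fy, fx] < threshold:
        return None          # frame deemed not to contain the target
    return (y0 + fy, x0 + fx), fine[fy, fx]
```

The speed-up comes from the coarse pass rejecting most of the image, so the expensive full-resolution correlation runs over only a small window.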
- NGC is a very powerful pixel level pattern matching technique that can discern the location of virtually any static ROI model in an image frame, and it proves to be a well performing algorithm for this purpose. In future phases of this development, NGC in combination with other image processing algorithms (e.g. geometric pattern matching) can be researched to extend the system beyond static targets to the moving target domain.
- In the case of moving targets, vector based algorithms can be more flexible than raster based algorithms. To augment moving target tracking, there are several promising image processing technologies currently known in the art that can be incorporated into the preferred embodiment of the invention depending on the particular targeting demands. These technologies can involve identifying and tracking spatial and/or temporal properties of the target within the video frames.
- An example of one such method is to use geometric pattern matching algorithms to match edges of a target against a database of target templates. This “model” based approach can be used to track the target (such as the tank shown in
FIG. 6 ) as it moves across a scene. Geometric techniques tend to be more robust in handling the scaling and rotation effects of moving platforms, but generally demand more CPU processing compared to raster based correlation techniques. It should be understood that the incorporation of rate sensors into the invention reduces the overall processing demands, so geometric pattern matching and other powerful image processing techniques can be used to accurately track the target under a variety of scene conditions. - Even with the fastest electronics and processing speeds, there is an inescapable time delay between image capture and the launch of the projectile or bullet. The recent motion of the
barrel 10 a and a prediction 45 of its future position therefore need to be determined. Each frame of data captured by camera 1 is analyzed to find the position where the barrel 10 a points. When the barrel 10 a is anticipated to point at the target, or at the lead point in the case of moving targets, within a certain tolerance, the bullet is launched 47. - The analysis is considered in two fashions: passive mode, when the
gyrator 6 is off, and active mode, when the gyrator 6 is on. Each case produces a characteristic sequence of barrel pointing. - With the gyrator off, the barrel wander from human drift behaves much like Brownian motion on a long time scale of the order of seconds. The motion is, for all intents and purposes, random and therefore non-predictable. However, on a short time scale of the order of milliseconds, the location difference between successive data points is not that great and lends itself to predictive techniques.
- One firing method is to wait until the
barrel 10 a is pointed at the target, within a certain tolerance, and then launch the bullet. This can be viewed as a non-forecasting approach and gives performance within a few MOA. However, the preferred rate prediction technique uses the rate of change of the X and Y components of motion to predict future location coordinates. The two main concepts of this approach are 1) actual X, Y location prediction based on the rate of change, and 2) the values of the first and second derivatives used as constraints when near the target. The complete calculation uses data from the 3 previous frames to make a prediction on whether to fire the rifle 1.5 to 2.5 frames in the future. The X, Y coordinate location is defined in the image plane of the target where 0, 0 is the target center. - The procedure is illustrated in
FIG. 6 and is as follows. First, vectors locating the target in successive image frames are determined. The rotation of camera 1 relative to the rate sensor's 5 inertial frame of reference 67 is then determined to provide a correction of these vectors, as shown in FIG. 6. Next, the corrected vectors are expressed in the inertial frame of reference 67. Processor 4 then determines a resultant displacement vector 66 of the target during time delta t relative to the inertial frame 67. The velocity vector 66 for the target can then be calculated from this information. - Thus, first derivatives are calculated for each X and Y component. The predicted X component is then simply calculated using X=(X_last+1.5*X_rate)+(t*X_rate) where t=0 to 1. A similar formula is used to calculate the Y component. A second derivative term can also be added to this prediction, but requires more investigation on whether it is actually helpful since it is based on slightly older frame data.
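The rate prediction formula above can be sketched as follows; the helper names and the optional second-difference helper are illustrative assumptions.

```python
def predict_position(x3, x2, x1, t):
    """Predict one coordinate component (X or Y) of barrel pointing using the
    rate formula X = (X_last + 1.5*X_rate) + (t * X_rate), with t in [0, 1],
    i.e. 1.5 to 2.5 frames beyond the newest sample x1.

    x3, x2, x1 are the component values from the 3 most recent frames
    (oldest first); x3 is kept for the second-derivative helper below."""
    rate = x1 - x2                          # first derivative, per frame
    return (x1 + 1.5 * rate) + t * rate

def second_derivative(x3, x2, x1):
    """Second difference of the component, available as an extra term or
    constraint; as the text notes, it is based on slightly older frame data."""
    return (x1 - x2) - (x2 - x3)
```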
- As the pointing direction of the
rifle 10 changes, it sweeps out a path in the image plane of the target. This path will approach the target center and then move away as it sweeps by. Because there is a variance between predicted motion and actual motion, it is best to make firing decisions using data that is representative of the motion moving toward the target and not away from it. Such a decision can be made by using constraint conditions based upon the pointing angle distance from the target and the rates of change of this distance. - The distance from the target center defined in the plane of the target is calculated from the Pythagorean Theorem, using the X and Y components. After calculating the distance from the target in each of the 3 previous frames, the first and second derivatives of the distance from the target center can be obtained for the most recent frame in the set of three. The distance from target and the first and second derivatives of this distance allow for the setup of powerful constraint conditions on when to fire the rifle.
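These distance and derivative constraint conditions can be sketched as follows; the function names and the specific gating rule (fire only while inside tolerance and still closing on the target) are illustrative assumptions.

```python
import math

def approach_metrics(pts):
    """Given the (x, y) pointing offsets from target centre in the 3 most
    recent frames (oldest first), return the newest miss distance and its
    first and second derivatives per frame."""
    d = [math.hypot(x, y) for x, y in pts]   # Pythagorean distance each frame
    d_rate = d[2] - d[1]                     # first derivative at newest frame
    d_accel = (d[2] - d[1]) - (d[1] - d[0])  # second derivative
    return d[2], d_rate, d_accel

def ok_to_fire(pts, tolerance):
    """Constraint check: close enough to the target and not moving away."""
    d, d_rate, _ = approach_metrics(pts)
    return d <= tolerance and d_rate <= 0.0
```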
- In the case where the gyrator is on, the
barrel 10 a motion is the combination of the Brownian-like motion arising from human drift and the elliptical motion from the gyrator, and is illustrated in FIG. 7. Data points on this figure are obtained by taking the recent frames of image data and finding, by pattern matching or other means, the point where the previously selected target is located. Points at this stage are given in pixel units as X and Y coordinates in a Cartesian coordinate plane. - The unequal lengths of the major and minor axes arise from the asymmetrical moment of inertia of the weapon. The vertical and horizontal components each trace out a sinusoid that wanders about due to human drift.
- In the preferred embodiment, the drifting sinusoid 72 of each component can be modeled as the addition of a polynomial and sine wave, i.e.,
f(t)=(A0+A1t+A2t²+ . . . )+B0 sin(B1t+B2)
where An and Bn are coefficients that can be determined by curve fitting techniques. The polynomial factor estimates the motion caused by human drift, and the sine function models the gyration motion. As an example, the X (horizontal) raw data from FIG. 7 is shown in FIG. 8. - To do the actual coefficient determination, a couple of non-linear fitting routines that minimize chi-square can be used: a simple Bevington grid search method along with a more sophisticated Levenberg-Marquardt program. However, the difficulty with non-linear routines is that to some extent they have to "fish" for a solution by probing in a trial-and-error fashion. Worse, there may be several solutions that minimize chi-square but are only locally valid, and miss the absolute minimum solution. In addition, if the coefficients are not seeded with numbers close to their ultimate values, the routines will not converge on the true solution and sometimes "fly away".
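Under the assumption that SciPy's Levenberg-Marquardt implementation stands in for the fitting programs mentioned above, the drifting sinusoid model and a seeded fit can be sketched as follows; the synthetic coefficients and seed values are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import curve_fit

def drifting_sinusoid(t, a0, a1, b0, b1, b2):
    # f(t) = (A0 + A1*t + ...) + B0*sin(B1*t + B2): polynomial drift + gyration
    return (a0 + a1 * t) + b0 * np.sin(b1 * t + b2)

# Synthetic X-component samples; the coefficients are made up for illustration.
t = np.linspace(0.0, 1.0, 200)
x = drifting_sinusoid(t, 5.0, 2.0, 3.0, 40.0, 0.5)

# Seed roughly as the text suggests: amplitude from the min/max spread and a
# frequency estimate that would, in practice, come from extrema spacing.
amp_seed = (x.max() - x.min()) / 2.0
seed = [x.mean(), 0.0, amp_seed, 40.0, 0.0]
coef, _ = curve_fit(drifting_sinusoid, t, x, p0=seed)
```

With seeds this close, the fit converges and the fitted model can then be evaluated at a future time to predict the barrel position; poor seeds reproduce the "fly away" behaviour described in the text.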
- To determine the coefficients of the motion equation, the following steps are taken. Decompose the positional data into X (horizontal) and Y (vertical) functions of time. A cubic spline fit 71 (
FIG. 8 ) is fit to each component. From this, the extrema (min and max) and inflection points of the sinusoid are extracted by processor 4. The min/max difference is used to seed the amplitude coefficient, B0. The time difference between successive inflection points (or extrema) is then used to seed the angular frequency coefficient, B1. The last inflection point is used to seed the phase, B2, or the phase can be left to float freely. A polynomial fit is then performed on the set of inflection points; this expresses the human drift in polynomial form and determines the coefficients An. Alternately, the fit can be made to the extrema points. The human drift is then removed from the data by subtracting out the polynomial function, leaving the sinusoidal component. A non-linear grid search is used to determine the sinusoidal coefficients, Bn, exactly. To improve the accuracy of the prediction, the last few data points are weighted by assigning them artificially small standard deviations. - At this point, the coefficients of An and Bn are determined by
processor 4, and thus the expected barrel 10 a position can be evaluated from the drifting sinusoid, f(t) 72, in the X coordinate. The Y coordinate is fit in a similar manner. - The invention uses several techniques to handle accurately aiming and firing on moving targets. Target identification algorithms operate on each frame of video data to identify and track the target in real time.
Rate sensors 5 are used to separate out gun motion from target motion so the anticipated trajectory of the target can be calculated by processor 4. This processing is done by the processor or ballistic computer 4. The location of the image in the X-Y plane of the field of view can be readily obtained by this method. - Since the time of flight of a ballistic round can be many milliseconds even at short ranges, a "lead" is usually necessary to hit a fast moving target. The invention can calculate the "lead" necessary for hitting a moving target by determining the target's trajectory using the image based tracking techniques described earlier relative to
FIG. 6 and by determining the target's range. -
Range determination 36 can be done using a variety of techniques and/or sensors to determine a range or distance to the target, such as target 50 in FIG. 9. For example, if the size of the target is known, range can be calculated from the image data associated with the target by measuring the pixel width of the target and using the formula below:
D=s/tan[(Tpw*FOV)/Ipw]
where D is the distance to the target, s is the physical target width, Tpw is the target width in pixels, Ipw is the image size in pixels, and FOV is the effective angular field of view of the imaging system. A range sensor 34 (FIG. 4) may also optionally be directly incorporated into the invention to determine range. Its output is fed into the ballistic computer processor 4 and used to calculate the ballistic solution. This "lead" factors into a ballistic solution 43, 44 represented in FIG. 4. The ballistic solution 43 is used by processor 4, along with other factors such as windage, gravity drop, ammunition, gun type, and environmental effects, to determine the angular position the gun needs to be in before the processor 4 energizes the firing trigger 9 to fire the weapon 10. - Depending on the particular target and scene requirements, the invention can use a variety of image processing techniques to track a moving target in successive video frames. Geometric pattern matching is one example of such a technique that can account for both scaling and rotation of the moving target.
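The range formula can be sketched directly; the assumption here is that FOV is expressed in radians and the widths are in consistent physical and pixel units.

```python
import math

def range_to_target(s, t_pw, i_pw, fov):
    """D = s / tan((Tpw * FOV) / Ipw).

    s     physical target width (e.g. meters)
    t_pw  target width in the image (pixels)
    i_pw  image width (pixels)
    fov   effective angular field of view (radians)
    """
    subtended = (t_pw * fov) / i_pw      # angle the target subtends on the sensor
    return s / math.tan(subtended)
```

At small angles the tangent is nearly linear, so halving the apparent pixel width roughly doubles the computed range.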
- The choice of lens and
camera 1 to incorporate into the invention can depend on a variety of factors, including the particular tracking requirements of the target scene. For example, for fast moving targets, an imaging system with larger aperture optics, a wider field of view lens, high frame rate video, and a higher resolution imager will often be necessary to simultaneously lead, track, and resolve the target. - The
video camera 1 used in the invention has an electronic shutter capability to adjust the exposure time of each video frame. This is particularly important for both static and moving targets, since in either case the gun and/or the target can be moving. A short exposure time is often needed to avoid blurring of the target image in the video frame. Typical shutter settings of 1/500 to 1/5000 of a second are common, depending on the motion, lighting, and the f# of the lens. - The operation will now be described from the shooter's point of view. The sequence of operations is as follows:
-
- 1) The shooter points the
weapon 10 at the target and a button 57 is pressed to freeze the image captured by the digital camera 1, FIG. 2, which is rendered on the image display 3, FIG. 2. At this point, only an approximate aim is necessary. - 2) The shooter views the frozen image on the
display 3, FIG. 2, and positions crosshairs controlled by the mouse 53 (FIG. 5) in the user interface 2, FIG. 2, over the desired target. The weapon 10 is still pointed approximately at the target. - 3) The
weapon 10 is then put in auto-fire mode by pulling a pseudo trigger 10 b (FIG. 1). - 4) The
barrel gyrator 6, FIG. 2, activates and wobbles the barrel 10 a, which causes its aiming direction to sweep out a sizeable area at the target distance as the shooter slowly sweeps the weapon 10 over the target. The speed of the barrel rotation is much faster than the motion caused by the shooter's natural drift, so at some point of an over-target sweep, the barrel points directly, or nearly directly, at the target.
- Without intervention on the shooter's part, successive image frames are captured by
camera 1 and processed by frame capture electronics 8. For each frame of data captured, the target is located, and a series of previous frames is analyzed to predict the anticipated time at which the barrel will point directly at the target in the manner described relative to FIGS. 6-9.
- 5) At the exact instant when the
barrel 10 a aligns with the target, less the time delay for the trigger mechanism, and compensated for effects such as gravity drop and environmental conditions, the shot is automatically discharged by means of the trigger mechanism 9, FIG. 1. - 6) Finally, the projectile is launched and hits the target.
- As alluded to earlier, the system can work without the barrel gyrator. This decreases the system performance and forces a barrel/target alignment to fire. In the case where a target is static and has a stationary background, it is possible to operate the system without motion detectors. In this scenario, the optical identification part of the system is sufficient to fire accurately.
- The
digital camera 1 is typically a conventional visible light imaging device, but thermal infrared and night vision imaging devices can also be used with the invention.
image display 3. - Alternatively, the
system 100 can work without arotating muzzle gyrator 6 in cases where is there is ample time to direct the weapon directly at the target. The above steps 1-3 are performed: a sighting is made, image captured, and crosshairs placed on the target. Then thebarrel 10 a is aimed at the target and slowly wandered about. When thebarrel 10 a is pointed directly at the target, less corrections, theprocessor 4 energizes firingtrigger 9 to cause the round to be automatically fired. - This invention augments a shooter's ability to pull the trigger at the exact instant to hit a target by substituting an electronic means. The ramifications are profound, in that it may enable the shooter to increase the change he will hit his target virtually every time.
- A system flow processing diagram of the preferred embodiment is shown in
FIG. 10. Digital video 90 of the target scene is acquired by the digital camera 1 and is then transferred to the computer processor 4. Once a user sees a target of interest on the display 3, a single frame of the video can be captured and displayed on the image display screen 3 for target selection (block 92). The user selects 92 the target by moving a cursor to define the target with a region of interest box or similar marking method. Once the user has selected the target, an authorization to shoot 93 is made by the user by depressing a button or similar input control on the user interface. - With the target selected, the
computer processor 4 uses the target's unique image properties to find the target in subsequent video frames as they are acquired by the digital camera 1. The identification and location is done using pyramiding normalized greyscale correlation, geometric pattern matching, image differencing, and other image processing techniques currently known in the art. Once a new frame of the video is acquired 94, the target location is found in the frame 96. This location is used to determine the pointing direction of the target relative to the current position of the gun. - The current gun pointing direction can be determined by integrating the inertial motion rate sensor signals to determine pointing direction at any instant of time relative to an inertial frame of
reference 95. The inertial motion rate sensors can give measurement of gun pointing direction independent of imaging data and can allow for high frequency sampling of gun pointing direction that can exceed the frequency of target tracking using video frame rate measurements. - Prediction of gun pointing direction can be made using inertial
motion sensor data 95 and, in the case of a stationary target, by also using the target tracking data 96. However, in the general embodiment of the system 100, inertial motion sensor data is the primary data source for gun pointing direction information due to the low processing demands, high bandwidth performance, and other system benefits of this method of acquisition. - In the preferred embodiment, the algorithm used to predict future gun pointing direction depends on whether the barrel is gyrating (decision block 97). A rate of
change method 98 is used in the case of no barrel gyration, and a drifting sinusoid method 99 is used when barrel gyration is occurring. The prediction of gun pointing direction typically needs to be several to many milliseconds in the future to account for data acquisition time, processing time, trigger time, ignition time, and round bore time between when the pointing data was acquired and the time when muzzle exit could occur. - If the target is not moving (block 100), prediction of actual target position is not needed since the target is always stationary. In the
non-moving case 100, the system then calculates (block 103) the range to target and proceeds to the ballistics calculation 104 to determine if the gun is currently at the correct aim point to hit the target. - In the moving
target case 100, both the range to target 101 at time of bullet impact and the gun lead 102 required at round muzzle exit to hit the target need to be predicted. Since the true target trajectory needs to be known for these calculations, gun pointing information obtained from the inertial rate sensors is used to isolate target motion in the video tracking information from that of weapon 10 motion. Once the target trajectory is known, predicted range and lead can be calculated by processor 4 in the course of calculating the general ballistic solution 104. - Once a
ballistic solution 104 is determined, the system 100 evaluates whether the weapon 10 is aimed correctly to hit the target 107. The ballistic solution can incorporate other sensor data 105 and gun/ammunition performance data 106. If the weapon's 10 aim point is within the accuracy threshold configured in the system by the user, a countdown timer 108 will be set to fire 109 at precisely the time indicated by the ballistic calculations at block 104. In the preferred embodiment, the firing is by electrical means, via firing trigger 9 (FIG. 1), to allow for minimal ignition delays. - If the
weapon 10 aim point is currently not at the correct position 107, the system 100 loops back to tracking gun and target motion to collect new gun pointing data and/or image tracking data. - There are several modes in which this invention can operate, and it is not essential for all the
system 100 components to be functioning at the same time. The more components that are functioning, however, the greater the increase in capability. FIG. 11 diagrams the increasing system performance with the exercise of more system components. - The essential group of components is the
digital camera 1 and the basic system items: user interface 2, image display 3, processor 4, and frame capture electronics 8. With this configuration, each digital image frame is analyzed, allowing a determination of the barrel direction and a prediction of when the weapon will point directly at the target. In this mode, the shooter manually sweeps the gun about until the gun aligns with the target. This enables the hitting of stationary targets with great (beyond human) accuracy without the need of a rigid weapon support. - With the addition of the
barrel gyrator 6, the weapon is mechanically oscillated, which causes the sweeping aim point to cover much more area in a given amount of time. One way to visualize this is that instead of a precise aiming point, a "spotlight" is being swept around the target area. This causes the weapon to find its target much faster, and can be used with stationary and moving targets. - Finally, with the addition of
motion sensors 5, the instantaneous barrel motion can be detected and integrated over time to provide a barrel pointing direction. This information can be used to frequently determine barrel position independently of image data. As a result, the accuracy of the system is enhanced. In addition, the information can be used to efficiently target both stationary and moving targets. - 1. Night Vision and Processor Aided Firing of Small Arms
- Up until now, the simulated shooting of the processor aided firing of small arms system has been performed under daylight conditions. The obvious extension is to carry out experiments with technologies that enable nighttime warfare. At this point, it is proposed that a survey of existing battlefield cameras be performed. These can be thermal sensors or light amplification types. Candidate sensors could first be checked by studying their performance data and, if practical, placed in the experimental system. Actual testing can then be performed in the field.
- 2. An Anti-Fratricide System and Processor Aided Firing of Small Arms
- Another modification of this system is to equip every squad member with an accurate Global Positioning System (GPS) receiver, a digital compass (e.g., Honeywell HMR 3000 Digital Compass Module) on their weapon, and a WLAN radio. The position of each squad member is shared with every other member over the WLAN. By having the positions of all other "friendlies," it is immediately known, by reading the bearing and inclination from the digital compass, whether the weapon is pointed at a squad member. At this point, an alarm (not shown) can alert the soldier or, when incorporated into the Processor Aided Firing of Small Arms system, inhibit the firing of the weapons.
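A bearing-only sketch of this check (inclination would be handled analogously) might look as follows; the great-circle bearing helper, the 2° tolerance, and the function names are illustrative assumptions rather than parts of the disclosed system.

```python
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from (lat1, lon1) to (lat2, lon2), degrees."""
    lat1, lon1, lat2, lon2 = map(math.radians, (lat1, lon1, lat2, lon2))
    dlon = lon2 - lon1
    y = math.sin(dlon) * math.cos(lat2)
    x = math.cos(lat1) * math.sin(lat2) - math.sin(lat1) * math.cos(lat2) * math.cos(dlon)
    return math.degrees(math.atan2(y, x)) % 360.0

def aimed_at_friendly(own_pos, compass_bearing, friendly_positions, tol_deg=2.0):
    """True if the weapon's compass bearing lies within tol_deg of the bearing
    to any squad member, in which case the alarm/fire-inhibit would trigger."""
    for pos in friendly_positions:
        b = bearing_deg(own_pos[0], own_pos[1], pos[0], pos[1])
        diff = abs((compass_bearing - b + 180.0) % 360.0 - 180.0)  # wrap to ±180
        if diff <= tol_deg:
            return True
    return False
```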
- While the methods herein described, and the forms of apparatus for carrying these methods into effect, constitute preferred embodiments of this invention, it is to be understood that the invention is not limited to these precise methods and forms of apparatus, and that changes may be made in either without departing from the scope of the invention disclosed herein.
Claims (75)
1. A weapon comprising:
a firearm having a barrel and a user interface;
a barrel oscillator for oscillating the barrel in a predetermined pattern;
an image capture device mounted on said firearm for capturing a plurality of image frames of a target and generating image data in response thereto;
at least one motion sensor mounted on said firearm for sensing a motion of the barrel and generating motion data in response thereto; and
a processor coupled to said user interface, said image capture device and said at least one motion sensor;
said processor enabling a user to select a target and in response thereto, causing said image capture device to capture said plurality of images and generate said image data which is used along with said motion data to determine a predicted target location and coverage point where said barrel covers said target upon which said processor may energize said firearm to fire a projectile.
2. The weapon as recited in claim 1 wherein said user interface comprises a target selector locating a cross-hair on said target.
3. The weapon as recited in claim 2 wherein said target selector comprises a track ball for positioning said cross-hair.
4. The weapon as recited in claim 1 wherein said user interface comprises a trigger mechanism for receiving a fire signal from said processor and for implementing a projectile launch signal in response thereto.
5. The weapon as recited in claim 1 wherein said firearm comprises a gun or rifle.
6. The weapon as recited in claim 1 wherein said image capture device comprises a charge-coupled device.
7. The weapon as recited in claim 1 wherein said processor comprises at least one algorithm that determines a velocity vector for said target by determining a location of said target relative to a center of an image frame in a plurality of said plurality of image frames and calculating a plurality of position vectors in response thereto and using said position vectors to determine a displacement vector which in turn is used to calculate a velocity vector for said target.
8. The weapon as recited in claim 1 wherein said processor comprises a barrel position detection algorithm for detecting a position of said barrel.
9. The weapon as recited in claim 8 wherein said barrel position detection algorithm receives said image data and processes said image data by applying a cubic spline fit to provide a drifting sinusoid generally corresponding to a movement of said barrel.
10. The weapon as recited in claim 7 wherein said barrel position detection algorithm receives said image data and processes said image data by applying a cubic spline fit to provide a drifting sinusoid generally corresponding to a movement of said barrel.
11. The weapon as recited in claim 1 wherein said processor generates said predicted location of said target in response to said image data and said motion data.
12. The weapon as recited in claim 1 wherein said barrel oscillator comprises a gyrator.
13. The weapon as recited in claim 12 wherein said gyrator comprises:
a bearing mounted on said barrel;
a drive motor coupled to said bearing in response to a drive signal from said processor, said drive motor rotatably driving said bearing to cause said barrel to gyrate in a predetermined manner.
14. The weapon as recited in claim 13 wherein said user interface comprises an electronic button for initiating a gyration sequence during which said processor generates said drive signal.
15. A weapon comprising
a firearm comprising a barrel;
an imager mounted to said barrel for capturing an image of a target area;
a user interface for displaying said image, said user interface comprising a trigger for selecting a target within said image area; and
a processor coupled to said user interface and said imager for determining a future target location of said target and for automatically firing said firearm when said barrel is positioned in a firing position such that a projectile discharged from said firearm will hit the target selected by the user.
16. The weapon as recited in claim 15 wherein said weapon further comprises:
an electronic firing trigger coupled to said processor for firing the weapon.
17. The weapon as recited in claim 15 wherein said user interface further comprises a firing authorization trigger coupled to said processor for enabling a user to authorize firing the firearm after a target has been selected but before said automatic firing of said firearm.
18. The weapon as recited in claim 15 wherein said weapon further comprises:
at least one motion sensor coupled to said processor for sensing an angular position of said barrel;
said processor comprising a barrel tracking algorithm for receiving said angular position and for predicting a future barrel position for said barrel; said processor generating
19. The weapon as recited in claim 15 wherein said processor comprises a video processing algorithm for receiving a plurality of images from said imager and for predicting said future target location of said target in response thereto.
20. The weapon as recited in claim 18 wherein said processor comprises a video processing algorithm for receiving a plurality of images from said imager and for predicting said future target location of said target in response thereto.
21. The weapon as recited in claim 15 wherein said processor comprises a video processing algorithm for receiving a plurality of images from said imager and for predicting said future target location of said target as well as a future barrel position in response thereto.
22. The weapon as recited in claim 15 wherein said weapon further comprises a gyrator mounted to the barrel for gyrating the barrel in a generally consistent motion.
23. The weapon as recited in claim 22 wherein said gyrator comprises:
a bearing mounted on said barrel;
a drive motor coupled to said bearing in response to a drive signal from said processor, said drive motor rotatably driving said bearing to rotate about said barrel to cause said barrel to gyrate in said generally consistent motion.
24. The weapon as recited in claim 23 wherein said gyrator comprises a weight mounted to said bearing.
25. The weapon as recited in claim 15 wherein said firearm comprises a gun or rifle.
26. The weapon as recited in claim 15 wherein said imager is a digital camera.
27. The weapon as recited in claim 15 wherein said imager is a charge-coupled device.
28. A gyrator for gyrating a barrel of a firearm, said gyrator comprising:
a bearing for mounting on said barrel of said firearm; and
a drive motor coupled to said bearing for rotatably driving said bearing to cause an end of said barrel to gyrate.
29. The gyrator as recited in claim 28 wherein said gyrator comprises a weight mounted to said bearing.
30. A weapon comprising:
a firearm comprising a barrel;
a gyrator mounted on said barrel for gyrating said barrel in a consistent motion;
an imager mounted to said firearm for capturing a plurality of images of an area;
a user interface for displaying at least one of said plurality of images, said user interface comprising a trigger for selecting a target within said at least one of said plurality of images; and
a processor coupled to said user interface, said imager and said gyrator, said processor receiving image data corresponding to one or more of said plurality of images captured and causing said firearm to automatically discharge a projectile from said firearm when said barrel is positioned in a firing position such that said projectile will hit the target selected by the user.
31. The weapon as recited in claim 30 wherein said weapon further comprises:
an electronic firing trigger coupled to said processor for firing the weapon.
32. The weapon as recited in claim 30 wherein said weapon comprises:
a user interface coupled to said processor for displaying said at least one of said plurality of images.
33. The weapon as recited in claim 32 wherein said user interface further comprises a firing authorization trigger coupled to said processor for enabling a user to authorize firing the firearm after a target has been selected but before said automatic firing of said firearm.
34. The weapon as recited in claim 30 wherein said weapon further comprises:
at least one motion sensor coupled to said processor for sensing an angular position of said barrel;
said processor comprising a barrel tracking algorithm for receiving said angular position and for predicting a future barrel position for said barrel.
35. The weapon as recited in claim 30 wherein said processor comprises a video processing algorithm for receiving data corresponding to said plurality of images and for predicting a future target location of said target in response thereto.
36. The weapon as recited in claim 34 wherein said processor comprises a video processing algorithm for receiving image data corresponding to said plurality of images from said imager and for predicting a future target location of said target in response thereto.
37. The weapon as recited in claim 30 wherein said processor comprises a video processing algorithm for receiving image data corresponding to said plurality of images from said imager and for predicting a future target location of said target as well as a future barrel position in response thereto.
38. The weapon as recited in claim 30 wherein said gyrator comprises:
a bearing mounted on said barrel;
a drive motor coupled to said bearing and responsive to a drive signal from said processor, said drive motor rotatably driving said bearing to rotate about said barrel to cause said barrel to gyrate in said consistent motion.
39. The weapon as recited in claim 38 wherein said gyrator comprises a weight mounted to said bearing.
40. The weapon as recited in claim 30 wherein said firearm comprises a gun or rifle.
41. The weapon as recited in claim 30 wherein said imager is a digital camera.
42. The weapon as recited in claim 30 wherein said imager is a charge-coupled device.
43. An automatic firing system for use with a firearm, comprising:
an image capture device mounted on said firearm for capturing a plurality of images of an area in front of a muzzle end of said firearm; and
a processor coupled to said image capture device for processing data associated with said plurality of images and for determining an optimum firing time to discharge a bullet from said firearm in order to hit a target selected by a user.
44. The automatic firing system as recited in claim 43 wherein said system further comprises:
a gyrator for gyrating an end of a barrel of said firearm while said image capture device captures said plurality of images.
45. The automatic firing system as recited in claim 43 wherein said automatic firing system further comprises:
at least one motion sensor coupled to said processor for sensing an angular position of said firearm;
said processor comprising a tracking algorithm for receiving said angular position and for predicting a future position of said muzzle end in response thereto.
46. The automatic firing system as recited in claim 43 wherein said processor comprises a video processing algorithm for receiving said data corresponding to said plurality of images and for predicting a future target location of said target in response thereto.
47. The automatic firing system as recited in claim 45 wherein said processor comprises a video processing algorithm for receiving said data corresponding to said plurality of images and for predicting a future target location of said target in response thereto.
48. The automatic firing system as recited in claim 43 wherein said processor comprises a video processing algorithm for receiving image data corresponding to said plurality of images from said image capture device and for predicting a future target location of said target as well as a future barrel position in response thereto.
49. A method for increasing accuracy of hitting a target with a firearm, said method comprising the steps of:
capturing a plurality of images of a target area including the target;
processing said plurality of images to predict an optimum firing condition; and
discharging the firearm when said optimum firing condition is achieved.
50. The method as recited in claim 49 wherein said optimum firing condition is when a muzzle end of the firearm covers or leads said target such that when a projectile is discharged from the firearm, it will hit the target.
51. The method as recited in claim 49 wherein said method further comprises the step of:
using image data points generally corresponding to a muzzle end of said firearm to determine said optimum firing condition.
52. The method as recited in claim 49 wherein said method further comprises the step of:
causing a barrel of the firearm to move during said capturing step.
53. The method as recited in claim 49 wherein said method further comprises the step of:
using a plurality of motion sensors to determine a position of said target and a position of a muzzle end of said barrel in order to determine said optimum firing condition.
54. The method as recited in claim 49 wherein said method further comprises the steps of:
determining a first function representing a position of said target and a position of a muzzle end of said barrel;
determining a second function representing a barrel position of said barrel;
determining a difference between said first function and said second function to determine a location of said target.
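Claim 54 determines the target location as the difference between a first function (target and muzzle position measured together) and a second function (barrel position alone). A hedged reading, with assumed names: because the imager rides on the moving firearm, it measures the target relative to the barrel; subtracting the barrel's own motion, obtained from the motion sensors, isolates the target's actual motion.

```python
# Hypothetical sketch of the two-function subtraction of claim 54.
def target_location(relative_track, barrel_track):
    """Element-wise difference of the two sampled position functions."""
    return [r - b for r, b in zip(relative_track, barrel_track)]

# The target appears to drift in the image only because the barrel drifts
# the opposite way; subtracting reveals a stationary target:
relative = [10.0, 9.0, 8.0, 7.0]    # target as seen by the weapon-mounted camera
barrel   = [0.0, -1.0, -2.0, -3.0]  # barrel position from motion sensors
print(target_location(relative, barrel))  # [10.0, 10.0, 10.0, 10.0]
```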
55. The method as recited in claim 54 wherein said method further comprises the step of:
using a plurality of motion sensors to provide position data for use by a processor to calculate said first and second functions.
56. The method as recited in claim 54 wherein said method further comprises the step of:
using image data from said at least one of said plurality of images to calculate said first and second functions.
57. The method as recited in claim 49 wherein said capturing step is performed with a digital camera mounted to said firearm.
58. The method as recited in claim 54 wherein said method further comprises the step of:
providing a user interface on the firearm for enabling a user to select the target from a video display.
59. The method as recited in claim 58 wherein said user interface comprises a track ball for placing cross hairs on said target.
60. The method as recited in claim 59 wherein said method further comprises the step of:
performing said capturing and processing steps while said target is moving.
61. The method as recited in claim 52 wherein said method further comprises the step of:
performing said capturing and processing steps while said target is moving.
62. The method as recited in claim 61 wherein said method further comprises the step of:
performing said causing step using a barrel gyrator.
63. The method as recited in claim 21 wherein said method further comprises the step of:
causing a barrel of the firearm to move during said capturing step.
64. A firing system that automatically launches a projectile, comprising:
a) a barreled firearm,
b) an electronic digital camera that supplies the electronic processor with rapid, digital, repetitive frame information,
c) motion sensors that supply the electronic processor with angular rate information,
d) an electronic display, which is able to display the image data from the digital camera and display cross hairs for target identification,
e) a computer mouse, which interacts with the electronic processor and is able to position the cross hairs to identify the desired target,
f) an electronic processor, that receives data from the electronic digital camera, motion sensors, computer mouse, and transmits images to the electronic display, executes barrel prediction algorithms while analyzing the motion generated by human drift and mechanically forced motion from the barrel gyrator and finally transmits a fire signal to the trigger mechanism,
g) a trigger mechanism, which implements the projectile launch signal generated by the electronic processor,
h) a barrel gyrator, which forces an orbital motion on the firearm, which is analyzed by the electronic processor.
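Claim 64's processor receives camera frames and motion-sensor data, runs barrel prediction algorithms, and transmits a fire signal to the trigger mechanism. One way to organize that decision step (an illustrative sketch under assumed names and tolerances, not the patent's disclosed implementation) is to fire only on the sample where the predicted aim error falls inside a tolerance:

```python
# Hypothetical fire-signal decision for the claim 64 processor loop.
def fire_control_step(predicted_barrel, predicted_target, tolerance=0.5):
    """Return True (fire signal) when the predicted aim error is within tolerance."""
    return abs(predicted_barrel - predicted_target) <= tolerance

# The gyrating barrel sweeps past the selected target; only the crossing
# sample produces a fire signal:
barrel_path = [3.0, 1.5, 0.2, -1.0]   # predicted barrel positions per frame
target_pos = 0.0                      # predicted target position
signals = [fire_control_step(b, target_pos) for b in barrel_path]
print(signals)  # [False, False, True, False]
```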
65. The said barreled firearm of claim 64 is a rifle.
66. The said barreled firearm of claim 64 is a pistol.
67. The said electronic digital camera of claim 64 is a Charge Coupled Device camera.
68. The said electronic display of claim 64 is a monocular type positioned over the eye.
69. The said computer mouse of claim 64 is a miniature trackball.
70. The said electronic processor of claim 64 is capable of calculating and correcting for target range and atmospheric conditions.
71. The said electronic processor of claim 64 is a microprocessor system.
72. The said electronic processor of claim 64 is programmable logic circuitry.
73. The said barrel gyrator of claim 64 is a motor with an eccentric weight attached on its shaft.
74. The said trigger mechanism of claim 64 electrifies an electrically-ignited cartridge.
75. A firing system that automatically launches a projectile, comprising:
(a) a barreled firearm;
(b) an electronic digital camera that supplies the electronic processor with rapid, digital, repetitive frame information;
(c) an electronic display, which is able to display the image data from the digital camera and display cross hairs for target identification;
(d) a computer mouse, which interacts with the electronic processor and is able to position the cross hairs to identify the desired target;
(e) an electronic processor, that receives data from the electronic digital camera, computer mouse, transmits the data to the electronic display, and runs barrel prediction algorithms while analyzing the motion generated by human drift and transmits a fire signal to the trigger mechanism; and
(f) a trigger mechanism, which implements projectile launch by a signal from the electronic processor.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US10/938,321 US20060005447A1 (en) | 2003-09-12 | 2004-09-10 | Processor aided firing of small arms |
PCT/US2004/029975 WO2005080908A2 (en) | 2003-09-12 | 2004-09-13 | Processor aided firing of small arms |
EP04821469A EP1676090A2 (en) | 2003-09-12 | 2004-09-13 | Processor aided firing of small arms |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US50269303P | 2003-09-12 | 2003-09-12 | |
US10/938,321 US20060005447A1 (en) | 2003-09-12 | 2004-09-10 | Processor aided firing of small arms |
Publications (1)
Publication Number | Publication Date |
---|---|
US20060005447A1 true US20060005447A1 (en) | 2006-01-12 |
Family
ID=35539816
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US10/938,321 Abandoned US20060005447A1 (en) | 2003-09-12 | 2004-09-10 | Processor aided firing of small arms |
Country Status (3)
Country | Link |
---|---|
US (1) | US20060005447A1 (en) |
EP (1) | EP1676090A2 (en) |
WO (1) | WO2005080908A2 (en) |
Cited By (92)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20050018041A1 (en) * | 2003-07-21 | 2005-01-27 | Towery Clay E. | Electronic firearm sight, and method of operating same |
US20050188826A1 (en) * | 2003-05-23 | 2005-09-01 | Mckendree Thomas L. | Method for providing integrity bounding of weapons |
US20060107578A1 (en) * | 2000-11-13 | 2006-05-25 | R.A. Brands, Llc | Actuator assembly |
US20060114519A1 (en) * | 2000-10-02 | 2006-06-01 | Eastman Kodak Company | Articulating camera for digital image acquisition |
US7162806B1 (en) * | 2005-03-21 | 2007-01-16 | Travis Swiggart | Video sighting system |
US20070166669A1 (en) * | 2005-12-19 | 2007-07-19 | Raydon Corporation | Perspective tracking system |
US20070166668A1 (en) * | 2005-12-22 | 2007-07-19 | Maximillian Kusz | Optical sighting device for small arms |
US20080022575A1 (en) * | 2006-05-08 | 2008-01-31 | Honeywell International Inc. | Spotter scope |
US20080094360A1 (en) * | 2006-10-20 | 2008-04-24 | Sunplus Technology Co., Ltd. | Computer mouse having a front sight button and method for generating local coordinates with the same |
US20080160486A1 (en) * | 2006-06-19 | 2008-07-03 | Saab Ab | Simulation system and method for determining the compass bearing of directing means of a virtual projectile/missile firing device |
US20080163536A1 (en) * | 2005-03-18 | 2008-07-10 | Rudolf Koch | Sighting Mechansim For Fire Arms |
US20090037374A1 (en) * | 2007-07-30 | 2009-02-05 | International Business Machines Corporation | Method and system for reporting and relating firearm discharge data to a crime reporting database |
US20090133572A1 (en) * | 2005-12-29 | 2009-05-28 | Men At Work Ltd. | Boresighting system and method |
US20100100321A1 (en) * | 2008-10-16 | 2010-04-22 | Michael Koenig | System and method for use of a vehicle back-up camera as a dead-reckoning sensor |
US20100196859A1 (en) * | 2009-02-01 | 2010-08-05 | John David Saugen | Combat Information System |
US20100309224A1 (en) * | 2004-03-31 | 2010-12-09 | Canon Kabushiki Kaisha | Image displaying method, image displaying program, and display |
WO2011096854A1 (en) * | 2010-02-02 | 2011-08-11 | Saab Ab | Method and arrangements for firing a fire arm |
US20110315767A1 (en) * | 2010-06-28 | 2011-12-29 | Lowrance John L | Automatically adjustable gun sight |
US20120021385A1 (en) * | 2006-11-24 | 2012-01-26 | Trex Enterprises Corp. | Celestial weapons orientation measuring system |
US20120170815A1 (en) * | 2010-12-29 | 2012-07-05 | Kwong Wing Au | System and method for range and velocity estimation in video data as a function of anthropometric measures |
US20120212622A1 (en) * | 2011-02-17 | 2012-08-23 | Kabushiki Kaisha Toshiba | Moving object image tracking apparatus and method |
WO2012121735A1 (en) * | 2011-03-10 | 2012-09-13 | Tesfor, Llc | Apparatus and method of targeting small weapons |
WO2012131548A1 (en) | 2011-03-28 | 2012-10-04 | Smart Shooter Ltd. | Firearm, aiming system therefor, method of operating the firearm and method of reducing the probability of missing a target |
US20120297654A1 (en) * | 2011-05-26 | 2012-11-29 | The Otis Patent Trust | Firearm sensor system |
US8336776B2 (en) | 2010-06-30 | 2012-12-25 | Trijicon, Inc. | Aiming system for weapon |
EP2536995A2 (en) * | 2010-02-16 | 2012-12-26 | TrackingPoint, Inc. | Advanced firearm or air gun scope |
US20130152447A1 (en) * | 2009-12-18 | 2013-06-20 | Vidderna Jakt & Utbildning Ab | Aiming device with a reticle defining a target area at a specified distance |
US8555771B2 (en) * | 2009-03-18 | 2013-10-15 | Alliant Techsystems Inc. | Apparatus for synthetic weapon stabilization and firing |
US20130286216A1 (en) * | 2012-04-30 | 2013-10-31 | Trackingpoint, Inc. | Rifle Scope Including a Circuit Configured to Track a Target |
US20130326923A1 (en) * | 2012-06-07 | 2013-12-12 | Dr. Erez Gur Ltd. | Method and device useful for aiming a firearm |
EP2694908A2 (en) * | 2011-04-01 | 2014-02-12 | Zrf, Llc | System and method for automatically targeting a weapon |
US20140118723A1 (en) * | 2012-10-29 | 2014-05-01 | Teledyne Scientific & Imaging, Llc | System for determining the spatial orientation of a movable apparatus |
US20140168447A1 (en) * | 2012-12-18 | 2014-06-19 | Trackingpoint, Inc. | Optical Device Including a Mode for Grouping Shots for Use with Precision Guided Firearms |
US8777620B1 (en) * | 2006-08-15 | 2014-07-15 | Triggermaster, Inc. | Firearm trigger pull training system and methods |
US20140283430A1 (en) * | 2011-09-09 | 2014-09-25 | Lasermax, Inc. | Target marking system |
WO2014169107A1 (en) | 2013-04-11 | 2014-10-16 | Hall Christopher J | Automated fire control device |
KR101468160B1 (en) * | 2012-12-21 | 2014-12-05 | 주식회사 도담시스템스 | Training system for improving shooting accuracy and its control method |
EP2811252A1 (en) * | 2013-06-07 | 2014-12-10 | TrackingPoint, Inc. | Precision guided firearm including an optical scope configured to determine timing of discharge |
EP2811253A1 (en) * | 2013-06-07 | 2014-12-10 | TrackingPoint, Inc. | Precision guided firearm with hybrid sensor fire control |
US8911235B1 (en) | 2006-08-15 | 2014-12-16 | Triggermaster, Inc. | Shooting training device |
US20150211828A1 (en) * | 2014-01-28 | 2015-07-30 | Trackingpoint, Inc. | Automatic Target Acquisition for a Firearm |
US9127909B2 (en) | 2013-02-17 | 2015-09-08 | Smart Shooter Ltd. | Firearm aiming system with range finder, and method of acquiring a target |
US9151564B1 (en) | 2006-08-15 | 2015-10-06 | Triggermaster, Inc. | Firearm trigger pull training system and methods |
US9163894B1 (en) * | 2011-10-28 | 2015-10-20 | Lockheed Martin Corporation | Laser transmission system for use with a firearm in a battle field training exercise |
EP2950032A1 (en) * | 2014-05-27 | 2015-12-02 | Israel Weapon Industries (I.W.I.) Ltd. | An apparatus and method for improving hit probability of a firearm |
US9250035B2 (en) | 2013-03-21 | 2016-02-02 | Kms Consulting, Llc | Precision aiming system for a weapon |
US20160169625A1 (en) * | 2014-12-10 | 2016-06-16 | Flir Systems, Inc. | Electronic adaptive reticle systems and methods |
EP2943738A4 (en) * | 2013-01-10 | 2016-07-06 | Dale Albert Hodgson | Motorized weapon gyroscopic stabilizer |
US20160216082A1 (en) * | 2015-01-22 | 2016-07-28 | Colt Canada Corporation | Sensor pack for firearm |
US9435603B2 (en) * | 2014-04-16 | 2016-09-06 | Hanwha Techwin Co., Ltd. | Remote weapon system and control method thereof |
US9464871B2 (en) | 2009-09-11 | 2016-10-11 | Laurence Andrew Bay | System and method for ballistic solutions |
US20170059279A1 (en) * | 2010-05-04 | 2017-03-02 | Lasermax, Inc. | Encoded signal detection and display |
US20170146319A1 (en) * | 2015-11-19 | 2017-05-25 | Philip Scott Lyren | Firearm System that Tracks Points of Aim of a Firearm |
US9702662B1 (en) * | 2015-12-22 | 2017-07-11 | Huntercraft Limited | Electronic sighting device with real-time information interaction |
US9823040B1 (en) * | 2016-08-23 | 2017-11-21 | Shih-Che Hu | Gun barrel unit for a toy gun |
US9823043B2 (en) | 2010-01-15 | 2017-11-21 | Colt Canada Ip Holding Partnership | Rail for inductively powering firearm accessories |
US9891023B2 (en) | 2010-01-15 | 2018-02-13 | Colt Canada Ip Holding Partnership | Apparatus and method for inductively powering and networking a rail of a firearm |
US9897411B2 (en) | 2010-01-15 | 2018-02-20 | Colt Canada Ip Holding Partnership | Apparatus and method for powering and networking a rail of a firearm |
US9921028B2 (en) | 2010-01-15 | 2018-03-20 | Colt Canada Ip Holding Partnership | Apparatus and method for powering and networking a rail of a firearm |
US20190003803A1 (en) * | 2016-02-03 | 2019-01-03 | Vk Integrated Systems | Firearm electronic system |
US10203179B2 (en) | 2012-01-11 | 2019-02-12 | Dale Albert Hodgson | Motorized weapon gyroscopic stabilizer |
US20190056198A1 (en) * | 2016-02-24 | 2019-02-21 | James Anthony Pautler | Skeet and Bird Tracker |
US10228208B2 (en) | 2017-03-08 | 2019-03-12 | Sturm, Ruger & Company, Inc. | Dynamic variable force trigger mechanism for firearms |
US20190113310A1 (en) * | 2017-09-15 | 2019-04-18 | Tactacam LLC | Weapon sighted camera system |
US10323894B2 (en) * | 2015-08-19 | 2019-06-18 | Paul Imbriano | Weapons system smart device |
US10337834B2 (en) | 2010-01-15 | 2019-07-02 | Colt Canada Ip Holding Partnership | Networked battle system or firearm |
RU192631U1 (en) * | 2018-06-26 | 2019-09-24 | Константин Александрович Идель | OFFLINE BALLISTIC COMPUTER FOR RUNNING WEAPONS |
US10470010B2 (en) | 2010-01-15 | 2019-11-05 | Colt Canada Ip Holding Partnership | Networked battle system or firearm |
US10477618B2 (en) | 2010-01-15 | 2019-11-12 | Colt Canada Ip Holding Partnership | Networked battle system or firearm |
US10477619B2 (en) | 2010-01-15 | 2019-11-12 | Colt Canada Ip Holding Partnership | Networked battle system or firearm |
US20200012856A1 (en) * | 2018-07-06 | 2020-01-09 | Meopta U.S.A., Inc. | Computer applications integrated with handheld optical devices having cameras |
US10648781B1 (en) * | 2017-02-02 | 2020-05-12 | Arthur J. Behiel | Systems and methods for automatically scoring shooting sports |
US10670361B2 (en) | 2017-03-08 | 2020-06-02 | Sturm, Ruger & Company, Inc. | Single loop user-adjustable electromagnetic trigger mechanism for firearms |
EP3663697A1 (en) * | 2018-12-09 | 2020-06-10 | Israel Weapon Industries (I.W.I.) Ltd. | Firearm controlled by user behaviour |
US10690430B2 (en) | 2017-03-08 | 2020-06-23 | Sturm, Ruger & Company, Inc. | Dynamic variable force trigger mechanism for firearms |
US10712116B1 (en) * | 2014-07-14 | 2020-07-14 | Triggermaster, Llc | Firearm body motion detection training system |
US10900732B2 (en) | 2017-03-08 | 2021-01-26 | Sturm, Ruger & Company, Inc. | Electromagnetic firing system for firearm with firing event tracking |
WO2021048307A1 (en) * | 2019-09-10 | 2021-03-18 | Fn Herstal S.A. | Imaging system for firearm |
EP3819585A1 (en) * | 2019-11-11 | 2021-05-12 | Israel Weapon Industries (I.W.I.) Ltd. | Firearm with automatic target acquiring and shooting |
EP3669135A4 (en) * | 2017-08-15 | 2021-08-04 | Paspa Pharmaceuticals Pty Ltd | Firearm stabilization device |
US20210364256A1 (en) * | 2020-04-21 | 2021-11-25 | Axon Enterprise, Inc. | Motion-based operation for a conducted electrical weapon |
US11231252B2 (en) * | 2020-06-10 | 2022-01-25 | Brett C. Bilbrey | Method for automated weapon system with target selection of selected types of best shots |
US11268789B2 (en) * | 2012-09-13 | 2022-03-08 | Christopher V. Beckman | Device controlling shooting based on firearm movement |
EP3752787A4 (en) * | 2018-02-14 | 2022-03-09 | Wilcox Industries Corp. | Weapon system |
US11300378B2 (en) | 2017-03-08 | 2022-04-12 | Sturm, Ruger & Company, Inc. | Electromagnetic firing system for firearm with interruptable trigger control |
US20220122271A1 (en) * | 2020-10-21 | 2022-04-21 | Hanwha Defense Co., Ltd. | Remote-controlled weapon system in moving platform and moving target tracking method thereof |
EP4027100A1 (en) * | 2021-01-07 | 2022-07-13 | Israel Weapon Industries (I.W.I.) Ltd. | Grenade launcher aiming control system |
US11441874B2 (en) * | 2017-11-10 | 2022-09-13 | Hanwha Defense Co., Ltd. | Remote weapon control device and method for targeting and shooting multiple objects |
US20220349676A1 (en) * | 2021-05-03 | 2022-11-03 | John Aron Maguire | Novel System And Methods For Incorporating Firearm Ammunition Temperature & Thermal Susceptibility To Improve Ballistic Calculator Algorithms And Fidelity |
US20220358738A1 (en) * | 2016-01-29 | 2022-11-10 | Snap Inc. | Local augmented reality persistent sticker objects |
US11754363B1 (en) * | 2020-07-29 | 2023-09-12 | Dale Albert Hodgson | Gimballed Precession Stabilization System |
US11933573B1 (en) * | 2022-07-13 | 2024-03-19 | Anthony Vines | Firearm shot tracking system |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE102005057323A1 (en) * | 2005-12-01 | 2007-06-06 | Carl Zeiss Optronics Gmbh | Device for increasing probability of hit of firearm, has image evaluation device which is suitable to bring recorded reference image for evaluation of result by user during operation of device |
WO2008048116A1 (en) * | 2006-10-16 | 2008-04-24 | Urban Voyage Limited | Monitoring engagement of a weapon |
DE102014019199A1 (en) | 2014-12-19 | 2016-06-23 | Diehl Bgt Defence Gmbh & Co. Kg | automatic weapon |
CN104613816B (en) * | 2015-01-30 | 2016-08-24 | 浙江工商大学 | Numeral sight and use its method to target following, locking and precision fire |
Citations (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US3575085A (en) * | 1968-08-21 | 1971-04-13 | Hughes Aircraft Co | Advanced fire control system |
US3659494A (en) * | 1965-06-08 | 1972-05-02 | Itek Corp | Fire control system for use in conjunction with electronic image motion stabilization systems |
US3766826A (en) * | 1971-02-26 | 1973-10-23 | Bofors Ab | Device for achieving aim-off for a firearm |
US3840794A (en) * | 1972-03-02 | 1974-10-08 | France Etat | Control system for tracking a moving target |
US4004729A (en) * | 1975-11-07 | 1977-01-25 | Lockheed Electronics Co., Inc. | Automated fire control apparatus |
US4370914A (en) * | 1977-04-07 | 1983-02-01 | E M I Limited | Aiming arrangements |
US4402251A (en) * | 1981-09-18 | 1983-09-06 | The United States Of America As Represented By The Secretary Of The Army | Detection of line of sight reversal and initiation of firing commands for a modified acceleration predictor fire control system engaging maneuvering targets |
US4787291A (en) * | 1986-10-02 | 1988-11-29 | Hughes Aircraft Company | Gun fire control system |
US5392688A (en) * | 1992-06-02 | 1995-02-28 | Giat Industries | Trigger for a firing weapon |
US5544439A (en) * | 1992-09-10 | 1996-08-13 | Giat Industries | Device for firing a firearm using an infrared detector |
US5686690A (en) * | 1992-12-02 | 1997-11-11 | Computing Devices Canada Ltd. | Weapon aiming system |
US5966859A (en) * | 1997-11-14 | 1999-10-19 | Samuels; Mark A. | Devices and methods for controlled manual and automatic firearm operation |
US5991043A (en) * | 1996-01-08 | 1999-11-23 | Tommy Anderson | Impact position marker for ordinary or simulated shooting |
US6000163A (en) * | 1998-04-03 | 1999-12-14 | Gordon; Terry | Photographic rifle scope apparatus and method |
US6499382B1 (en) * | 1998-08-24 | 2002-12-31 | General Dynamics Canada Ltd. | Aiming system for weapon capable of superelevation |
US6805036B2 (en) * | 2001-11-23 | 2004-10-19 | Oerlikon Contraves Ag | Method and device for judging the aiming error of a weapon system and use of the device |
US6871439B1 (en) * | 2003-09-16 | 2005-03-29 | Zyberwear, Inc. | Target-actuated weapon |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
GB2255398A (en) * | 1991-05-02 | 1992-11-04 | Gec Ferranti Defence Syst | A ballistics system. |
FR2700840B1 (en) * | 1992-12-21 | 1996-04-26 | Thomson Csf | Stabilized weapon. |
DE19719977C1 (en) * | 1997-05-13 | 1998-10-08 | Industrieanlagen Betriebsges | Video viewing-sight with integrated weapon control system for gun |
2004
- 2004-09-10 US US10/938,321 patent/US20060005447A1/en not_active Abandoned
- 2004-09-13 WO PCT/US2004/029975 patent/WO2005080908A2/en active Application Filing
- 2004-09-13 EP EP04821469A patent/EP1676090A2/en not_active Withdrawn
Cited By (167)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20060114519A1 (en) * | 2000-10-02 | 2006-06-01 | Eastman Kodak Company | Articulating camera for digital image acquisition |
US20060107578A1 (en) * | 2000-11-13 | 2006-05-25 | R.A. Brands, Llc | Actuator assembly |
US7131366B2 (en) * | 2000-11-13 | 2006-11-07 | Ra Brands, L.L.C. | Actuator assembly |
US20080127814A1 (en) * | 2003-05-23 | 2008-06-05 | Mckendree Thomas L | method of providing integrity bounding of weapons |
US20050188826A1 (en) * | 2003-05-23 | 2005-09-01 | Mckendree Thomas L. | Method for providing integrity bounding of weapons |
US7292262B2 (en) * | 2003-07-21 | 2007-11-06 | Raytheon Company | Electronic firearm sight, and method of operating same |
US20050018041A1 (en) * | 2003-07-21 | 2005-01-27 | Towery Clay E. | Electronic firearm sight, and method of operating same |
US20100309224A1 (en) * | 2004-03-31 | 2010-12-09 | Canon Kabushiki Kaisha | Image displaying method, image displaying program, and display |
US9086790B2 (en) * | 2004-03-31 | 2015-07-21 | Canon Kabushiki Kaisha | Image displaying method, image displaying program, and display |
US7810273B2 (en) * | 2005-03-18 | 2010-10-12 | Rudolf Koch | Firearm sight having two parallel video cameras |
US20080163536A1 (en) * | 2005-03-18 | 2008-07-10 | Rudolf Koch | Sighting Mechansim For Fire Arms |
US7162806B1 (en) * | 2005-03-21 | 2007-01-16 | Travis Swiggart | Video sighting system |
US20070166669A1 (en) * | 2005-12-19 | 2007-07-19 | Raydon Corporation | Perspective tracking system |
US9052161B2 (en) | 2005-12-19 | 2015-06-09 | Raydon Corporation | Perspective tracking system |
US20070166668A1 (en) * | 2005-12-22 | 2007-07-19 | Maximillian Kusz | Optical sighting device for small arms |
US20090133572A1 (en) * | 2005-12-29 | 2009-05-28 | Men At Work Ltd. | Boresighting system and method |
US20080022575A1 (en) * | 2006-05-08 | 2008-01-31 | Honeywell International Inc. | Spotter scope |
US20080160486A1 (en) * | 2006-06-19 | 2008-07-03 | Saab Ab | Simulation system and method for determining the compass bearing of directing means of a virtual projectile/missile firing device |
US8944821B2 (en) * | 2006-06-19 | 2015-02-03 | Saab Ab | Simulation system and method for determining the compass bearing of directing means of a virtual projectile/missile firing device |
US9728095B1 (en) | 2006-08-15 | 2017-08-08 | Triggermaster, Llc | Firearm trigger pull training system and methods |
US9151564B1 (en) | 2006-08-15 | 2015-10-06 | Triggermaster, Inc. | Firearm trigger pull training system and methods |
US10247505B1 (en) * | 2006-08-15 | 2019-04-02 | Triggermaster, Llc | Trigger pull training device |
US8911235B1 (en) | 2006-08-15 | 2014-12-16 | Triggermaster, Inc. | Shooting training device |
US11788813B2 (en) * | 2006-08-15 | 2023-10-17 | Triggermaster, Llc | Trigger pull training device |
US8777620B1 (en) * | 2006-08-15 | 2014-07-15 | Triggermaster, Inc. | Firearm trigger pull training system and methods |
US8072419B2 (en) * | 2006-10-20 | 2011-12-06 | Sunplus Technology Co., Ltd. | Computer mouse having a front sight button and method for generating local coordinates with the same |
US20080094360A1 (en) * | 2006-10-20 | 2008-04-24 | Sunplus Technology Co., Ltd. | Computer mouse having a front sight button and method for generating local coordinates with the same |
US20120021385A1 (en) * | 2006-11-24 | 2012-01-26 | Trex Enterprises Corp. | Celestial weapons orientation measuring system |
US8597025B2 (en) * | 2006-11-24 | 2013-12-03 | Trex Enterprises Corp. | Celestial weapons orientation measuring system |
US9159111B2 (en) * | 2007-07-30 | 2015-10-13 | International Business Machines Corporation | Method for reporting and relating firearm discharge data to a crime reporting database |
US20140129473A1 (en) * | 2007-07-30 | 2014-05-08 | International Business Machines Corporation | Method for reporting and relating firearm discharge data to a crime reporting database |
US8818829B2 (en) * | 2007-07-30 | 2014-08-26 | International Business Machines Corporation | Method and system for reporting and relating firearm discharge data to a crime reporting database |
US20090037374A1 (en) * | 2007-07-30 | 2009-02-05 | International Business Machines Corporation | Method and system for reporting and relating firearm discharge data to a crime reporting database |
US20100100321A1 (en) * | 2008-10-16 | 2010-04-22 | Michael Koenig | System and method for use of a vehicle back-up camera as a dead-reckoning sensor |
US8855917B2 (en) * | 2008-10-16 | 2014-10-07 | Csr Technology Inc. | System and method for use of a vehicle back-up camera as a dead-reckoning sensor |
US20100196859A1 (en) * | 2009-02-01 | 2010-08-05 | John David Saugen | Combat Information System |
US8555771B2 (en) * | 2009-03-18 | 2013-10-15 | Alliant Techsystems Inc. | Apparatus for synthetic weapon stabilization and firing |
US9464871B2 (en) | 2009-09-11 | 2016-10-11 | Laurence Andrew Bay | System and method for ballistic solutions |
US20130152447A1 (en) * | 2009-12-18 | 2013-06-20 | Vidderna Jakt & Utbildning Ab | Aiming device with a reticle defining a target area at a specified distance |
US9823043B2 (en) | 2010-01-15 | 2017-11-21 | Colt Canada Ip Holding Partnership | Rail for inductively powering firearm accessories |
US9879941B2 (en) | 2010-01-15 | 2018-01-30 | Colt Canada Corporation | Method and system for providing power and data to firearm accessories |
US10477618B2 (en) | 2010-01-15 | 2019-11-12 | Colt Canada Ip Holding Partnership | Networked battle system or firearm |
US10337834B2 (en) | 2010-01-15 | 2019-07-02 | Colt Canada Ip Holding Partnership | Networked battle system or firearm |
US10477619B2 (en) | 2010-01-15 | 2019-11-12 | Colt Canada Ip Holding Partnership | Networked battle system or firearm |
US9921028B2 (en) | 2010-01-15 | 2018-03-20 | Colt Canada Ip Holding Partnership | Apparatus and method for powering and networking a rail of a firearm |
US9897411B2 (en) | 2010-01-15 | 2018-02-20 | Colt Canada Ip Holding Partnership | Apparatus and method for powering and networking a rail of a firearm |
US9891023B2 (en) | 2010-01-15 | 2018-02-13 | Colt Canada Ip Holding Partnership | Apparatus and method for inductively powering and networking a rail of a firearm |
US10060705B2 (en) | 2010-01-15 | 2018-08-28 | Colt Canada Ip Holding Partnership | Apparatus and method for powering and networking a rail of a firearm |
US10470010B2 (en) | 2010-01-15 | 2019-11-05 | Colt Canada Ip Holding Partnership | Networked battle system or firearm |
WO2011096854A1 (en) * | 2010-02-02 | 2011-08-11 | Saab Ab | Method and arrangements for firing a fire arm |
EP2531801A1 (en) * | 2010-02-02 | 2012-12-12 | Saab AB | Method and arrangements for firing a fire arm |
EP2531801A4 (en) * | 2010-02-02 | 2015-05-20 | Saab Ab | Method and arrangements for firing a fire arm |
US20130028486A1 (en) * | 2010-02-02 | 2013-01-31 | Saab Ab | Method and arrangements for firing a fire arm |
US8989449B2 (en) * | 2010-02-02 | 2015-03-24 | Saab Ab | Method and arrangements for firing a fire arm |
US9110295B2 (en) | 2010-02-16 | 2015-08-18 | Trackingpoint, Inc. | System and method of controlling discharge of a firearm |
EP2536995A4 (en) * | 2010-02-16 | 2014-11-26 | Trackingpoint Inc | Advanced firearm or air gun scope |
US9823047B2 (en) | 2010-02-16 | 2017-11-21 | Trackingpoint, Inc. | System and method of controlling discharge of a firearm |
EP2536995A2 (en) * | 2010-02-16 | 2012-12-26 | TrackingPoint, Inc. | Advanced firearm or air gun scope |
US20170059279A1 (en) * | 2010-05-04 | 2017-03-02 | Lasermax, Inc. | Encoded signal detection and display |
US11598608B2 (en) | 2010-05-04 | 2023-03-07 | Lmd Applied Science, Llc | Encoded signal detection and display |
US10323902B2 (en) | 2010-05-04 | 2019-06-18 | Lasermax Inc | Encoded signal detection and display |
US20110315767A1 (en) * | 2010-06-28 | 2011-12-29 | Lowrance John L | Automatically adjustable gun sight |
US8336776B2 (en) | 2010-06-30 | 2012-12-25 | Trijicon, Inc. | Aiming system for weapon |
US20120170815A1 (en) * | 2010-12-29 | 2012-07-05 | Kwong Wing Au | System and method for range and velocity estimation in video data as a function of anthropometric measures |
US8520895B2 (en) * | 2010-12-29 | 2013-08-27 | Honeywell International Inc. | System and method for range and velocity estimation in video data as a function of anthropometric measures |
US20120212622A1 (en) * | 2011-02-17 | 2012-08-23 | Kabushiki Kaisha Toshiba | Moving object image tracking apparatus and method |
WO2012121735A1 (en) * | 2011-03-10 | 2012-09-13 | Tesfor, Llc | Apparatus and method of targeting small weapons |
EA031066B1 (en) * | 2011-03-28 | 2018-11-30 | Смарт Шутер Лтд. | Firearm aiming system (embodiments) and method of operating the firearm |
US10097764B2 (en) | 2011-03-28 | 2018-10-09 | Smart Shooter Ltd. | Firearm, aiming system therefor, method of operating the firearm and method of reducing the probability of missing a target |
WO2012131548A1 (en) | 2011-03-28 | 2012-10-04 | Smart Shooter Ltd. | Firearm, aiming system therefor, method of operating the firearm and method of reducing the probability of missing a target |
EP2694908A4 (en) * | 2011-04-01 | 2014-10-08 | Zrf Llc | System and method for automatically targeting a weapon |
US9310163B2 (en) | 2011-04-01 | 2016-04-12 | Laurence Andrew Bay | System and method for automatically targeting a weapon |
EP2694908A2 (en) * | 2011-04-01 | 2014-02-12 | Zrf, Llc | System and method for automatically targeting a weapon |
US20120297654A1 (en) * | 2011-05-26 | 2012-11-29 | The Otis Patent Trust | Firearm sensor system |
US8733006B2 (en) * | 2011-05-26 | 2014-05-27 | The Otis Patent Trust | Firearm sensor system |
US10670371B2 (en) | 2011-09-09 | 2020-06-02 | Lmd Power Of Light Corp | Target marking system |
US20140283430A1 (en) * | 2011-09-09 | 2014-09-25 | Lasermax, Inc. | Target marking system |
US9207043B2 (en) * | 2011-09-09 | 2015-12-08 | Lasermax, Inc. | Target marking system |
US10107591B2 (en) | 2011-09-09 | 2018-10-23 | Lasermax Inc | Target marking system |
US9163894B1 (en) * | 2011-10-28 | 2015-10-20 | Lockheed Martin Corporation | Laser transmission system for use with a firearm in a battle field training exercise |
US10203179B2 (en) | 2012-01-11 | 2019-02-12 | Dale Albert Hodgson | Motorized weapon gyroscopic stabilizer |
US10782097B2 (en) * | 2012-04-11 | 2020-09-22 | Christopher J. Hall | Automated fire control device |
US20150101229A1 (en) * | 2012-04-11 | 2015-04-16 | Christopher J. Hall | Automated fire control device |
US20130286216A1 (en) * | 2012-04-30 | 2013-10-31 | Trackingpoint, Inc. | Rifle Scope Including a Circuit Configured to Track a Target |
US20130326923A1 (en) * | 2012-06-07 | 2013-12-12 | Dr. Erez Gur Ltd. | Method and device useful for aiming a firearm |
US9261331B2 (en) * | 2012-06-07 | 2016-02-16 | Dr. Erez Gur Ltd. | Method and device useful for aiming a firearm |
US11268789B2 (en) * | 2012-09-13 | 2022-03-08 | Christopher V. Beckman | Device controlling shooting based on firearm movement |
US20140118723A1 (en) * | 2012-10-29 | 2014-05-01 | Teledyne Scientific & Imaging, Llc | System for determining the spatial orientation of a movable apparatus |
US9052159B2 (en) * | 2012-10-29 | 2015-06-09 | Teledyne Scientific & Imaging, Llc | System for determining the spatial orientation of a movable apparatus |
EP2746716A1 (en) * | 2012-12-18 | 2014-06-25 | TrackingPoint, Inc. | Optical device including a mode for grouping shots for use with precision guided firearms |
US20140168447A1 (en) * | 2012-12-18 | 2014-06-19 | Trackingpoint, Inc. | Optical Device Including a Mode for Grouping Shots for Use with Precision Guided Firearms |
KR101468160B1 (en) * | 2012-12-21 | 2014-12-05 | 주식회사 도담시스템스 | Training system for improving shooting accuracy and its control method |
EP2943738A4 (en) * | 2013-01-10 | 2016-07-06 | Dale Albert Hodgson | Motorized weapon gyroscopic stabilizer |
US9127909B2 (en) | 2013-02-17 | 2015-09-08 | Smart Shooter Ltd. | Firearm aiming system with range finder, and method of acquiring a target |
US9250035B2 (en) | 2013-03-21 | 2016-02-02 | Kms Consulting, Llc | Precision aiming system for a weapon |
EP2984440A4 (en) * | 2013-04-11 | 2016-12-21 | Christopher J Hall | Automated fire control device |
US11619469B2 (en) | 2013-04-11 | 2023-04-04 | Christopher J. Hall | Automated fire control device |
WO2014169107A1 (en) | 2013-04-11 | 2014-10-16 | Hall Christopher J | Automated fire control device |
EP2811253A1 (en) * | 2013-06-07 | 2014-12-10 | TrackingPoint, Inc. | Precision guided firearm with hybrid sensor fire control |
EP2811252A1 (en) * | 2013-06-07 | 2014-12-10 | TrackingPoint, Inc. | Precision guided firearm including an optical scope configured to determine timing of discharge |
EP3179197A1 (en) * | 2013-06-07 | 2017-06-14 | TrackingPoint, Inc. | Precision guided firearm including an optical scope configured to determine timing of discharge |
EP3179196A1 (en) * | 2013-06-07 | 2017-06-14 | TrackingPoint, Inc. | Precision guided firearm with hybrid sensor fire control |
US9127907B2 (en) | 2013-06-07 | 2015-09-08 | Trackingpoint, Inc. | Precision guided firearm including an optical scope configured to determine timing of discharge |
US9222754B2 (en) | 2013-06-07 | 2015-12-29 | Trackingpoint, Inc. | Precision guided firearm with hybrid sensor fire control |
US20150211828A1 (en) * | 2014-01-28 | 2015-07-30 | Trackingpoint, Inc. | Automatic Target Acquisition for a Firearm |
WO2015116536A1 (en) * | 2014-01-28 | 2015-08-06 | Trackingpoint, Inc. | Automatic target acquisition for a firearm |
US9435603B2 (en) * | 2014-04-16 | 2016-09-06 | Hanwha Techwin Co., Ltd. | Remote weapon system and control method thereof |
US9557130B2 (en) * | 2014-05-27 | 2017-01-31 | Israel Weapon Industries (I.W.I) Ltd. | Apparatus and method for improving hit probability of a firearm |
EP2950032A1 (en) * | 2014-05-27 | 2015-12-02 | Israel Weapon Industries (I.W.I.) Ltd. | An apparatus and method for improving hit probability of a firearm |
US20150345887A1 (en) * | 2014-05-27 | 2015-12-03 | Israel Weapon Industries (I.W.I) Ltd. | Apparatus and method for improving hit probability of a firearm |
US10712116B1 (en) * | 2014-07-14 | 2020-07-14 | Triggermaster, Llc | Firearm body motion detection training system |
US20160169625A1 (en) * | 2014-12-10 | 2016-06-16 | Flir Systems, Inc. | Electronic adaptive reticle systems and methods |
US9857144B2 (en) * | 2014-12-10 | 2018-01-02 | Flir Systems, Inc. | Electronic adaptive reticle systems and methods |
US20160216082A1 (en) * | 2015-01-22 | 2016-07-28 | Colt Canada Corporation | Sensor pack for firearm |
US10323894B2 (en) * | 2015-08-19 | 2019-06-18 | Paul Imbriano | Weapons system smart device |
US10168123B2 (en) * | 2015-11-19 | 2019-01-01 | Philip Scott Lyren | Firearm system that tracks points of aim of a firearm |
US10443982B2 (en) * | 2015-11-19 | 2019-10-15 | Philip Scott Lyren | Firearm system that tracks points of aim of a firearm |
US20190137218A1 (en) * | 2015-11-19 | 2019-05-09 | Philip Scott Lyren | Firearm System that Tracks Points of Aim of a Firearm |
US9945640B2 (en) * | 2015-11-19 | 2018-04-17 | Philip Scott Lyren | Firearm system that tracks points of aim of a firearm |
US20170146319A1 (en) * | 2015-11-19 | 2017-05-25 | Philip Scott Lyren | Firearm System that Tracks Points of Aim of a Firearm |
US20180231353A1 (en) * | 2015-11-19 | 2018-08-16 | Philip Scott Lyren | Firearm System that Tracks Points of Aim of a Firearm |
US9702662B1 (en) * | 2015-12-22 | 2017-07-11 | Huntercraft Limited | Electronic sighting device with real-time information interaction |
US20220358738A1 (en) * | 2016-01-29 | 2022-11-10 | Snap Inc. | Local augmented reality persistent sticker objects |
US11727660B2 (en) * | 2016-01-29 | 2023-08-15 | Snap Inc. | Local augmented reality persistent sticker objects |
US10578403B2 (en) * | 2016-02-03 | 2020-03-03 | VK Integrated Systems, Inc. | Firearm electronic system |
US10890415B2 (en) * | 2016-02-03 | 2021-01-12 | VK Integrated Systems, Inc. | Firearm electronic system |
US20190003803A1 (en) * | 2016-02-03 | 2019-01-03 | Vk Integrated Systems | Firearm electronic system |
US20190056198A1 (en) * | 2016-02-24 | 2019-02-21 | James Anthony Pautler | Skeet and Bird Tracker |
US10782096B2 (en) * | 2016-02-24 | 2020-09-22 | James Anthony Pautler | Skeet and bird tracker |
US9823040B1 (en) * | 2016-08-23 | 2017-11-21 | Shih-Che Hu | Gun barrel unit for a toy gun |
US10648781B1 (en) * | 2017-02-02 | 2020-05-12 | Arthur J. Behiel | Systems and methods for automatically scoring shooting sports |
US10690430B2 (en) | 2017-03-08 | 2020-06-23 | Sturm, Ruger & Company, Inc. | Dynamic variable force trigger mechanism for firearms |
US10670361B2 (en) | 2017-03-08 | 2020-06-02 | Sturm, Ruger & Company, Inc. | Single loop user-adjustable electromagnetic trigger mechanism for firearms |
US10228208B2 (en) | 2017-03-08 | 2019-03-12 | Sturm, Ruger & Company, Inc. | Dynamic variable force trigger mechanism for firearms |
US10900732B2 (en) | 2017-03-08 | 2021-01-26 | Sturm, Ruger & Company, Inc. | Electromagnetic firing system for firearm with firing event tracking |
US11300378B2 (en) | 2017-03-08 | 2022-04-12 | Sturm, Ruger & Company, Inc. | Electromagnetic firing system for firearm with interruptable trigger control |
US11828556B2 (en) | 2017-08-15 | 2023-11-28 | Paspa Pharmaceuticals Pty Ltd | Firearm stabilization device |
EP3669135A4 (en) * | 2017-08-15 | 2021-08-04 | Paspa Pharmaceuticals Pty Ltd | Firearm stabilization device |
US20190113310A1 (en) * | 2017-09-15 | 2019-04-18 | Tactacam LLC | Weapon sighted camera system |
US20210010782A1 (en) * | 2017-09-15 | 2021-01-14 | Tactacam LLC | Weapon sighted camera system |
US20230037723A1 (en) * | 2017-09-15 | 2023-02-09 | Tactacam LLC | Weapon sighted camera system |
US10619976B2 (en) * | 2017-09-15 | 2020-04-14 | Tactacam LLC | Weapon sighted camera system |
US11473875B2 (en) * | 2017-09-15 | 2022-10-18 | Tactacam LLC | Weapon sighted camera system |
US11441874B2 (en) * | 2017-11-10 | 2022-09-13 | Hanwha Defense Co., Ltd. | Remote weapon control device and method for targeting and shooting multiple objects |
EP3752787A4 (en) * | 2018-02-14 | 2022-03-09 | Wilcox Industries Corp. | Weapon system |
RU192631U1 (en) * | 2018-06-26 | 2019-09-24 | Константин Александрович Идель | AUTONOMOUS BALLISTIC COMPUTER FOR RIFLED WEAPONS |
US10803316B2 (en) * | 2018-07-06 | 2020-10-13 | Meopta U.S.A., Inc. | Computer applications integrated with handheld optical devices having cameras |
US20200012856A1 (en) * | 2018-07-06 | 2020-01-09 | Meopta U.S.A., Inc. | Computer applications integrated with handheld optical devices having cameras |
KR20200071020A (en) * | 2018-12-09 | 2020-06-18 | 이스라엘 웨폰 인더스트리즈 (“아이더블유아이”) 엘티디. | Firearm controlled by user behavior |
KR102276310B1 (en) * | 2018-12-09 | 2021-07-14 | 이스라엘 웨폰 인더스트리즈 (“아이더블유아이”) 엘티디. | Firearm controlled by user behavior |
US20200182576A1 (en) * | 2018-12-09 | 2020-06-11 | Israel Weapon Industries (I.W.I.) Ltd. | Firearm controlled by user behavior |
AU2019272045B2 (en) * | 2018-12-09 | 2021-03-18 | Israel Weapon Industries (I.W.I) Ltd. | Firearm controlled by user behavior |
EP3663697A1 (en) * | 2018-12-09 | 2020-06-10 | Israel Weapon Industries (I.W.I.) Ltd. | Firearm controlled by user behaviour |
US10900733B2 (en) * | 2018-12-09 | 2021-01-26 | Israel Weapon Industries (I.W.I) Ltd. | Firearm controlled by user behavior |
WO2021048307A1 (en) * | 2019-09-10 | 2021-03-18 | Fn Herstal S.A. | Imaging system for firearm |
EP3819585A1 (en) * | 2019-11-11 | 2021-05-12 | Israel Weapon Industries (I.W.I.) Ltd. | Firearm with automatic target acquiring and shooting |
AU2020267163B2 (en) * | 2019-11-11 | 2022-02-17 | Israel Weapon Industries (I.W.I) Ltd. | Firearm with automatic target acquiring and shooting |
US20210364256A1 (en) * | 2020-04-21 | 2021-11-25 | Axon Enterprise, Inc. | Motion-based operation for a conducted electrical weapon |
US11231252B2 (en) * | 2020-06-10 | 2022-01-25 | Brett C. Bilbrey | Method for automated weapon system with target selection of selected types of best shots |
US11754363B1 (en) * | 2020-07-29 | 2023-09-12 | Dale Albert Hodgson | Gimballed Precession Stabilization System |
US11676287B2 (en) * | 2020-10-21 | 2023-06-13 | Hanwha Aerospace Co., Ltd. | Remote-controlled weapon system in moving platform and moving target tracking method thereof |
US20220122271A1 (en) * | 2020-10-21 | 2022-04-21 | Hanwha Defense Co., Ltd. | Remote-controlled weapon system in moving platform and moving target tracking method thereof |
US11486677B2 (en) | 2021-01-07 | 2022-11-01 | Israel Weapon Industries (I.W.I) Ltd. | Grenade launcher aiming control system |
EP4027100A1 (en) * | 2021-01-07 | 2022-07-13 | Israel Weapon Industries (I.W.I.) Ltd. | Grenade launcher aiming control system |
EP4325162A3 (en) * | 2021-01-07 | 2024-04-10 | Israel Weapon Industries (I.W.I.) Ltd. | Grenade launcher aiming control system |
US20220349676A1 (en) * | 2021-05-03 | 2022-11-03 | John Aron Maguire | Novel System And Methods For Incorporating Firearm Ammunition Temperature & Thermal Susceptibility To Improve Ballistic Calculator Algorithms And Fidelity |
US11933573B1 (en) * | 2022-07-13 | 2024-03-19 | Anthony Vines | Firearm shot tracking system |
Also Published As
Publication number | Publication date |
---|---|
EP1676090A2 (en) | 2006-07-05 |
WO2005080908A2 (en) | 2005-09-01 |
WO2005080908A3 (en) | 2006-03-09 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20060005447A1 (en) | Processor aided firing of small arms | |
US8400619B1 (en) | Systems and methods for automatic target tracking and beam steering | |
EP2956733B1 (en) | Firearm aiming system with range finder, and method of acquiring a target | |
US10097764B2 (en) | Firearm, aiming system therefor, method of operating the firearm and method of reducing the probability of missing a target | |
AU2002210260B2 (en) | Autonomous weapon system | |
US8651381B2 (en) | Firearm sight having an ultra high definition video camera | |
AU2016320833B2 (en) | Dynamic laser marker display for aimable device | |
US9222754B2 (en) | Precision guided firearm with hybrid sensor fire control | |
AU2002210260A1 (en) | Autonomous weapon system | |
US20120097741A1 (en) | Weapon sight | |
US20170138710A1 (en) | Optically tracked projectile | |
US20130286216A1 (en) | Rifle Scope Including a Circuit Configured to Track a Target | |
US20200166310A1 (en) | Apparatus and methodology for tracking projectiles and improving the fidelity of aiming solutions in weapon systems | |
RU2251652C2 (en) | Method of determining a bullet's point of impact on a target |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: VITRONICS INC., NEW JERSEY Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:LENNER, GERALD E.;KARCHER, PHILIP B.;REEL/FRAME:016540/0560 Effective date: 20040910 |
|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |