US20190137219A1 - Semi-autonomous motorized weapon systems - Google Patents

Semi-autonomous motorized weapon systems

Info

Publication number
US20190137219A1
US20190137219A1 (application US16/181,153)
Authority
US
United States
Prior art keywords
target
weapon
target point
firing
motorized
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US16/181,153
Inventor
Bryan Sterling Bockmon
Corbin Chase Johnston
Jason R. Gallia
George Lee Krasovec
Henry Matthew Dittmer
Jay David Marks
Christopher James Owens
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Aimlock Inc
Original Assignee
Aimlock Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Aimlock Inc filed Critical Aimlock Inc
Priority to US16/181,153
Assigned to Aimlock Inc. reassignment Aimlock Inc. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: BOCKMON, BRYAN STERLING, DITTMER, HENRY MATTHEW, GALLIA, Jason R., JOHNSTON, CORBIN CHASE, KRASOVEC, GEORGE LEE, MARKS, JAY DAVID, OWENS, CHRISTOPHER JAMES
Publication of US20190137219A1

Classifications

    • F: MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F41: WEAPONS
    • F41A: FUNCTIONAL FEATURES OR DETAILS COMMON TO BOTH SMALLARMS AND ORDNANCE, e.g. CANNONS; MOUNTINGS FOR SMALLARMS OR ORDNANCE
    • F41A 17/00: Safety arrangements, e.g. safeties
    • F41A 17/08: Safety arrangements for inhibiting firing in a specified direction, e.g. at a friendly person or at a protected area
    • F41G: WEAPON SIGHTS; AIMING
    • F41G 3/00: Aiming or laying means
    • F41G 3/04: Aiming or laying means for dispersing fire from a battery; for controlling spread of shots; for coordinating fire from spaced weapons
    • F41G 3/14: Indirect aiming means
    • F41G 3/16: Sighting devices adapted for indirect laying of fire
    • F41G 3/165: Sighting devices adapted for indirect laying of fire using a TV-monitor
    • F41G 5/00: Elevating or traversing control systems for guns
    • F41G 5/06: Elevating or traversing control systems for guns using electric means for remote control
    • F41G 5/14: Elevating or traversing control systems for guns for vehicle-borne guns
    • F41G 5/16: Elevating or traversing control systems for guns for vehicle-borne guns, gyroscopically influenced

Definitions

  • This disclosure generally relates to autonomous and semi-autonomous motorized weapons systems. More specifically, the present disclosure relates to hardware- and software-based techniques for efficient operation of motorized weapons systems, via improvements in target identification and selection, autonomous actuation of motor and targeting systems, dynamic tracking, and trajectory measurement and assessment.
  • The term "kill chain" refers to the sequence of actions performed between the first detection of potential targets and the elimination of those targets.
  • The sequence of actions within a kill chain generally may include the following: (1) Find—identifying and locating a target, (2) Fix or Track—determining the accurate location of the target, (3) Target—time-critical targeting, including predicting where the target may pop up, (4) Engage—firing on the target, and (5) Assess—determining whether or not the target has been hit and/or eliminated.
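  • As a purely illustrative aside (not part of the original disclosure), the five kill-chain stages can be modeled as an ordered sequence in software; the following minimal Python sketch uses hypothetical stage handlers:

    from enum import Enum, auto

    class KillChainStage(Enum):
        FIND = auto()    # identify and locate a target
        FIX = auto()     # determine the accurate location of the target
        TARGET = auto()  # time-critical targeting, including pop-up prediction
        ENGAGE = auto()  # fire on the target
        ASSESS = auto()  # determine whether the target has been hit/eliminated

    def run_kill_chain(handlers):
        """Run each stage handler in definition order; a handler returning
        False breaks the chain and reports the stage where it stopped."""
        for stage in KillChainStage:
            if not handlers[stage]():
                return stage
        return None  # all five stages completed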
  • Conventional weapon systems may include various components for achieving the above steps of a kill chain, including cameras and sensors to identify targets, display screens and controls (e.g., joysticks) to allow an operator to identify targets and aim the weapon, and a variety of weapons that may be fired at the target.
  • Such systems may include “fully autonomous” weapons systems, which are capable of targeting and firing without any intervention by a human operator, “semi-autonomous” weapons systems, which may use automated software target tracking tools but still rely on a human operator for target selection and firing commands, “supervised autonomous” weapons systems, which may be granted permission to react to threats autonomously, and/or manual weapon systems that are operated entirely by the human operator.
  • Techniques described herein relate to hardware- and software-based solutions for operating motorized weapons systems, including target identification and selection techniques, autonomous actuation of motor and targeting systems, dynamic tracking, and trajectory measurement and assessment techniques.
  • Certain embodiments described herein correspond to semi-autonomous motorized weapon systems, which may include various combinations of hardware such as weapons capable of firing munitions, two-axis and/or three-axis mounts configured to support and position the weapons, motors coupled to the mounts and configured to move the mounts to specified positions to control the direction in which the weapons are aimed, and/or operator interface components such as operator controls and a target display device.
  • Such a semi-autonomous motorized weapon system may be implemented with various hardware-based and software-based components configured to determine target points associated with targets at remote locations, and to determine one or more areas having boundaries surrounding the target points, such boundary areas being determined based on the likelihood of the weapon hitting the target when aimed at the boundary, in comparison to predetermined likelihood thresholds.
  • Such embodiments may be further configured to engage the motor of the motorized weapon system with instructions to move the mount from an initial position to a target position at which the weapon is aimed at the target point, and, during the movement of the mount toward the target position, to periodically determine whether the weapon is aimed at a position within the boundary area surrounding the target point.
  • When it is determined, during the movement of the mount toward the target position, that the weapon is aimed at a position outside the boundary area surrounding the target point, the semi-autonomous motorized weapon system may disable a manual firing mechanism of the weapon system to prevent firing of the weapon by an operator; conversely, when it is determined that the weapon is aimed at a position within the area surrounding the target point, the semi-autonomous motorized weapon system may enable (or re-enable) the manual firing mechanism to allow firing of the weapon.
  • the semi-autonomous motorized weapon system may be configured to receive and execute firing commands from operators, via the manual firing mechanism, thereby firing the weapon at times when the manual firing mechanism is enabled.
  • Additional techniques described herein include weapon-agnostic motorized weapon systems, including weapon-agnostic targeting/firing systems that may support various different types or models of weapons, as well as implementation of operation-specific rules of engagement that may be received and enforced by the weapon-agnostic targeting and firing systems.
  • Further techniques described herein include minimum confidence thresholds for target selection and/or prioritization via semi-autonomous weapons systems, which may be separate determinations from target identification confidence and/or target verification confidence.
  • Still further techniques described herein may include sensor-based real-time projectile firing assessment and automatic correction of targeting algorithms based on accuracy evaluations.
  • The various techniques described herein further include combinations of: autonomous target selection, prioritization, and re-selection by targeting/firing systems within semi-autonomous motorized weapon systems; dynamic target tracking of both primary and secondary targets, including target movement predictions and weapon/projectile characteristics; autonomous motor actuation to automatically orient the weapon toward the primary target before receiving any operator input; simplified user interfaces and operator controls; and enabling/disabling of the firing mechanism depending on the projected point of impact of the weapon. Together, these provide increased system efficiency, increased rate of firing, improved weapon system accuracy, and reduced operator error, along with the other technical advantages described herein.
  • FIG. 1 is a depiction of a motorized weapon system, in accordance with one or more embodiments of the present invention.
  • FIG. 2 is a block diagram illustrating an example component architecture of a motorized weapon system, in accordance with one or more embodiments of the present invention.
  • FIGS. 3A-3C are illustrative drawings depicting the mounting and application of a motorized weapon system in accordance with one or more embodiments of the present invention, within different engagement environments.
  • FIG. 4 is a flowchart illustrating an example process of using a motorized weapon system to engage one or more targets, in accordance with certain embodiments of the present invention.
  • FIG. 5 is an example screen of a user interface displayed to an operator of a motorized weapon system during engagement of one or more targets, in accordance with certain embodiments of the present invention.
  • FIG. 6 is another example screen of a user interface displayed to an operator of a motorized weapon system during engagement of one or more targets, in accordance with certain embodiments of the present invention.
  • FIG. 7 is a flowchart illustrating an example process of disabling or enabling a firing mechanism of a motorized weapon system during engagement of the motor to move the weapon, in accordance with certain embodiments of the present invention.
  • FIGS. 8A and 8B are example screens of a user interface displayed to an operator of a motorized weapon system during engagement of the motor to move the weapon toward a target point, in accordance with certain embodiments of the present invention.
  • FIG. 9 is a schematic illustration of a computer system configured to perform techniques in accordance with certain embodiments of the present invention.
  • circuits, systems, networks, processes, and other components may be shown as components in block diagram form in order not to obscure the embodiments in unnecessary detail.
  • well-known circuits, processes, algorithms, structures, and techniques may be shown without unnecessary detail in order to avoid obscuring the embodiments.
  • individual embodiments may be described as a process which is depicted as a flowchart, a flow diagram, a data flow diagram, a structure diagram, or a block diagram. Although a flowchart may describe the operations as a sequential process, many of the operations can be performed in parallel or concurrently. In addition, the order of the operations may be re-arranged.
  • a process is terminated when its operations are completed, but could have additional steps not included in a figure.
  • a process may correspond to a method, a function, a procedure, a subroutine, a subprogram, etc. When a process corresponds to a function, its termination can correspond to a return of the function to the calling function or the main function.
  • The term "computer-readable medium" includes, but is not limited to, non-transitory media such as portable or fixed storage devices, optical storage devices, and various other media capable of storing, containing, or carrying instruction(s) and/or data.
  • a code segment or computer-executable instructions may represent a procedure, a function, a subprogram, a program, a routine, a subroutine, a module, a software package, a class, or any combination of instructions, data structures, or program statements.
  • a code segment may be coupled to another code segment or a hardware circuit by passing and/or receiving information, data, arguments, parameters, or memory contents. Information, arguments, parameters, data, etc. may be passed, forwarded, or transmitted via any suitable means including memory sharing, message passing, token passing, network transmission, etc.
  • embodiments may be implemented by hardware, software, firmware, middleware, microcode, hardware description languages, or any combination thereof.
  • the program code or code segments to perform the necessary tasks may be stored in a computer-readable medium.
  • a processor(s) may perform the necessary tasks.
  • Referring now to FIG. 1 , weapon system 100 may include a weapon 110 with an ammunition feed 115 , a gimbal mount 120 , and a camera/sensor unit 125 . Additionally, in this example, the weapon system 100 includes a base/housing 130 , which contains and obscures additional components of the system 100 , including the motor, servos, targeting system, processing and memory components, communications system, firing controls, and various other components described herein.
  • Weapon system 100 may be a remotely operated weapon station (ROWS), including stabilization and auto-targeting technology.
  • The targeting system of weapon system 100 may be configured to perform rapid target selection and acquisition, and to increase hit probabilities.
  • Weapon system 100 may be compatible with many different types of weapon 110 and different corresponding types of ammunition, and as discussed below, the operation of the targeting system and other components of the weapon system 100 may depend on knowledge of which type of weapon 110 and ammunition is currently in use.
  • weapon system 100 may be fully integrated, with auto-targeting capabilities, and/or remote operation.
  • Weapon system 100 also may be capable of being mounted to various different types of platforms, including tripods, buildings, ground vehicles (e.g., trucks, tanks, cars, jeeps), all-terrain vehicles (ATVs), utility task vehicles (UTVs), boats, fixed-wing aircraft, helicopters, and drones.
  • various embodiments of weapon systems 100 may include capabilities for automatic target detection, selection, and re-selection, active stabilization, automatic ballistic solutions, target tagging, and/or continuous target tracking.
  • Weapon 110 may be any type of gun, armament, or ordnance, including without limitation, off-the-shelf firearms, large caliber rifles, machine guns, autocannons, grenade launchers, rockets, and/or directed energy weapons such as lasers, high-power microwave emitters, and other such devices.
  • the weapon 110 may be attached to the weapon system 100 using a 2-axis or 3-axis mechanical gimbal mount 120 , capable of controlling azimuth and yaw, elevation and pitch, and possibly cant and roll.
  • A closed-loop servomotor within the weapon system 100 may be configured to drive the gimbal to an identified target.
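  • For illustration, such a closed-loop drive is commonly realized as a proportional control loop on the angular error of each gimbal axis; the sketch below is a hypothetical, simplified example (the gain, time step, rate limit, and tolerance values are assumptions, not disclosed parameters):

    def servo_step(current_deg, target_deg, gain=4.0, dt=0.01, max_rate=120.0):
        """One proportional-control step for a single gimbal axis: command a
        slew rate proportional to the remaining error, clamped to the motor's
        maximum rate (degrees/second), then integrate over the time step."""
        error = target_deg - current_deg
        rate = max(-max_rate, min(max_rate, gain * error))
        return current_deg + rate * dt

    # Drive azimuth and elevation toward an identified target orientation.
    az, el, target_az, target_el = 0.0, 0.0, 41.5, 12.3
    while abs(target_az - az) > 0.05 or abs(target_el - el) > 0.05:
        az = servo_step(az, target_az)
        el = servo_step(el, target_el)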
  • A firing mechanism within the weapon system may be configured to fire the weapon 110 , either electronically or by manually pulling the trigger, in response to a firing command from a human operator and/or additional firing instructions received from a targeting/firing component of the weapon system 100 .
  • Camera/sensor unit 125 may include an array of various different sensors configured to collect data at the weapon system 100 , and transmit the sensor/image data back to the internal software systems of the weapon system 100 (e.g., targeting system/component, firing control, ballistics engine) and/or to a display device for outputting to an operator.
  • Cameras/sensors within the sensor unit 125 may include, for example, cameras sensitive in various spectrums such as visible and infrared (IR), for day and night visibility, as well as rangefinders (e.g., LIDAR, RADAR, ultrasonic, etc.) to determine distance to target.
  • Additional sensors within the sensor unit 125 may include rate gyros (e.g., MEMS or fiber optic gyros), which may be used to stabilize the weapon 110 within the mount 120 .
  • Magnetometers and accelerometers also may be included within the weapon system 100 , and may be used for canceling gyro drift.
  • Accelerometers also may be used to detect and respond to vehicle accelerations (i.e., when the weapon system 100 is mounted on a vehicle), and vibrations caused by vehicle movement and/or terrain and weather.
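  • As a simplified illustration of one common way to cancel gyro drift (an assumption for exposition, not the disclosed design), a complementary filter fuses the integrated rate-gyro angle with a noisy but drift-free accelerometer-derived angle:

    def complementary_filter(angle_prev_deg, gyro_rate_dps, accel_angle_deg,
                             dt, alpha=0.98):
        """Fuse a drifting gyro integral with a drift-free accelerometer
        angle for one axis (e.g., pitch). An alpha near 1 trusts the gyro on
        short time scales, while the (1 - alpha) accelerometer term slowly
        bleeds off accumulated gyro drift."""
        gyro_angle = angle_prev_deg + gyro_rate_dps * dt  # integrate rate gyro
        return alpha * gyro_angle + (1 - alpha) * accel_angle_deg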
  • Sensors 125 also may include wind speed sensors, including hot-wire, laser/LIDAR, sonic and other types of anemometers.
  • A global positioning system (GPS) receiver or other positioning devices may be included within the sensor unit 125 , in order to determine the weapon location, heading, and velocity to compute firing solutions, and for use in situations where external target coordinates are provided. It should also be understood that for each of the cameras and/or sensors described above and elsewhere herein, the cameras/sensors may be housed within the sensor unit 125 , positioned elsewhere in the weapon system 100 , installed on a structure or vehicle on which the weapon system 100 is mounted, or installed at a separate remote location and configured to transmit wireless sensor data back to the weapon system 100 .
  • Referring now to FIG. 2 , weapon system 200 may correspond to the same weapon system 100 discussed above, and/or other variations of weapon systems described herein.
  • Weapon system 200 includes a weapon 225 , mount 230 , motor 235 , and a camera/sensor unit 240 .
  • Weapon system 200 also includes a targeting/firing system 210 , described below in more detail, which may be implemented in hardware, software, or a combination of hardware and software.
  • weapon system 200 may include operator-facing components, including controls 245 and a display screen 250 .
  • The targeting/firing system 210 may be configured to drive the motor 235 to a particular target point, and to initiate firing of the weapon 225 .
  • The camera/sensor unit 240 may collect image and sensor data, and transmit that data back to the targeting/firing system 210 for use in target detection, selection, and tracking functionality.
  • image and sensor data may be transmitted directly from the sensor unit 240 to the display 250 for rendering/use in an operator user interface.
  • the targeting/firing system 210 also may transmit various targeting data to the display device 250 for presentation to the operator, and may receive from the operator firing commands and/or other control commands via the operator controls 245 .
  • weapon systems 200 may include turrets or platform-mounted guns which include the weapon/motor 225 - 235 , camera/sensor unit 240 , targeting/firing system 210 , as well as the operator controls 245 and display 250 .
  • Some or all of the components of a weapon system 200 may be non-integrated and located remotely from the others.
  • The weapon/motor 225 - 235 and a subset of the sensors/cameras 240 may be located near the potential targets, while the targeting/firing system 210 and operator interface components 245 - 250 may be at a distant remote location.
  • Certain sensors 240 may be located at or near the weapon 225 (e.g., to measure distance to target, current location, weapon movement and vibration, wind and weather conditions, etc.), while other sensors 240 may be positioned at or near the target and/or at other angles to the target, while still other sensors or cameras 240 may be remotely located (e.g., drone-based cameras, satellite imagery, etc.).
  • each of the components may include network transceivers and interfaces configured for secure network communication, including components for data encryption and transmission over public or private computer networks, satellite transmission systems, and/or secure short-range wireless communications, etc.
  • the targeting/firing system 210 may receive input data from various data sources, and analyze the data to identify, select, and prioritize targets, actuate the motor 235 , dynamically track targets, generate firing solutions, and control firing of the weapon 225 .
  • the targeting/firing system 210 may receive data from one or more cameras/sensor units 240 , including a GPS unit 211 .
  • the sensor data may include images of targets and potential targets, distance/range data, heat or infrared data, audio data, vehicle or weapon location data, vehicle or weapon movement and vibration data, wind and weather condition data, and any other sensor data described herein.
  • one or more data stores may store system configuration and operation data, including a rules data store 213 and a profiles data store 214 .
  • the rules data store 213 may include, for example, target identification rules, target selection/priority rules, firing rules, and other rules of engagement, each of which may depend on the particular operation, the current location of the weapon system 200 , the individual operator, etc.
  • The profiles data store 214 may include, for example, individual user profiles with user preferences and parameters, weapon profiles, and/or ballistic profiles that may include specifications for individual weapon types and ammunition types that may be used to calculate maximum ranges and targeting solutions.
  • One or more communication modules 212 within the targeting/firing system 210 may be used to receive commands and other data from the current operator and/or from a separate command center. As discussed below, commands received from a command center or other higher-level authority may control the target selection and rules of engagement for particular operations.
  • Communication modules 212 also may be used to receive or retrieve sensor data from remote sensor systems, including satellite data, image data from remote cameras, target GPS data, weather data, etc.
  • The targeting/firing system 210 may include various components (e.g., targeting component 220 ) configured to receive and analyze the various data to perform targeting functions, including subcomponents for target detection 221 , target selection 222 , target tracking 223 , and firing control 215 , among others.
  • the operator controls 245 and display screen 250 may correspond to the input/output interface between the human operator and the weapon system 200 .
  • certain weapons systems 200 may be fully autonomous, or may operate in a supervised autonomous mode, in which case the operator controls 245 and display screen 250 need not be present.
  • the operator controls 245 and display screen 250 may be remotely located in some embodiments, allowing the operators to control the weapon system 200 from a separate location that may be a few feet away or across the globe.
  • The display device 250 may receive and output various user interface views to the operator, including views described below for identifying and highlighting targets, obscuring non-targets, rendering target points, weapon trajectories, and confidence ranges, and providing various additional sensor readings to the operator.
  • the operator controls 245 may allow the operator to identify, select, and mark targets, and to fire the weapon 225 .
  • the operator controls 245 may include a fire button 246 (to fire the weapon 225 ), and a “next target” button 247 to instruct the target component 220 to re-select the next priority target.
  • The operator controls might include only these two buttons, and need not include a joystick for aiming, tracking, etc.
  • Referring now to FIGS. 3A-3C , these drawings illustrate the operation of motorized weapons systems on three different vehicle-based mounting platforms.
  • a motorized weapon system is mounted on a stationary or moving vehicle 306 .
  • the remote weapon system 304 holds the firearm 305 , and various sensors may be installed in the frame of reference of the firearm 305 , in the frame of reference of the gimballed remote control, and/or in the frame of reference of the vehicle 306 .
  • the field of view 307 is represented by dotted lines.
  • a crosshair 301 shows the current projected point of impact.
  • The crosshair 301 is not yet on target, and it may be assumed either that the motor is engaged and driving the firearm to the target position, or that the operator has not yet confirmed the target.
  • The targeting system in these examples shows a primary target 302 , identified by a double-dashed box, and a secondary target, which has been identified but not yet targeted, shown within a single-dashed box 303 .
  • FIG. 3B shows a similar set of components, but in this case, the scenario is a maritime use with an armed boat 306 as the vehicle.
  • FIG. 3C shows yet another scenario in which the vehicle 306 is a helicopter.
  • FIG. 3C also illustrates that the system may identify multiple secondary targets 303 within the field of view 307 .
  • Referring to FIG. 4 , a flow diagram is shown illustrating a process by which a motorized weapon system may identify, target, engage, and fire on one or more targets.
  • the steps in this process may be performed by one or more components in the example motorized weapon system 200 discussed above, such as targeting/firing system 210 and the subsystems thereof, in conjunction with the weapon/mount/motor components 225 - 235 , one or more sensor units 240 , operator interface components 245 - 250 , and/or various remote and external systems.
  • process steps described herein need not be limited to the specific systems and hardware implementations described above in FIGS. 1-3 , but may be performed within other motorized weapon systems and environments comprising other combinations of the hardware and software components described herein.
  • the components of the motorized weapon system 200 may identify and verify one or more targets, using sensor units 240 and/or additional data sources.
  • the identification and/or verification of targets may be performed fully autonomously by the system 200 .
  • Image data from cameras, and sensor data from other sensors 240 (e.g., range-to-target data, heat data, audio, etc.), may be used to identify and verify such targets.
  • data from additional sources may be used as well, including imagery or sensor data from remote sensor or imaging systems (e.g., other weapons systems 200 , fixed cameras, drones, satellites, etc.).
  • the targeting/firing system 210 may be configured to calculate approximate range data using passive ranging techniques. For example, heights of known objects (or presumed heights) may be used to calculate the distance of those objects from the weapon system 200 . Additional sources of target data also may be received via communication modules 212 , which may include the GPS coordinates of targets, or bearing to targets, received from a command center. Such image data and other sensor data received from additional data sources may be used by the targeting/firing system 210 to triangulate or confirm a target's location, or verify the identity of a target, etc.
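  • A minimal sketch of such passive ranging, assuming a pinhole camera model and a presumed object height (the function and parameter names are hypothetical):

    def passive_range_m(presumed_height_m, pixel_height, focal_length_px):
        """Estimate range from apparent size under the pinhole model:
        range = (true height * focal length in pixels) / height in pixels."""
        return presumed_height_m * focal_length_px / pixel_height

    # An object presumed 2.5 m tall spanning 40 pixels, imaged at an
    # effective focal length of 2000 pixels, is roughly 125 m away.
    print(passive_range_m(2.5, 40, 2000))  # -> 125.0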
  • target identification and target verification refer to related but separate techniques.
  • Target identification (or target detection) refers to the analysis of camera images, sensor data, etc., to detect objects and identify the detected objects as potential targets for the weapon system 200 (e.g., vehicles, structures, weapons, individuals, etc.), rather than as generally non-target objects such as rocks, trees, hills, shadows, and the like.
  • Target verification (or target confirmation) refers to additional analyses of the same images/sensor data, and/or of additional sources of image/sensor data, to determine whether or not the identified potential target should be selected for targeting by the weapon system 200 .
  • Target verification techniques may be based on the configuration of the system and priorities of the particular mission, etc.
  • target verification techniques for vehicles may include identifying the size of a vehicle target (e.g., based on image analysis, target range, heat signatures from engines, etc.), the vehicle type (e.g., based on image analysis, and comparisons to a database 214 of target/non-target images), the presence of weapons on a target or proximate to a target, etc.
  • The size, shape, color, movement, audio and heat signatures of a vehicle may be analyzed to determine whether that vehicle is a drone, helicopter, aircraft, boat, tank, truck, jeep, or car, whether the target is a military or civilian vehicle, the number of individuals and/or weapons on the vehicle, and the like, all of which may be used by a rules database 213 to determine whether the vehicle is a target or a non-target.
  • Target verification also may include identifying particular insignia on targets, and for human targets, facial recognition and/or biometric recognition to confirm the identity of the target.
  • both target identification and target verification in step 401 may be performed fully autonomously by the weapon system 200 , using the techniques described above.
  • target identification and/or verification may include semi-autonomous or manual steps.
  • the rules of engagement for particular operations may require that each target be visually confirmed by a human operator.
  • Such visual confirmation may be performed by the operator, as described in steps 406 - 407 below. Additionally or alternatively, the visual confirmation may be received from a different user, such as a commanding officer at a remote command center or other authorized user.
  • the weapon system 200 may be configured to transmit imagery and other sensor data to one or more remote locations, and then to receive the instructions identifying the potential target as a selected target or a non-target, from the remote authorized user/command center via a communication module 212 .
  • These remote visual confirmation techniques may be entirely transparent with respect to the operator of the weapon system 200 in some cases; that is, if a target is not selected/confirmed by a remote authorized user, then that target might not ever be rendered or selected via the operator display device and/or might not be selectable by the operator during steps 406 - 407 .
  • both target identification and target selection in step 401 may be based on sets of rules received via a rules database 213 or other sources.
  • Target selection rules may be based on target type (e.g., types of vehicles, individuals (if any), and structures, etc.), target size, target distance, the presence and types of weapons on a target, the uniform/insignia on a target, and the like.
  • Additional rules may relate to the probability that the target has been accurately identified (e.g., level of confidence of facial recognition, vehicle type identification, insignia recognition, etc.), the probability that the weapon system 200 will be able to hit the selected target (e.g., based on target distance, target movement, weapon and ammunition type, wind and weather conditions, etc.), and/or the presence of potential collateral damage that may occur if the target is fired upon (e.g., based on detection of friendly and non-targets in the proximity of the identified target).
  • Different sets of rules may be applied for different operators, different weapons 225 and ammunition types, different times, and/or different physical locations for the engagement.
  • Target identification, selection, and prioritization rules may be selected and applied by the targeting/firing system 210 for an engagement with a particular operator, at a particular date and time, using a particular weapon/ammunition type, in a particular country/region of the engagement, having particular lighting or weather conditions, and so on.
  • an entirely different set of target identification, selection, and prioritization rules may be selected and applied by the targeting/firing system 210 if one or more of these variables (e.g., operator, time, weapon or ammunition type, engagement location or environmental conditions, etc.) changes.
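  • One hypothetical way to realize such context-dependent rule selection is a most-specific-match lookup keyed on engagement variables; the key structure, rule-set names, and wildcard scheme below are illustrative assumptions only:

    # None acts as a wildcard; more specific keys are tried first.
    RULE_SETS = {
        ("op_17", "7.62mm", "region_A"): "rules_v3_strict",
        (None,    "7.62mm", "region_A"): "rules_v2",
        (None,    None,     "region_A"): "rules_v1",
        (None,    None,     None):       "rules_default",
    }

    def select_rules(operator, weapon_type, region):
        """Return the most specific rule set matching the current operator,
        weapon/ammunition type, and engagement region."""
        for key in ((operator, weapon_type, region),
                    (None, weapon_type, region),
                    (None, None, region),
                    (None, None, None)):
            if key in RULE_SETS:
                return RULE_SETS[key]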
  • the targeting/firing system 210 of the motorized weapon system 200 may be configured to prioritize the multiple targets, thereby determining a firing order.
  • Target prioritization techniques similarly may be based on imagery and sensor data, as well as sets of operational rules that may apply to operators, weapons, locations, etc.
  • Examples of target prioritization rules may include, without limitation, rules that prioritize vehicles over human targets, certain types of vehicles over other types of vehicles, armored vehicles over non-armored vehicles, armed targets over non-armed targets, uniformed/insignia-bearing targets over non-uniformed or non-insignia targets, close targets over far targets, advancing targets over stationary or retreating targets, higher confidence targets (i.e., higher probability of the weapon being able to hit the target) over lower confidence targets, targets firing weapons over targets not firing weapons, and/or any combination of these criteria.
  • the targeting/firing system 210 may evaluate the current target distance and trajectory of all advancing and armed targets (e.g., missiles, drones, ground vehicles, and individuals, etc.), in order to prioritize the targets in the order in which they would first reach the current position (or future position) of the weapon system 200 .
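  • A minimal sketch of this arrival-time prioritization (the target records and field names are hypothetical):

    def time_to_reach_s(distance_m, closing_speed_mps):
        """Seconds until an advancing target reaches the weapon position;
        stationary or retreating targets (closing speed <= 0) sort last."""
        return distance_m / closing_speed_mps if closing_speed_mps > 0 else float("inf")

    targets = [
        {"id": "drone-1",   "distance_m": 800.0,  "closing_mps": 40.0},
        {"id": "vehicle-2", "distance_m": 1200.0, "closing_mps": 15.0},
        {"id": "static-3",  "distance_m": 300.0,  "closing_mps": 0.0},
    ]
    # Highest priority first: drone-1 (20 s), vehicle-2 (80 s), static-3 (never).
    targets.sort(key=lambda t: time_to_reach_s(t["distance_m"], t["closing_mps"]))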
  • These target prioritization rules also may include rules determining how particular types of targets may be targeted. For example, such rules may include the desired point of impact for a particular target type (e.g., the engine of a boat, the center of mass of an individual, etc.).
  • rules or algorithms may be applied for prioritizing targets, depending on the current operator, current location, current date/time, and/or based on predefined operation-specific rules of engagement. Further, rules or algorithms for prioritization may be based on or adjusted in view of current conditions, such as the current amount of ammunition of the weapon system 200 (e.g., lower ammunition circumstances may cause prioritization of most valuable/important targets first), the current wind or weather conditions (e.g., in which closer and/or higher confidence targets may be prioritized), or based on nearby friendly or non-hostile targets (e.g., in which closer and/or higher confidence targets may be prioritized).
  • certain prioritizing algorithms may adjust the priorities of a set of targets to reduce and/or minimize the lag time between successive firings of the weapon, for instance, by prioritizing a set of nearby targets successively in the priority rank order, in order to reduce the firing latency time required to drive the weapon 225 through the sequence of targets.
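  • For illustration, a greedy nearest-neighbor ordering over the targets' bearings approximates this lag-reducing behavior (a sketch assuming a single azimuth axis and a fixed slew rate; names and values are hypothetical):

    def slew_time_s(a_deg, b_deg, slew_rate_dps=60.0):
        """Approximate time to drive the mount between two azimuths,
        taking the shorter way around the circle."""
        diff = abs(a_deg - b_deg) % 360.0
        return min(diff, 360.0 - diff) / slew_rate_dps

    def order_to_reduce_slew(current_az_deg, target_azimuths):
        """Greedy ordering: always engage next the target requiring the
        least additional mount travel from the last aim point."""
        remaining = list(target_azimuths)
        order, az = [], current_az_deg
        while remaining:
            nearest = min(remaining, key=lambda t: slew_time_s(az, t))
            order.append(nearest)
            remaining.remove(nearest)
            az = nearest
        return order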
  • Operators may be permitted to switch on-the-fly between different rules or algorithms for target selection and prioritization. Such switching capabilities may be based on the rank and/or authorization level of the operator, and in some cases may require that a request for approval be transmitted from the weapons system 200 to a higher-level user at a remote command center.
  • Referring to FIG. 5 , a display screen is shown displaying an example user interface 500 that may be generated by a motorized weapon system 200 during engagement of a set of targets.
  • a plurality of targets have been identified and selected within the range and proximity of the weapon system 200 .
  • the targets have been prioritized to select a primary target 501 , several secondary targets 502 , and several non-targets 503 (e.g., friendly or non-hostile vehicles or individuals).
  • The primary target 501 is indicated with a double dotted line, the secondary targets 502 are indicated with a single dotted line, and the non-targets have no lines.
  • example user interface 500 includes two operator controls: a fire button 510 to allow the user to fire the weapon 225 , and a next button 515 to allow the user to select the next target in the priority list.
  • Fire button 510 is shaded, indicating that the weapon 225 cannot currently be fired. As described below in more detail, this may represent a feature in which the operator's firing control mechanism 246 is disabled whenever the weapon 225 is not currently aimed at a selected target.
  • the next button 515 is enabled in this example, indicating that the next mechanism 247 that allows the operator to change the primary target 501 to the next highest priority target 502 in the priority list may be enabled even when the crosshairs 505 are not yet positioned on the primary target 501 .
  • the kill chain sequence may continue by performing the functionality of steps 403 - 410 in a continuous loop for each of the targets selected in step 401 , and in the priority order of the target prioritization performed in step 402 . Therefore, the first iteration of steps 403 - 410 may be performed for the highest priority target, the second iteration of steps 403 - 410 may be performed for the second highest priority target, and so on.
  • the targeting/firing system 210 may perform a dynamic tracking technique to determine a firing solution for that target.
  • a firing solution refers to a precise firing position for the weapon (e.g., an azimuth/horizontal angle and altitude/elevation angle) and a precise firing time calculated by the targeting/firing system 210 to hit the primary target.
  • For stationary targets, target tracking need not be performed, and the firing solution may be computed based on a number of factors, including the target distance and target bearing from the weapon 225 , the muzzle velocity of the weapon 225 , the aerodynamic drag of the projectile/ammunition to be fired, the wind and weather conditions, and gravity (any one of which may vary based on the current conditions).
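  • As a simplified, drag-free illustration of such a firing solution for a stationary target (a first-order hold-over sketch; an actual solution would also model drag, wind, and the other factors listed above):

    import math

    def elevation_correction_deg(range_m, muzzle_velocity_mps, g=9.81):
        """First-order hold-over: approximate flight time as
        range / muzzle velocity, then raise the barrel enough to offset
        the gravitational drop accumulated over that flight time."""
        t_flight = range_m / muzzle_velocity_mps
        drop_m = 0.5 * g * t_flight ** 2
        return math.degrees(math.atan2(drop_m, range_m))

    # 800 m at 850 m/s muzzle velocity -> about 0.31 degrees of hold-over.
    print(elevation_correction_deg(800.0, 850.0))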
  • For moving targets, however, dynamic target tracking may be required to generate a firing solution, introducing additional variables which may increase the complexity and uncertainty of the firing solution calculation.
  • dynamic target tracking may involve calculating the anticipated direction and velocity of the target.
  • The targeting/firing system 210 may assume that the primary target will continue along its current course with the same velocity and direction. If the target is currently moving along a curved path, and/or is currently accelerating or decelerating, then the targeting/firing system 210 may assume the same curved path and/or the same acceleration/deceleration pattern, and may extrapolate out based on those variables. Further, in some embodiments, the targeting/firing system 210 may anticipate future changes in course or speed, based on factors such as upcoming obstructions in the target's path, curves in roads, previous flight patterns, etc.
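  • A minimal sketch of this constant-velocity/constant-acceleration extrapolation (two-dimensional, applied per axis; the names are illustrative):

    def extrapolate_position(pos, vel, accel, t):
        """Predict a tracked target's position t seconds ahead using
        p(t) = p0 + v0*t + 0.5*a*t^2, independently per axis."""
        return tuple(p + v * t + 0.5 * a * t * t
                     for p, v, a in zip(pos, vel, accel))

    # A target at (100 m, 50 m) moving (10, 0) m/s and decelerating at
    # (-1, 0) m/s^2 is predicted at (118.0, 50.0) two seconds out.
    print(extrapolate_position((100.0, 50.0), (10.0, 0.0), (-1.0, 0.0), 2.0))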
  • the determination of a firing solution for a moving target also may take into account the anticipated time to drive the motor 235 so that the weapon is positioned at the correct firing point, and the anticipated amount of time between the firing command and when the projectile/ammunition will reach the target.
  • the time to drive the motor 235 may be calculated based on the distance the gun is to be driven, the speed of the motor and/or the weight of the weapon 225 .
  • the amount of time between receiving a firing command and when the projectile/ammunition will reach the target may be based on the muzzle velocity of the weapon 225 , the aerodynamic drag of the projectile/ammunition to be fired, the wind and weather conditions, etc.
  • An anticipated delay for operator reaction time (e.g., 0.5 seconds, 1 second) also may be included in the firing solution calculation.
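  • Combining these delays, one hypothetical way to compute a firing-solution point is a fixed-point iteration: predict where the target will be after the motor-drive, operator-reaction, and projectile-flight delays, where the flight delay itself depends on the predicted position (a straight-line, drag-free, two-dimensional sketch; the delay values are assumptions):

    import math

    def firing_point(target_pos, target_vel, weapon_pos, muzzle_velocity_mps,
                     motor_drive_s=0.5, operator_delay_s=0.5, iterations=5):
        """Iteratively solve for the aim point: the target's predicted
        position at (motor drive time + operator delay + flight time)."""
        aim = target_pos
        for _ in range(iterations):
            dist = math.hypot(aim[0] - weapon_pos[0], aim[1] - weapon_pos[1])
            t_total = motor_drive_s + operator_delay_s + dist / muzzle_velocity_mps
            aim = (target_pos[0] + target_vel[0] * t_total,
                   target_pos[1] + target_vel[1] * t_total)
        return aim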
  • Referring to FIG. 6 , another example user interface 600 is shown that may be generated by a motorized weapon system 200 during engagement of one or more targets.
  • the targeting/firing system 210 has assessed that the target 601 is moving toward the lower-right direction of the interface 600 . Based on the factors discussed above, namely (a) the anticipated movement of the target 601 , (b) the time required to engage the motor 235 and drive the weapon to the firing point, and (c) the time for the projectile/ammunition to be fired and reach the target, the targeting/firing system 210 may calculate the firing solution.
  • The crosshairs 605 represent the point at which the weapon 225 is currently aimed, the point 606 represents the desired point of impact on the target 601 , and point 607 represents the firing solution determined by the targeting/firing system 210 .
  • the motor 235 is currently re-positioning the weapon toward the firing solution point 607
  • The firing solution computation has taken into account the time to reposition the weapon 225 and the projectile time-to-target. Potentially, the firing solution computation also may take into account a short time delay to fire the weapon, and/or an anticipated operator decision time delay.
  • example interface 600 also includes three operator controls: a fire button 610 , a next button 615 , and a safe button 620 .
  • the fire button 610 allows the operator to fire the weapon 225 , but in some cases might be enabled only after the weapon 225 has reached the firing solution point 607 .
  • the next button 615 allows the operator not to fire the weapon 225 at the primary target 601 , but instead to re-select the next highest priority target in the priority list. In this example, the primary target 601 may be moved to the back of the priority list or elsewhere in the priority list, based on the operator's selection of the next control 615 .
  • the safe button 620 allows the operator to mark the currently selected primary target 601 as a friendly or non-target object, thereby removing it from the set of selected targets determined in step 401 and priority list of step 402 .
  • the configuration settings of the targeting/firing system 210 may determine that a target marked as safe by an operator during one engagement might thereafter be excluded from target selection/prioritization in future engagements.
  • weapon system 200 may transmit data identifying any targets marked as safe to other weapons systems 200 in the same general location, so that those other weapons systems 200 may automatically remove the target marked as safe from their target selection/prioritization lists as well.
  • Although step 404 was described above as performed for only a single target (i.e., the current highest priority target), in some embodiments the targeting/firing system 210 may continuously perform dynamic tracking for all targets selected/prioritized in steps 401 - 402 . In such cases, by performing dynamic tracking on the selected secondary target(s) before the completion of the firing sequence 403 - 410 for the primary target, the targeting/firing system 210 may more quickly and efficiently determine the firing solution for the next primary target as soon as the firing sequence 403 - 410 is completed for the first primary target.
  • the targeting/firing system 210 may potentially re-order the prioritization sequence determined in step 402 , for example, based on movement of the secondary targets and/or based on newly received data about one or more of the secondary targets (e.g., improved verification information, additional threat information, etc.).
  • The targeting/firing system 210 may engage the motor 235 to drive the orientation of the weapon 225 toward the firing solution determined for the primary target in step 404 .
  • the motor 235 may be engaged to aim the weapon 225 from its currently aimed position 605 , to the determined firing solution point 607 . It may be noted from this example, that (a) the weapon 225 may be driven not toward the current position point of the target 606 , but instead to the future position point 607 , and (b) that the motor 235 may be engaged and the weapon 225 may be driven to this point by the targeting/firing system 210 in a fully autonomous manner, before any action has been taken by the operator to view, select, mark, or engage this target.
  • the targeting/firing system 210 may generate and transmit a user interface to be rendered for the operator via one or more display devices 250 .
  • the human operator may be located at the weapon system 200 or remote to the weapon system 200 , in which case the user interface may be transmitted via the communication module 212 over one or more secure computer networks, wireless networks, satellite networks, etc.
  • the user interface provided in step 406 may correspond to user interfaces 500 and/or 600 discussed above, although several variations may be implemented in different embodiments.
  • the primary target 501 may be marked by a particular scheme that is different from the secondary targets and from non-targets.
  • the user interface may automatically zoom in on the primary target (as in screen 600 ) to allow the operator the best possible visual of the target. Additionally or alternatively, secondary targets and/or non-targets may be blocked out, hidden, or otherwise obscured to prevent confusion or distraction by the operator. Further, in different embodiments, each of the various different target points discussed above (e.g., crosshairs 605 representing current weapon aiming point, the current target position point 606 , and/or firing solution target point 607 ) may or may not be rendered within the user interface, and/or may be shown in different colors, using different graphics and icons, etc.
  • the user interface generated and rendered in step 406 may include additional components such as side menus, overlays, and the like, to convey any relevant sensor information about the target or the firing environment.
  • Examples of such sensor information that may be included in the operator user interface include the target type, the verified target name/identifier (if known) and the confidence level of that verification, distance to target, current wind and weather conditions, current status of the weapon 225 and ammunition supply, number of other secondary targets, etc.
  • the targeting/firing system 210 may receive engagement instructions from the operator, via operator controls 245 .
  • the operator controls might only include two buttons: a fire button and next button.
  • the operator controls might include only three buttons: a fire button, a next button, and safe button.
  • While any number of different/additional operator controls may be included in other embodiments (e.g., mouse/joystick for aiming, manual override, target selection controls, etc.), there are certain technical advantages associated with a limited interface, such as the two-button or three-button interfaces shown in screens 500 - 600 , including simplification of the operator interface, reduction of real-time operator errors, increased speed to weapon firing, etc.
  • the dynamic target tracking may continue for the primary target as well as the secondary targets selected by the targeting/firing system 210 .
  • the firing solution may be updated during this time delay and the motor 235 may be continuously engaged so that the weapon 225 is continuously aimed at the most recent firing solution target point.
  • the target identification, selection, and prioritization techniques discussed above in steps 401 and 402 may be updated, automatically and entirely transparently to the operator, to re-select and re-prioritize the targets based on new imagery, sensor data, and other relevant data received during the time delay between steps 406 - 407 .
  • the targeting/firing system 210 may perform the received instructions in steps 408 - 410 .
  • the fire command ( 408 ) is an operator instruction to fire the weapon 225 , and in some cases might be enabled only after the weapon 225 has reached the firing solution target point.
  • the targeting/firing system 210 may initiate firing of the weapon 225 , and then return to perform steps 403 - 410 for the next highest priority target. Additionally, in some embodiments, the targeting/firing system 210 may be configured to evaluate the accuracy of the projectile fired in step 410 , and may perform a real-time automatic correction in the targeting algorithm based on the accuracy evaluation. For example, upon firing a shot in step 410 , the targeting/firing system 210 may be configured to activate one or more cameras or sensors from sensor units 240 (which may be local or remote), to detect the landing time and location of the projectile.
  • Additional sensors such as audio sensors, heat sensors, etc., also may be used to determine where the projectile hit/landed.
  • The projectile landing/hit data may be compared to the firing solution/target point data that was determined by the targeting/firing system 210 prior to firing the projectile. If the shot was off target by an amount greater than a predetermined accuracy threshold, then the targeting/firing system 210 may be configured to adjust its targeting algorithm in real-time, so that the updated algorithm may be used in the next iteration of steps 403 - 410 . Additionally, if the shot was off target by a sufficient amount that the target was missed, then the targeting/firing system 210 may be further configured to re-insert the previously fired-upon target back into the priority list of selected targets.
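  • A minimal sketch of such an automatic correction, assuming a simple exponentially weighted bias model (the class, inputs, learning rate, and threshold are illustrative assumptions):

    import math

    class AimBiasCorrector:
        """Learn a systematic aim offset from observed shot results and
        subtract it from subsequent firing solutions."""

        def __init__(self, learning_rate=0.3, accuracy_threshold_m=1.0):
            self.bias = (0.0, 0.0)  # learned (x, y) offset, meters
            self.lr = learning_rate
            self.threshold = accuracy_threshold_m

        def observe_shot(self, intended, impact):
            """Nudge the bias when a shot misses the intended point by
            more than the predetermined accuracy threshold."""
            miss = (impact[0] - intended[0], impact[1] - intended[1])
            if math.hypot(miss[0], miss[1]) > self.threshold:
                self.bias = (self.bias[0] + self.lr * miss[0],
                             self.bias[1] + self.lr * miss[1])

        def corrected(self, aim):
            """Apply the learned correction to the next aim point."""
            return (aim[0] - self.bias[0], aim[1] - self.bias[1])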
  • the next command is an operator instruction not to fire the weapon 225 at the target, but to retain the target within the set of selected targets/target priority list, and then to re-select the next highest priority target in the priority list.
  • A next command in step 409 may cause the target to be placed at the back of the priority list of selected targets, or may cause the target to be placed immediately after the next highest priority target in the priority list.
  • a safe command is an operator instruction to mark the target as a friendly or non-target object, thereby removing it from the set of selected targets and target priority list.
  • the target may not be selected again by the targeting/firing system 210 , during at least the current engagement by the current weapon system 200 .
  • a target marked as safe during step 410 during an engagement at one weapon system 200 also might be excluded from target selection in future engagements of the weapon system 200 , and/or during current and future engagements at different weapons systems 200 .
  • The various techniques discussed above with reference to FIG. 4 , including without limitation: (a) autonomous target selection, prioritization, and re-selection by the targeting/firing system 210 ; (b) dynamic target tracking of both the primary target and secondary targets that takes into account target movement, weapon/projectile characteristics, etc.; (c) autonomous actuation of the motor to automatically orient the weapon toward the primary target before receiving any operator input; (d) a simplified user interface and operator controls; and (e) enabling/disabling of the firing mechanism depending on the projected point of impact of the weapon, alone and in combination, provide increased system efficiency, increased rate of firing, improved weapon system accuracy, and reduced operator error, along with the other technical advantages described herein.
  • certain aspects of the present disclosure relate to techniques for disabling and re-enabling an operator firing control (e.g., 246 ), during the period of time when the motor 235 of a motorized weapon system 200 is engaged and the weapon 225 is being positioned and oriented toward a determined target point for firing.
  • the process of engaging the motor 235 of the weapon system 200 to position the weapon 225 to fire on a particular target point may take anywhere from a fraction of a second to several seconds, depending on factors such as the motor size and speed, gun size and weight, angular distance to be traveled, etc.
  • the projected point of impact of a projectile fired from the weapon 225 may become closer and closer to the target point, and similarly, the likelihood of hitting the target may increase continuously until a maximum likelihood is reached when the projected point of impact of the weapon 225 (e.g., marked by crosshairs 505 , 605 , etc.) is directly on the determined firing solution target point.
  • the probability of hitting the target might never be 100 %.
  • If the likelihood of hitting the target is determined to be sufficiently high, e.g., above a predetermined likelihood threshold, then the targeting/firing system 210 may be configured to enable firing of the weapon 225 (and/or automatically fire the weapon 225).
  • the targeting/firing system 210 may be configured to determine if/when the predetermined likelihood threshold for hitting the target is reached during the time period when the motor 235 is engaged in positioning the weapon 225 , but before the crosshairs 505 are directly on the target (i.e., before the projected point of impact of the weapon 225 is directly on the determined firing solution target point).
  • the targeting/firing system 210 may be configured to disable the operator firing mechanism 246 when the current likelihood of hitting the target is below the predetermined likelihood threshold, based on the position/orientation of the weapon 225 and other factors. The operator firing mechanism 246 then may be re-enabled in response to the targeting/firing system 210 determining that the current likelihood of hitting the target is above the predetermined likelihood threshold.
  • Referring to FIG. 7, a flow diagram is shown illustrating a process of disabling and/or re-enabling the firing mechanism of a motorized weapon system while the motor is engaged to move the weapon to a target point.
  • the steps in this process may be performed by one or more components in the example motorized weapon system 200 discussed above, such as targeting/firing system 210 and the subsystems thereof, in conjunction with the weapon/mount/motor components 225 - 235 , one or more sensor units 240 , operator interface components 245 - 250 , and/or various remote and external systems.
  • process steps described herein such as determination of likelihood thresholds for hitting targets, and corresponding boundary areas for motorized weapons systems, need not be limited to the specific systems and hardware implementations described above in FIGS. 1-3 , but may be performed within other motorized weapon systems and environments comprising other combinations of the hardware and software components described herein.
  • In step 701, a motorized weapon system 200 has identified and selected a particular target, and determines a firing solution and/or target point for the selected target.
  • step 701 may be similar or identical to step 404 discussed above.
  • one or both of the target and the weapon system 200 may potentially be moving during this process.
  • the firing solution target point may be computed based on factors including the target distance, target bearing from the weapon 225 , muzzle velocity of the weapon 225 , aerodynamic drag of the projectile/ammunition to be fired, the wind and weather conditions, and gravity (any one of which may vary based on the current conditions).
  • dynamic target tracking may be required to generate a firing solution, and additional variables may increase the complexity and uncertainty of the firing solution calculation.
  • dynamic target tracking may be used to determine the current velocity and direction of travel of both the weapon system 200 and the target, and that data may be used to calculate the anticipated velocity and direction of travel of both in the near future.
  • the targeting/firing system 210 may assume that both the weapon system 200 and the target may continue along their current course with the same velocity and direction, and if either is currently moving along a curved path and/or is currently accelerating/decelerating, then the targeting/firing system 210 may assume the same curved path and/or the same acceleration/deceleration in the near future.
  • the determination of a firing solution (e.g., predicted future coordinates at a future firing time) may be based on these anticipated movement calculations.
  • the targeting/firing system 210 may build in an anticipated delay for operator reaction time (e.g., 0.5 seconds, 1 second) which may be included in the firing solution calculations for moving targets.
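  • For illustration only, a minimal Python sketch of such a prediction, assuming straight-line constant-acceleration motion and a fixed operator reaction delay (the function name and parameter values are hypothetical, not the patent's method):

```python
def predict_target_point(pos, vel, accel, time_of_flight_s, reaction_delay_s=0.5):
    """Extrapolate a target's position over the operator reaction delay plus
    the projectile time of flight, assuming constant acceleration."""
    t = reaction_delay_s + time_of_flight_s
    return tuple(p + v * t + 0.5 * a * t * t for p, v, a in zip(pos, vel, accel))

# Example: target at (100, 40) m moving +x at 5 m/s and decelerating 0.5 m/s^2,
# with a 1.5 s projectile time of flight and the default 0.5 s reaction delay.
future_point = predict_target_point((100.0, 40.0), (5.0, 0.0), (-0.5, 0.0), 1.5)
```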
  • In step 702, the targeting/firing system 210 of the motorized weapon system 200 may determine a boundary area surrounding the target point determined in step 701.
  • the boundary area may be referred to as a “confidence lock” boundary, because as discussed below, the firing mechanism may be disabled when the projected point of impact of the weapon is outside of this area.
  • the boundary area may be a circle or other two-dimensional closed shape surrounding the target point. A simple example of a circular boundary area 807 is shown in FIGS. 8A-8B , discussed in more detail below.
  • the boundaries of the area may correspond to a predetermined likelihood threshold of hitting the target and need not be any particular shape.
  • the likelihood of the weapon 225 hitting the target may be calculated as a probability P, which may be the same for every point on the boundary of the area and is also the same as a predetermined likelihood threshold set by the targeting/firing system 210 .
  • For any shot taken when the weapon crosshairs are outside of the boundary area, the likelihood of hitting the target is less than P, and for any shot taken when the weapon crosshairs are inside of the boundary area, the likelihood of hitting the target is greater than P.
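  • As a non-authoritative sketch of how such an equal-probability boundary could be computed, the following Python fragment assumes an isotropic Gaussian aim-dispersion model (with sigma aggregating the uncertainty factors discussed below, e.g., weapon precision, wind, vibration) and finds the boundary radius by Monte Carlo estimation and bisection; the model and names are illustrative assumptions, not the patent's method:

```python
import math
import random

def hit_probability(aim_offset, target_radius, sigma, trials=20000):
    """Estimate P(hit) when aiming `aim_offset` away from the target center,
    with isotropic Gaussian dispersion of standard deviation `sigma`
    (all quantities in consistent angular units, e.g., milliradians)."""
    hits = 0
    for _ in range(trials):
        x = aim_offset + random.gauss(0.0, sigma)
        y = random.gauss(0.0, sigma)
        if math.hypot(x, y) <= target_radius:
            hits += 1
    return hits / trials

def boundary_radius(p_threshold, target_radius, sigma):
    """Bisect for the aim offset at which P(hit) ~= p_threshold. Every point
    at this offset has (approximately) the same hit probability, so the
    confidence-lock boundary is a circle of this radius around the target
    point. Assumes P(hit) >= p_threshold when aimed dead-on; Monte Carlo
    noise makes the result approximate."""
    lo, hi = 0.0, target_radius + 6.0 * sigma
    for _ in range(25):
        mid = 0.5 * (lo + hi)
        if hit_probability(mid, target_radius, sigma) >= p_threshold:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)
```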
  • the boundary area may be circular, as shown in FIGS. 8A-8B .
  • Circular boundaries may generally apply when the determined probability P is the probability of hitting the target point.
  • the boundary area may be target-shaped (e.g., a larger vehicle-shaped boundary surrounding the target vehicle, a larger person-shaped boundary surrounding the target person, etc.).
  • the boundary area may assume a more elongated shape in the direction of the movement, to account for the additional targeting uncertainties caused by the movement of the weapon system 200 or target.
  • the boundary area may be shaped like a horizontally-elongated circle (or horizontally-elongated vehicle shape).
  • the boundary area may be defined in terms of angular coordinates (e.g., azimuth and altitude) from the perspective of the weapon 225 .
  • the size of the boundary area determined in step 702 may be based on any combination of factors that may introduce uncertainty in the point of impact calculation of the weapon 225 with respect to the target.
  • the size of the boundary area (e.g., in terms of angular degrees or coordinates) may be based on one or more of the target size, distance between the weapon 225 and the target, the general accuracy and precision data for the weapon type 225 and ammunition type, and other factors such as wind, vibration level of the weapon 225 during movement by the motor, and current movement of the weapon system 200 and/or the target.
  • depending on these factors, the boundary area may be relatively small in some cases and relatively large in others.
  • In step 703, the targeting/firing system 210 engages the motor 235 to position and orient the weapon 225 toward the target point identified in step 701.
  • step 703 may be similar or identical to step 405 , discussed above.
  • for a stationary target, the engagement of the motor 235 may drive the position and orientation of the weapon 225 so that its predicted point of impact falls on the stationary target point 606.
  • alternatively, for a moving target, the engagement of the motor 235 may drive the position and orientation of the weapon 225 to a separate predicted future target point (e.g., 607) determined by a firing solution calculation based on predicted target movement and anticipated time delays until firing and impact.
  • In step 704, the targeting/firing system 210 may compute the projected point of impact if a projectile were fired from the weapon 225 at that time.
  • the projected point of impact corresponds to the calculation of the crosshairs (e.g., 505 and 605 ) discussed above and shown in FIGS. 5 and 6 .
  • the calculation of the projected point of impact may be based on the specifications of the weapon system 200 and/or collected sensor data, such as the current position and orientation of the gun, the distance to target and bearing of the target from the weapon 225 , the muzzle velocity of the weapon 225 , the aerodynamic drag of the projectile to be fired, the current wind and weather conditions, and gravity (which may vary based on the current elevation).
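  • By way of illustration only, a first-order Python sketch of such a point-of-impact calculation, reduced to gravity drop and linear crosswind drift; the crude constant drag factor and all names are assumptions, and a real ballistic solver would integrate the full trajectory:

```python
import math

def projected_point_of_impact(azimuth_rad, elevation_rad, distance_m,
                              muzzle_velocity_mps, crosswind_mps=0.0,
                              drag_factor=1.0, g=9.81):
    """First-order projection: time of flight from range and muzzle velocity
    (inflated by a crude drag factor), then gravity drop and wind drift,
    converted back to angular offsets at the target range."""
    tof = drag_factor * distance_m / muzzle_velocity_mps   # time of flight, s
    drop_m = 0.5 * g * tof ** 2                            # gravity drop, m
    drift_m = crosswind_mps * tof                          # crosswind drift, m
    return (azimuth_rad + math.atan2(drift_m, distance_m),
            elevation_rad - math.atan2(drop_m, distance_m))
```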
  • In step 705, the targeting/firing system 210 may compare the projected point of impact computed in step 704 to the “confidence lock” boundary area defined in step 702. This may be a straightforward comparison of angular coordinates from the perspective of the weapon 225. If the current point of impact of the weapon 225 is projected to fall outside of the defined boundary area (705:No), then in step 706 the targeting/firing system 210 may disable the operator firing mechanism 246, thereby preventing the weapon 225 from being fired.
  • Conversely, if the current point of impact of the weapon 225 is projected to fall within the defined boundary area (705:Yes), then in step 707 the targeting/firing system 210 may enable (or re-enable) the operator firing mechanism 246, thereby allowing the operator to fire the weapon 225.
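  • A minimal sketch of this disable/enable loop (steps 704-707) in Python, assuming a hypothetical `system` interface exposing the weapon state and firing control; the polling period and all method names are illustrative assumptions:

```python
import math
import time

def confidence_lock_loop(system, target_point, boundary_radius_rad, period_s=0.1):
    """While the motor is engaged, periodically compare the projected point of
    impact (azimuth/elevation) to the confidence-lock boundary and toggle the
    operator firing control accordingly."""
    while system.motor_engaged():
        az, el = system.projected_point_of_impact()        # step 704
        offset = math.hypot(az - target_point[0], el - target_point[1])
        if offset <= boundary_radius_rad:                  # step 705
            system.enable_fire_control()                   # step 707
        else:
            system.disable_fire_control()                  # step 706
        time.sleep(period_s)
```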
  • the targeting/firing system 210 may be configured to perform a rapid post-firing command movement of the weapon 225 in order to further improve shot confidence. For instance, after the operator pushes the enabled firing mechanism 246 , rather than immediately firing the weapon 225 , the targeting/firing system 210 in some cases may engage the motor 235 for a short amount of time (e.g., 50 ms, 100 ms, 200 ms, etc.), in response to a determination that the corresponding small weapon movement may significantly increase shot confidence.
  • These short post-firing command movements may be performed in the case of moving targets and/or moving weapon systems 200 , in the event of a sudden change in the trajectory of the target, to correct for a lag in operator reaction time, and/or as part of a firing burst to increase hit probability.
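  • One possible shape for this post-firing-command adjustment, sketched in Python under heavy assumptions (the `plan_micro_movement` planner, gain metric, and time budget are all hypothetical):

```python
def fire_with_micro_adjust(system, max_adjust_s=0.2, min_confidence_gain=0.1):
    """On an operator fire command, briefly engage the motor before firing if
    a small movement would significantly increase shot confidence."""
    movement, gain = system.plan_micro_movement(max_adjust_s)  # hypothetical planner
    if gain > min_confidence_gain:
        system.engage_motor(movement, duration_s=max_adjust_s)  # e.g., 50-200 ms
    system.fire()
```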
  • In FIGS. 8A and 8B, two example user interface screens 800 are shown during a process of engaging the motor 235 of a motorized weapon system 200 to position and orient the weapon 225 at a selected target point 806.
  • a circular “confidence lock” boundary area 807 has been defined by the targeting/firing system 210 , outside of which firing of the weapon 225 is to be disabled.
  • In FIG. 8A, when the projected point of impact 805 of the weapon 225 falls outside of the boundary area 807, the operator may be unable to fire the weapon 225 (as indicated by the shaded fire button 810).
  • In FIG. 8B, the motor 235 has now oriented the weapon 225 closer to the target point 806, and the projected point of impact 805 now falls within the boundary area 807. Therefore, the fire button 810 is now re-enabled, allowing the weapon 225 to be fired by the user. It is further noted in this example that the next button 815 and the safe button 820, which are discussed above in reference to FIGS. 5-6, are active and enabled regardless of the current orientation of the weapon 225.
  • steps 704 - 707 may be performed multiple times while the motor 235 is engaged and the weapon 225 is moving toward the target point.
  • the targeting/firing system 210 may perform steps 704-707 in a continuous loop at all times while the motor 235 is engaged, or in some cases even when the motor 235 is not engaged.
  • the targeting/firing system 210 may be configured to initiate an instance of steps 704-707 in accordance with a schedule (e.g., every 100 ms, 200 ms, 500 ms, etc.).
  • these steps may be performed periodically or continuously even when the motor 235 is not moving and the crosshairs 805 are fixed on the target point 806 .
  • a new event, such as a change in movement of the target 801 or the weapon system 200, an object obscuring the target 801, and/or new sensor readings (e.g., a change in wind conditions), may temporarily cause the probability of the weapon 225 hitting the target to drop below the predetermined likelihood threshold and out of the confidence lock boundary area 807, requiring a minor adjustment via the motor 235 or other corrective action by the weapon system 200.
  • a motorized weapon system 200 may implement a minimum confidence threshold for target selection and/or prioritization.
  • this minimum confidence threshold may be a separate determination from the level of confidence computed by the system 200 for identifying or verifying a target. Rather, this minimum confidence threshold may refer to the level of confidence that the weapon system 200 is able to hit the identified target.
  • the targeting/firing system 210 may determine that the confidence level that the weapon system 200 will hit the target is not sufficiently high to fire the weapon 225.
  • Environmental conditions such as wind or weather conditions, lighting conditions, and/or other objects potentially obscuring the target object also may lower the confidence level computed by the targeting/firing system 210 for hitting the target.
  • when the confidence level computed by the targeting/firing system 210 falls below the predetermined threshold for a given target, that target may be automatically deprioritized so that it is not selectable by the operator (or selectable only via manual override). However, the targeting/firing system 210 may continue to monitor and dynamically track the low-confidence target, and may re-enable target selection and firing capabilities on that target as soon as the confidence level of hitting the target returns to above the minimum confidence threshold.
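  • For illustration, a small Python sketch of such hit-confidence gating; the names and the 0.7 threshold are assumptions, and `hit_confidence` stands in for whatever estimate the targeting/firing system maintains:

```python
def partition_by_hit_confidence(targets, hit_confidence, min_confidence=0.7):
    """Split tracked targets into operator-selectable and deprioritized sets
    based on the current estimate of the probability of hitting each one.
    Deprioritized targets remain tracked, and become selectable again as soon
    as their hit confidence recovers above the minimum threshold."""
    selectable, deprioritized = [], []
    for target in targets:
        if hit_confidence(target) >= min_confidence:
            selectable.append(target)
        else:
            deprioritized.append(target)
    return selectable, deprioritized
```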
  • the minimum confidence threshold is another operation-specific variable that may be altered based on the operation, the particular operator, the location, and other factors.
  • the firing/targeting system 210 may continuously assess and evaluate its target accuracy, which may result in the system 210 increasing or decreasing the confidence levels it had previously computed for one or more selected targets. As an example, if a first target is initially determined to be too small and too far away to have a sufficiently high confidence level for firing on the target, the firing/targeting system 210 may instead select a number of closer targets and may fire on those targets.
  • after firing on the closer targets, the firing/targeting system 210 may be better able to evaluate the range, lighting, wind conditions, and the like, so that the confidence level for hitting the first target may now be increased based on the accuracy feedback from the closer targets.
  • a motorized weapon system 200 may be weapon-agnostic, in that a weapon system 200 may support many different types or models of weapons 225, including various firearms, large caliber rifles, machine guns, autocannons, grenade launchers, rockets, and/or directed energy weapons such as lasers, high-power microwave emitters, and other undisclosed devices.
  • the targeting/firing system 210 may store weapon profiles in data store 214 and/or weapon-specific rules in data store 213 that allow the weapon system 200 to perform the techniques discussed herein in a similar or identical manner regardless of the current weapon type.
  • the targeting/firing system 210, sensor units 240, and the operator interface 245-250 may function identically regardless of the type of motor 235, mount 230, and weapon 225 integrated into the system 200. Because systems 200 having different types of weapons 225, mounts 230, and/or motors 235 may perform differently in some respects (e.g., time required to re-position and re-orient the weapon 225, maximum range of weapon, type, size, and speed of projectiles fired, etc.), the targeting/firing system 210 may be configured to initially determine these weapon-specific data factors, and adjust the techniques described herein to provide a uniform operator experience.
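  • A hypothetical sketch of such a weapon profile and one weapon-specific adjustment (estimated repositioning time, which could feed target prioritization); the fields and units are illustrative assumptions, not the actual contents of data store 214:

```python
from dataclasses import dataclass

@dataclass
class WeaponProfile:
    """Weapon-specific factors loaded at start-up so the rest of the system
    can operate identically across weapon types."""
    weapon_type: str
    muzzle_velocity_mps: float
    max_range_m: float
    slew_rate_dps: float          # motor/mount slew rate, degrees per second
    projectile_drag_factor: float

def reposition_time_s(profile: WeaponProfile, angular_distance_deg: float) -> float:
    """Estimated time to re-orient the weapon through a given angle."""
    return angular_distance_deg / profile.slew_rate_dps

# Example: two otherwise identical systems may prioritize the same targets
# differently because of their different motor slew rates.
fast = WeaponProfile("machine gun", 850.0, 1800.0, 120.0, 1.1)
slow = WeaponProfile("autocannon", 1000.0, 3000.0, 40.0, 1.05)
assert reposition_time_s(fast, 60.0) < reposition_time_s(slow, 60.0)
```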
  • the targeting/firing system 210 of a first weapon system 200 may automatically select targets based on the firing range of the weapon 225 installed on that system 200, whereas a different system 200 might select more or fewer targets based on its having a weapon 225 with a different range.
  • a first weapon system 200 may prioritize a set of selected targets taking into account the speed of the motor 235 on that system 200 , whereas a different system 200 might prioritize the same set of targets differently as a result of having a different motor speed.
  • different sensor units 240 having different numbers, types, and/or qualities of cameras and other sensors may result in different sets of input being provided to the targeting/firing systems 210.
  • a first weapon system 200 may have sufficient data to select and verify a target with high confidence, while a second weapon system 200 with different cameras/sensors 240 would not select the same target because it could not verify it with a sufficient confidence level.
  • the different behaviors of the weapon systems 200 resulting from different weapons 225 , mounts 230 , motors 235 , and/or sensor units 240 may be entirely transparent to the operator.
  • operators of weapons systems 200 need not ever know what weapon 225 they are firing, and the entire operator interface may function identically regardless of the particular weapon, motor, mount, or sensor unit.
  • Additional techniques applicable to the above examples include the implementation of operation-specific rules of engagement that may be retrieved/received and enforced by the targeting/firing system 210 .
  • specific rules of engagement and/or operational parameters for the motorized weapon system may include different requirements or parameters for target identification and selection, different minimum confidence thresholds for firing the weapon 225 , different target prioritization algorithms, and so on.
  • the motorized weapon system 200 may be configured to receive a set of operation-specific rules of engagement from a remote command center via a secure communication channel, and to store and apply those operation-specific rules during the appropriate operation.
  • specific rules of engagement and/or sets of operational parameters may be associated with specific operators, operator ranks, and/or engagement locations (e.g., country, region, etc.).
  • operators having sufficient rank and/or authorization levels may be permitted to manually override certain rules of engagement and/or operational parameters of the weapon system 200, and to apply the operator's own preferred rules/parameters in their place. Additionally or alternatively, such overrides may require outside approval, and thus upon receiving a rule/parameter override request from the operator, the weapon system may be configured to transmit a secure request for override approval to a remote command center.
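  • A speculative Python sketch of this override flow; the operator rank fields, rule objects, and command-center API are all invented for illustration:

```python
def request_override(operator, rule, command_center):
    """Apply an operator override locally only if the operator's rank permits;
    otherwise forward a secure approval request to the remote command center."""
    if operator.rank >= rule.min_override_rank:
        return rule.with_overrides(operator.preferred_parameters)
    approval = command_center.request_approval(operator.operator_id, rule.rule_id)
    if approval.granted:
        return rule.with_overrides(operator.preferred_parameters)
    return rule  # override denied: the operation-specific rule stays in force
```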
  • the target points for selected targets are computed based on a desired point of impact location on the target (e.g., an engine of a boat or vehicle, the center of mass of an individual, etc.).
  • the targeting/firing system 210 may be configured with warning shot capabilities in which the desired point of impact location is not on the target.
  • the rules of engagement enforced by the targeting/firing system 210 for a particular operation may dictate that only warning shots are to be fired at a particular selected target. Alternatively, such rules may dictate that at least one initial warning shot is to be fired at a selected target before an attempt is made to hit the target.
  • the operator controls 245 also may include a warning shot mode that can be activated by the operator, independent of the rules of engagement of the operation, to allow the operator to independently fire one or more warning shots on any selected target.
  • the firing solution may be adjusted to assure that the projectiles fired by the weapon 225 will miss the target.
  • the targeting/firing system 210 may determine the preferred location of a desired warning shot based on the type and size of the target (e.g., the number and position of warning shots for human targets may be different than for vehicle targets), the orientation and/or the direction of movement of the target (e.g., it may be desirable to fire a warning shot directly in front of the target), and so on.
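  • As an illustrative sketch only, warning-shot placement for a moving target might offset the desired point of impact along the target's direction of travel; the lead and miss-margin values below are arbitrary assumptions, not values from the disclosure:

```python
import math

def warning_shot_point(target_pos, target_vel, lead_m=10.0, miss_margin_m=5.0):
    """Place the desired point of impact ahead of a moving target along its
    direction of travel, offset far enough to assure a miss; for a stationary
    target, offset to one side instead."""
    speed = math.hypot(target_vel[0], target_vel[1])
    if speed < 1e-6:
        return (target_pos[0] + miss_margin_m, target_pos[1])
    ux, uy = target_vel[0] / speed, target_vel[1] / speed
    offset = lead_m + miss_margin_m
    return (target_pos[0] + ux * offset, target_pos[1] + uy * offset)
```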
  • Implementation of the techniques, blocks, steps and means described above may be done in various ways. For example, these techniques, blocks, steps and means may be implemented in hardware, software, or a combination thereof.
  • the processing units may be implemented within one or more application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, other electronic units designed to perform the functions described above, and/or a combination thereof.
  • embodiments may be implemented by hardware, software, scripting languages, firmware, middleware, microcode, hardware description languages, and/or any combination thereof.
  • the program code or code segments to perform the necessary tasks may be stored in a machine readable medium such as a storage medium.
  • a code segment or machine-executable instruction may represent a procedure, a function, a subprogram, a program, a routine, a subroutine, a module, a software package, a script, a class, or any combination of instructions, data structures, and/or program statements.
  • a code segment may be coupled to another code segment or a hardware circuit by passing and/or receiving information, data, arguments, parameters, and/or memory contents. Information, arguments, parameters, data, etc. may be passed, forwarded, or transmitted via any suitable means including memory sharing, message passing, token passing, network transmission, etc.
  • the methodologies may be implemented with modules (e.g., procedures, functions, and so on) that perform the functions described herein.
  • Any machine-readable medium tangibly embodying instructions may be used in implementing the methodologies described herein.
  • software codes may be stored in a memory.
  • Memory may be implemented within the processor or external to the processor.
  • the term “memory” refers to any type of long term, short term, volatile, nonvolatile, or other storage medium and is not to be limited to any particular type of memory or number of memories, or type of media upon which memory is stored.
  • the term “storage medium” may represent one or more memories for storing data, including read only memory (ROM), random access memory (RAM), magnetic RAM, core memory, magnetic disk storage mediums, optical storage mediums, flash memory devices and/or other machine readable mediums for storing information.
  • The term “machine-readable medium” includes, but is not limited to, portable or fixed storage devices, optical storage devices, and/or various other storage mediums capable of storing, containing, or carrying instruction(s) and/or data.
  • FIG. 9 provides a schematic illustration of one embodiment of a computer system 900 that can perform various steps of the methods provided by various embodiments. It should be noted that FIG. 9 is meant only to provide a generalized illustration of various components, any or all of which may be utilized as appropriate. FIG. 9 , therefore, broadly illustrates how individual system elements may be implemented in a relatively separated or relatively more integrated manner.
  • the computer system 900 is shown comprising hardware elements that can be electrically coupled via a bus 905 (or may otherwise be in communication, as appropriate).
  • the hardware elements may include one or more processors 910 , including without limitation one or more general-purpose processors and/or one or more special-purpose processors (such as digital signal processing chips, graphics acceleration processors, video decoders, and/or the like); one or more input devices 915 , which can include without limitation a mouse, a keyboard, remote control, and/or the like; and one or more output devices 920 , which can include without limitation a display device, a printer, and/or the like.
  • the computer system 900 may further include (and/or be in communication with) one or more non-transitory storage devices 925 , which can comprise, without limitation, local and/or network accessible storage, and/or can include, without limitation, a disk drive, a drive array, an optical storage device, a solid-state storage device, such as a random access memory (“RAM”), and/or a read-only memory (“ROM”), which can be programmable, flash-updateable and/or the like.
  • Such storage devices may be configured to implement any appropriate data stores, including without limitation, various file systems, database structures, and/or the like.
  • the computer system 900 might also include a communications subsystem 930, which can include without limitation a modem, a network card (wireless or wired), an infrared communication device, a wireless communication device, and/or a chipset (such as a Bluetooth™ device, an 802.11 device, a WiFi device, a WiMax device, a cellular communication device, etc.), and/or the like.
  • the communications subsystem 930 may permit data to be exchanged with a network (such as the network described below, to name one example), other computer systems, and/or any other devices described herein.
  • the computer system 900 will further comprise a working memory 935 , which can include a RAM or ROM device, as described above.
  • the computer system 900 also can comprise software elements, shown as being currently located within the working memory 935 , including an operating system 940 , device drivers, executable libraries, and/or other code, such as one or more application programs 945 , which may comprise computer programs provided by various embodiments, and/or may be designed to implement methods, and/or configure systems, provided by other embodiments, as described herein.
  • code and/or instructions can be used to configure and/or adapt a general purpose computer (or other device) to perform one or more operations in accordance with the described methods.
  • a set of these instructions and/or code might be stored on a non-transitory computer-readable storage medium, such as the non-transitory storage device(s) 925 described above.
  • the storage medium might be incorporated within a computer system, such as computer system 900 .
  • the storage medium might be separate from a computer system (e.g., a removable medium, such as a compact disc), and/or provided in an installation package, such that the storage medium can be used to program, configure, and/or adapt a general purpose computer with the instructions/code stored thereon.
  • These instructions might take the form of executable code, which is executable by the computer system 900 and/or might take the form of source and/or installable code, which, upon compilation and/or installation on the computer system 900 (e.g., using any of a variety of generally available compilers, installation programs, compression/decompression utilities, etc.), then takes the form of executable code.
  • some embodiments may employ a computer system (such as the computer system 900 ) to perform methods in accordance with various embodiments of the invention.
  • some or all of the procedures of such methods are performed by the computer system 900 in response to processor 910 executing one or more sequences of one or more instructions (which might be incorporated into the operating system 940 and/or other code, such as an application program 945 ) contained in the working memory 935 .
  • Such instructions may be read into the working memory 935 from another computer-readable medium, such as one or more of the non-transitory storage device(s) 925 .
  • execution of the sequences of instructions contained in the working memory 935 might cause the processor(s) 910 to perform one or more procedures of the methods described herein.
  • The term “machine-readable medium,” as used herein, refers to any medium that participates in providing data that causes a machine to operate in a specific fashion. These mediums may be non-transitory.
  • various computer-readable media might be involved in providing instructions/code to processor(s) 910 for execution and/or might be used to store and/or carry such instructions/code.
  • a computer-readable medium is a physical and/or tangible storage medium.
  • Such a medium may take the form of non-volatile media or volatile media.
  • Non-volatile media include, for example, optical and/or magnetic disks, such as the non-transitory storage device(s) 925 .
  • Volatile media include, without limitation, dynamic memory, such as the working memory 935 .
  • Common forms of physical and/or tangible computer-readable media include, for example, a floppy disk, a flexible disk, hard disk, magnetic tape, or any other magnetic medium, a CD-ROM, any other optical medium, any other physical medium with patterns of marks, a RAM, a PROM, an EPROM, a FLASH-EPROM, any other memory chip or cartridge, or any other medium from which a computer can read instructions and/or code.
  • Various forms of computer-readable media may be involved in carrying one or more sequences of one or more instructions to the processor(s) 910 for execution.
  • the instructions may initially be carried on a magnetic disk and/or optical disc of a remote computer.
  • a remote computer might load the instructions into its dynamic memory and send the instructions as signals over a transmission medium to be received and/or executed by the computer system 900 .
  • the communications subsystem 930 (and/or components thereof) generally will receive signals, and the bus 905 then might carry the signals (and/or the data, instructions, etc. carried by the signals) to the working memory 935 , from which the processor(s) 910 retrieves and executes the instructions.
  • the instructions received by the working memory 935 may optionally be stored on a non-transitory storage device 925 either before or after execution by the processor(s) 910 .
  • computer system 900 can be distributed across a network. For example, some processing may be performed in one location using a first processor while other processing may be performed by another processor remote from the first processor. Other components of computer system 900 may be similarly distributed. As such, computer system 900 may be interpreted as a distributed computing system that performs processing in multiple locations. In some instances, computer system 900 may be interpreted as a single computing device, such as a distinct laptop, desktop computer, or the like, depending on the context.
  • configurations may be described as a process which is depicted as a flow diagram or block diagram. Although each may describe the operations as a sequential process, many of the operations can be performed in parallel or concurrently. In addition, the order of the operations may be rearranged. A process may have additional steps not included in the figure.
  • examples of the methods may be implemented by hardware, software, firmware, middleware, microcode, hardware description languages, or any combination thereof. When implemented in software, firmware, middleware, or microcode, the program code or code segments to perform the necessary tasks may be stored in a non-transitory computer-readable medium such as a storage medium. Processors may perform the described tasks.

Abstract

Various techniques are described herein for controlling autonomous and semi-autonomous motorized weapons systems. In various embodiments, semi-autonomous motorized weapons systems may perform automated target identification, selection and prioritization techniques. Dynamic target tracking may be performed, for both primary and secondary targets, in cases of stationary and moving targets and weapon systems. A motorized weapon system then may be actuated automatically toward a firing solution target point, during which the operator-controlled firing mechanism may be enabled or disabled based on the projected point of impact of the weapon in comparison to a determined boundary area associated with the target.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • The present application is a non-provisional of and claims priority to U.S. Provisional Patent Application No. 62/581,280, filed Nov. 3, 2017, entitled “SEMI-AUTONOMOUS TARGETING OF REMOTELY OPERATED WEAPONS.” The entire contents of provisional application No. 62/581,280 are incorporated herein by reference for all purposes.
  • BACKGROUND
  • 1. Field of the Invention
  • This disclosure generally relates to autonomous and semi-autonomous motorized weapons systems. More specifically, the present disclosure relates to hardware- and software-based techniques for efficient operation of motorized weapons systems, via improvements in target identification and selection, autonomous actuation of motor and targeting systems, dynamic tracking, and trajectory measurement and assessment.
  • 2. Description of Related Art
  • Within the context of motorized weapons systems, the concept of a “kill chain” refers to the sequence of actions performed between the first detection of potential targets, and the elimination of the targets. The sequence of actions within a kill chain generally may include the following: (1) Find—identifying and locating a target, (2) Fix or Track—determining the accurate location of the target, (3) Target—time-critical targeting, including predicting where the target may pop-up, (4) Engage—firing on the target, and (5) Assess—determining whether or not the target has been hit and/or eliminated.
  • Conventional weapon systems may include various components for achieving the above steps of a kill chain, including cameras and sensors to identify targets, display screens and controls (e.g., joysticks) to allow an operator to identify targets and aim the weapon, and a variety of weapons that may be fired at the target. Such systems may include “fully autonomous” weapons systems, which are capable of targeting and firing without any intervention by a human operator, “semi-autonomous” weapons systems, which may use automated software target tracking tools but still rely on a human operator for target selection and firing commands, “supervised autonomous” weapons systems, which may be granted permission to react to threats autonomously, and/or manual weapon systems that are operated entirely by the human operator.
  • Typically, conventional weapons systems rely on an “operator centric” approach to perform the actions in the kill chain sequence. Such systems often prioritize the interface and environment provided to the human operator. First, the human operator may be put in a safe environment, and the operator's eyesight may be improved using broad spectrum and high-resolution options. The weapon may be stabilized from motion and vibration, to allow the operator to find and track the target via a joystick and cursor or similar interface. After these steps, image recognition software may be used to attempt to recognize the target that has been selected and tracked by the operator, and trajectory adjustments may be applied. Such systems and processes may result in a number of technical problems and inefficiencies, including difficulties in targeting and tracking when the operator is in a moving vehicle, difficulties in the selection and identification of targets, inefficiencies in selecting follow-on targets, and operator-based assessment and correction of weapon targeting and firing.
  • BRIEF SUMMARY
  • Techniques described herein relate to hardware- and software-based solutions for operating motorized weapons systems, including target identification and selection techniques, autonomous actuation of motor and targeting systems, dynamic tracking, and trajectory measurement and assessment techniques. Certain embodiments described herein correspond to semi-autonomous motorized weapon systems, which may include various combinations of hardware such as weapons capable of firing munitions, two-axis and/or three-axis mounts configured to support and position the weapons, motors coupled to the mounts and configured to move the mounts to specified positions to control the direction in which the weapon is aimed, and/or operator interface components such as operator controls and a target display device. In some embodiments, such a semi-autonomous motorized weapon system may be implemented with various hardware-based and software-based components configured to determine target points associated with targets at remote locations, and to determine one or more areas having boundaries surrounding the target points, such boundary areas determined based on the likelihood of the weapon hitting the target when aimed at the boundary in comparison to predetermined likelihood thresholds. Such embodiments may be further configured to engage the motor of the motorized weapon system, with instructions to move the mount from an initial position to a target position at which the weapon is aimed at the target point, and during engagement of the motor, to periodically determine, during the movement of the mount toward the target position, whether the weapon is aimed at a position within the boundary area surrounding the target point. When determining, during the movement of the mount toward the target position, that the weapon is not aimed at a position within the area surrounding the target point, the semi-autonomous motorized weapon system may disable a manual firing mechanism of the weapon system to prevent firing of the weapon by an operator, whereas when it is determined, during the movement of the mount toward the target position, that the weapon is aimed at a position within the area surrounding the target point, the semi-autonomous motorized weapon system may enable (or re-enable) the manual firing mechanism to allow firing of the weapon. Finally, the semi-autonomous motorized weapon system may be configured to receive and execute firing commands from operators, via the manual firing mechanism, thereby firing the weapon at times when the manual firing mechanism is enabled.
  • Additional techniques described herein include weapon-agnostic motorized weapon systems, including weapon-agnostic targeting/firing systems that may support various different types or models of weapons, as well as implementation of operation-specific rules of engagement that may be received and enforced by the weapon-agnostic targeting and firing systems. Further techniques described herein include minimum confidence thresholds for target selection and/or prioritization via semi-autonomous weapons systems, which may be separate determinations from target identification confidence and/or target verification confidence. Still further techniques described herein may include sensor-based real-time projectile firing assessment and automatic correction of targeting algorithms based on accuracy evaluations.
  • The various techniques described herein further include combinations of autonomous target selection, prioritization, and re-selection by targeting/firing systems within semi-autonomous motorized weapon systems, dynamic target tracking of both primary and secondary targets including target movement predictions and weapon/projectile characteristics, autonomous motor actuation to automatically orient the weapon toward the primary target before receiving any operator input, simplified user interfaces and operator controls, and enabling/disabling of the firing mechanism depending on the projected point of impact of the weapon, thereby providing increased system efficiency, increased rate of firing, improved weapon system accuracy, and reduced operator error, along with the other technical advantages described herein.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a depiction of a motorized weapon system, in accordance with one or more embodiments of the present invention.
  • FIG. 2 is a block diagram illustrating example component architecture diagram of a motorized weapon system, in accordance with one or more embodiments of the present invention.
  • FIGS. 3A-3C are illustrative drawings depicting the mounting and application of a motorized weapon system in accordance with one or more embodiments of the present invention, within different engagement environments.
  • FIG. 4 is a flowchart illustrating an example process of using a motorized weapon system to engage one or more targets, in accordance with certain embodiments of the present invention.
  • FIG. 5 is an example screen of a user interface displayed to an operator of a motorized weapon system during engagement of one or more targets, in accordance with certain embodiments of the present invention.
  • FIG. 6 is another example screen of a user interface displayed to an operator of a motorized weapon system during engagement of one or more targets, in accordance with certain embodiments of the present invention.
  • FIG. 7 is a flowchart illustrating an example process of disabling or enabling a firing mechanism of a motorized weapon system during engagement of the motor to move the weapon, in accordance with certain embodiments of the present invention.
  • FIGS. 8A and 8B are example screens of a user interface displayed to an operator of a motorized weapon system during engagement of the motor to move the weapon toward a target point, in accordance with certain embodiments of the present invention.
  • FIG. 9 is a schematic illustration of a computer system configured to perform techniques in accordance with certain embodiments of the present invention.
  • In the appended figures, similar components and/or features may have the same reference label. Further, various components of the same type may be distinguished by following the reference label by a dash and a second label that distinguishes among the similar components. If only the first reference label is used in the specification, the description is applicable to any one of the similar components having the same first reference label irrespective of the second reference label.
  • DETAILED DESCRIPTION
  • In the following description, for the purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of various embodiments of the present invention. It will be apparent, however, to one skilled in the art that embodiments of the present invention may be practiced without some of these specific details. In other instances, well-known structures and devices are shown in block diagram form.
  • The ensuing description provides exemplary embodiments only, and is not intended to limit the scope, applicability, or configuration of the disclosure. Rather, the ensuing description of the exemplary embodiments will provide those skilled in the art with an enabling description for implementing an exemplary embodiment. It should be understood that various changes may be made in the function and arrangement of elements without departing from the spirit and scope of the invention as set forth in the appended claims.
  • Specific details are given in the following description to provide a thorough understanding of the embodiments. However, it will be understood by one of ordinary skill in the art that the embodiments may be practiced without these specific details. For example, circuits, systems, networks, processes, and other components may be shown as components in block diagram form in order not to obscure the embodiments in unnecessary detail. In other instances, well-known circuits, processes, algorithms, structures, and techniques may be shown without unnecessary detail in order to avoid obscuring the embodiments.
  • Also, it is noted that individual embodiments may be described as a process which is depicted as a flowchart, a flow diagram, a data flow diagram, a structure diagram, or a block diagram. Although a flowchart may describe the operations as a sequential process, many of the operations can be performed in parallel or concurrently. In addition, the order of the operations may be re-arranged. A process is terminated when its operations are completed, but could have additional steps not included in a figure. A process may correspond to a method, a function, a procedure, a subroutine, a subprogram, etc. When a process corresponds to a function, its termination can correspond to a return of the function to the calling function or the main function.
  • The term “computer-readable medium” includes, but is not limited to, non-transitory media such as portable or fixed storage devices, optical storage devices, and various other mediums capable of storing, containing or carrying instruction(s) and/or data. A code segment or computer-executable instructions may represent a procedure, a function, a subprogram, a program, a routine, a subroutine, a module, a software package, a class, or any combination of instructions, data structures, or program statements. A code segment may be coupled to another code segment or a hardware circuit by passing and/or receiving information, data, arguments, parameters, or memory contents. Information, arguments, parameters, data, etc. may be passed, forwarded, or transmitted via any suitable means including memory sharing, message passing, token passing, network transmission, etc.
  • Furthermore, embodiments may be implemented by hardware, software, firmware, middleware, microcode, hardware description languages, or any combination thereof. When implemented in software, firmware, middleware or microcode, the program code or code segments to perform the necessary tasks may be stored in a computer-readable medium. A processor(s) may perform the necessary tasks.
  • Various techniques (e.g., methods, systems, computing devices, non-transitory computer-readable storage memory storing a plurality of instructions executable by one or more processors, etc.) are described herein for hardware- and software-based solutions for operating motorized weapons systems, including target identification and selection techniques, autonomous actuation of motor and targeting systems, dynamic tracking, and trajectory measurement and assessment techniques. Certain embodiments described herein correspond to semi-autonomous motorized weapon systems, which may include various combinations of hardware such as weapons capable of firing munitions, two-axis and/or three-axis mounts configured to support and position the weapons, motors coupled to the mounts and configured to move the mounts to specified positions to control the direction in which the weapon is aimed, and/or operator interface components such as operator controls and a target display device. In some embodiments, such a semi-autonomous motorized weapon system may be implemented with various hardware-based and software-based components configured to determine target points associated with targets at remote locations, and to determine one or more areas having boundaries surrounding the target points, such boundary areas determined based on the likelihood of the weapon hitting the target when aimed at the boundary in comparison to predetermined likelihood thresholds. Such embodiments may be further configured to engage the motor of the motorized weapon system, with instructions to move the mount from an initial position to a target position at which the weapon is aimed at the target point, and during engagement of the motor, to periodically determine, during the movement of the mount toward the target position, whether the weapon is aimed at a position within the boundary area surrounding the target point. When determining, during the movement of the mount toward the target position, that the weapon is not aimed at a position within the area surrounding the target point, the semi-autonomous motorized weapon system may disable a manual firing mechanism of the weapon system to prevent firing of the weapon by an operator, whereas when it is determined, during the movement of the mount toward the target position, that the weapon is aimed at a position within the area surrounding the target point, the semi-autonomous motorized weapon system may enable (or re-enable) the manual firing mechanism to allow firing of the weapon. Finally, the semi-autonomous motorized weapon system may be configured to receive and execute firing commands from operators, via the manual firing mechanism, thereby firing the weapon at times when the manual firing mechanism is enabled.
  • Additional techniques described herein include weapon-agnostic motorized weapon systems, including weapon-agnostic targeting/firing systems that may support various different types or models of weapons, as well as implementation of operation-specific rules of engagement that may be received and enforced by the weapon-agnostic targeting and firing systems. Further techniques described herein include minimum confidence thresholds for target selection and/or prioritization via semi-autonomous weapons systems, which may be separate determinations from target identification confidence and/or target verification confidence. Still further techniques described herein may include sensor-based real-time projectile firing assessment and automatic correction of targeting algorithms based on accuracy evaluations.
  • The various techniques described herein further include combinations of autonomous target selection, prioritization, and re-selection by targeting/firing systems within semi-autonomous motorized weapon systems, dynamic target tracking of both primary and secondary targets including target movement predictions and weapon/projectile characteristics, autonomous motor actuation to automatically orient the weapon toward the primary target before receiving any operator input, simplified user interfaces and operator controls for operating the semi-autonomous motorized weapon systems, and enabling/disabling of the firing mechanism depending on the projected point of impact of the weapon, thereby providing increased system efficiency, increased rate of firing, improved weapon system accuracy, and reduced operator error, along with the other technical advantages described herein.
  • With reference now to FIG. 1, a depiction of an illustrative motorized weapon system 100 is shown. As shown in this example, weapon system 100 may include a weapon 110 with ammunition feed 115, a gimbal mount 120, and a camera/sensor unit 125. Additionally, in this example, the weapon system 100 includes a base/housing 130, which contains and obscures additional components of the system 100, including the motor, servos, targeting system, processing and memory components, communications system, firing controls, and various other components described herein.
  • In some embodiments, weapon system 100 may be a remotely operated weapon station (ROWS), including stabilization and auto-targeting technology. The targeting system of weapon system 100 may be configured to perform rapid target selection and acquisition, and to provide increased hit probabilities. Weapon system 100 may be compatible with many different types of weapon 110 and different corresponding types of ammunition, and as discussed below, the operation of the targeting system and other components of the weapon system 100 may depend on knowledge of which type of weapon 110 and ammunition is currently in use. As discussed in more detail below, weapon system 100 may be fully integrated, with auto-targeting capabilities, and/or remote operation. Weapon system 100 also may be capable of being mounted to various different types of platforms, including tripods, buildings, ground vehicles (e.g., trucks, tanks, cars, jeeps), all-terrain vehicles (ATVs), utility task vehicles (UTVs), boats, fixed-wing aircraft, helicopters, and drones. As described in further detail below, various embodiments of weapon systems 100 may include capabilities for automatic target detection, selection, and re-selection, active stabilization, automatic ballistic solutions, target tagging, and/or continuous target tracking.
  • As noted above, weapon 110 may be any type of gun, armament, or ordnance, including without limitation, off-the-shelf firearms, large caliber rifles, machine guns, autocannons, grenade launchers, rockets, and/or directed energy weapons such as lasers, high-power microwave emitters, and other undisclosed devices. The weapon 110 may be attached to the weapon system 100 using a 2-axis or 3-axis mechanical gimbal mount 120, capable of controlling azimuth and yaw, elevation and pitch, and possibly cant and roll. A closed loop servomotor within the weapon system 100 may be configured to drive the gimbal to an identified target. A firing mechanism within the weapon system may be configured to fire the weapon 110, either electronically or by manually pulling the trigger, in response to a firing command from a human operator and/or additional firing instructions received from a targeting/firing component of the weapon system 100.
  • Camera/sensor unit 125 may include an array of various different sensors configured to collect data at the weapon system 100, and transmit the sensor/image data back to the internal software systems of the weapon system 100 (e.g., targeting system/component, firing control, ballistics engine) and/or to a display device for outputting to an operator. Cameras/sensors within the sensor unit 125 may include, for example, cameras sensitive in various spectrums such as visible and infrared (IR), for day and night visibility, as well as rangefinders (e.g., LIDAR, RADAR, ultrasonic, etc.) to determine distance to target. Additional sensors within the sensor unit 125 may include rate gyros (e.g., MEMS or fiber optic gyros), which may be used to stabilize the weapon 110 within the mount 120. Magnetometers and accelerometers also may be included within the weapon system 100, and may be used for canceling gyro drift. Accelerometers also may be used to detect and respond to vehicle accelerations (i.e., when the weapon system 100 is mounted on a vehicle), and vibrations caused by vehicle movement and/or terrain and weather. Sensors 125 also may include wind speed sensors, including hot-wire, laser/LIDAR, sonic and other types of anemometers. Additionally, as described below, a global positioning system (GPS) receiver or other positioning devices may be included within the sensor unit 125, in order to determine the weapon location, heading, and velocity to compute firing solutions, and for use in situations where external target coordinates are provided. It should also be understood that for each of the cameras and/or sensors described above and elsewhere herein, the cameras/sensors may be housed within the sensor unit 125, positioned elsewhere in the weapon system 100, installed on a structure or vehicle on which the weapon system 100 is mounted, or installed at a separate remote location and configured to transmit wireless sensor data back to the weapon system 100.
  • Referring now to FIG. 2, a block diagram is shown illustrating various components and systems, and the computing/communication architecture, within a motorized weapon system. In this example, weapon system 200 may correspond to the same weapon system 100 discussed above, and/or to other variations of weapon systems described herein. As in the example above, weapon system 200 includes a weapon 225, mount 230, motor 235, and a camera/sensor unit 240. Weapon system 200 also includes a targeting/firing system 210, described below in more detail, which may be implemented in hardware, software, or a combination of hardware and software. Additionally, weapon system 200 may include operator-facing components, including controls 245 and a display screen 250.
  • As indicated by the arrows shown in the diagram of weapon system 200, the targeting/firing system 210 may be configured to drive the motor 235 to a particular target point, and to initiate firing of the weapon 225. The camera/sensor unit 240 may collect image and sensor data, and transmit that data back to the targeting/firing system 210 for use in target detection, selection, and tracking functionality. In some cases, image and sensor data may be transmitted directly from the sensor unit 240 to the display 250 for rendering/use in an operator user interface. The targeting/firing system 210 also may transmit various targeting data to the display device 250 for presentation to the operator, and may receive from the operator firing commands and/or other control commands via the operator controls 245.
  • In some embodiments, all components of a weapon system 200 may be co-located and installed together as a single integrated system. For instance, weapon systems 200 may include turrets or platform-mounted guns which include the weapon/motor 225-235, camera/sensor unit 240, and targeting/firing system 210, as well as the operator controls 245 and display 250. However, in other embodiments, some or all of the components of a weapon system 200 may be non-integrated and located remotely from one another. For example, in some cases the weapon/motor 225-235 and a subset of the sensors/cameras 240 may be located near the potential targets, while the targeting/firing system 210 and operator interface components 245-250 may be in a distant remote location. Certain sensors 240 may be located at or near the weapon 225 (e.g., to measure distance to target, current location, weapon movement and vibration, wind and weather conditions, etc.), other sensors 240 may be positioned at or near the target and/or at other angles to the target, while still other sensors or cameras 240 may be remotely located (e.g., drone-based cameras, satellite imagery, etc.). In embodiments in which certain components of a weapon system 200 are located remotely from others, each of the components may include network transceivers and interfaces configured for secure network communication, including components for data encryption and transmission over public or private computer networks, satellite transmission systems, and/or secure short-range wireless communications, etc.
  • The targeting/firing system 210 may receive input data from various data sources, and analyze the data to identify, select, and prioritize targets, actuate the motor 235, dynamically track targets, generate firing solutions, and control firing of the weapon 225. In order to perform these functions, the targeting/firing system 210 may receive data from one or more cameras/sensor units 240, including a GPS unit 211. The sensor data may include images of targets and potential targets, distance/range data, heat or infrared data, audio data, vehicle or weapon location data, vehicle or weapon movement and vibration data, wind and weather condition data, and any other sensor data described herein. Additionally, one or more data stores may store system configuration and operation data, including a rules data store 213 and a profiles data store 214. The rules data store 213 may include, for example, target identification rules, target selection/priority rules, firing rules, and other rules of engagement, each of which may depend on the particular operation, the current location of the weapon system 200, the individual operator, etc. The profiles data store 214 may include, for example, individual user profiles with user preferences and parameters, weapon profiles, and/or ballistic profiles that may include specifications for individual weapon types and ammunition types that may be used to calculate maximum ranges and targeting solutions. Additionally, one or more communication modules 212 within the targeting/firing system 210 may be used to receive commands and other data from the current operator and/or from a separate command center. As discussed below, commands received from a command center or other higher-level authority may be used to control the target selection and rules of engagement for particular operations. Communication modules 212 also may be used to receive or retrieve sensor data from remote sensor systems, including satellite data, image data from remote cameras, target GPS data, weather data, etc. The targeting/firing system 210 may include various components (e.g., targeting component 220) configured to receive and analyze the various data to perform targeting functions, including subcomponents for target detection 221, target selection 222, target tracking 223, and firing control 215, among others.
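  • By way of illustration only, the sketch below shows one minimal way the data inputs described above might be represented in software. All class names, fields, and default values are hypothetical assumptions for this sketch and are not taken from the disclosure.

```python
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

@dataclass
class SensorFrame:
    """One snapshot of sensor data fed to the targeting component (hypothetical)."""
    image: bytes                            # camera frame (visible or IR)
    range_m: Optional[float]                # rangefinder distance, if available
    weapon_gps: Tuple[float, float]         # (lat, lon) of the weapon platform
    wind_mps: float                         # anemometer reading
    gyro_rates: Tuple[float, float, float]  # roll/pitch/yaw rates for stabilization

@dataclass
class EngagementRules:
    """Parameters of the kind the rules data store 213 might hold (invented values)."""
    min_hit_confidence: float = 0.85
    max_range_m: float = 1500.0

@dataclass
class TargetingSystem:
    rules: EngagementRules
    tracked_targets: List[object] = field(default_factory=list)

    def ingest(self, frame: SensorFrame) -> None:
        # The detection (221), selection (222), and tracking (223) subcomponents
        # would consume each frame here; their logic is omitted in this sketch.
        pass
```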
  • The operator controls 245 and display screen 250 may correspond to the input/output interface between the human operator and the weapon system 200. As noted above, certain weapon systems 200 may be fully autonomous, or may operate in a supervised autonomous mode, in which case the operator controls 245 and display screen 250 need not be present. Additionally, the operator controls 245 and display screen 250 may be remotely located in some embodiments, allowing the operators to control the weapon system 200 from a separate location that may be a few feet away or across the globe. The display device 250 may receive and output various user interface views to the operator, including the views described below for identifying and highlighting targets, obscuring non-targets, rendering target points, weapon trajectories, and confidence ranges, and providing various additional sensor readings to the operator. The operator controls 245 may allow the operator to identify, select, and mark targets, and to fire the weapon 225. As shown in this example, the operator controls 245 may include a fire button 246 (to fire the weapon 225), and a "next target" button 247 to instruct the targeting component 220 to re-select the next priority target. In certain embodiments, the operator controls might include only these two buttons, and need not include a joystick for aiming, tracking, etc.
  • Referring briefly to FIGS. 3A-3C, these drawings illustrate the operation of motorized weapon systems on three different vehicle-based mounting platforms. In the example of FIG. 3A, a motorized weapon system is mounted on a stationary or moving vehicle 306. The remote weapon system 304 holds the firearm 305, and various sensors may be installed in the frame of reference of the firearm 305, in the frame of reference of the gimballed remote control, and/or in the frame of reference of the vehicle 306. In these examples, the field of view 307 is represented by dotted lines. A crosshair 301 shows the current projected point of impact. In each of FIGS. 3A-3C, the crosshair 301 is not yet on target, and it may be assumed that the motor is engaged and driving the firearm to the target position, or that the operator has not yet confirmed the target. The targeting system in these examples shows a primary target 302 identified by a double-dashed box, and a secondary target, which has been identified but not yet targeted, shown within a single dashed box 303. FIG. 3B shows a similar set of components, but in this case the scenario is a maritime use with an armed boat 306 as the vehicle. FIG. 3C shows yet another scenario in which the vehicle 306 is a helicopter. FIG. 3C also illustrates that the system may identify multiple secondary targets 303 within the field of view 307.
  • Referring now to FIG. 4, a flow diagram is shown illustrating a process by which a motorized weapon system may identify, target, engage, and fire on one or more targets. As described below, the steps in this process may be performed by one or more components in the example motorized weapon system 200 discussed above, such as targeting/firing system 210 and the subsystems thereof, in conjunction with the weapon/mount/motor components 225-235, one or more sensor units 240, operator interface components 245-250, and/or various remote and external systems. However, it should be understood that process steps described herein, such as target identification and prioritization, dynamic target tracking, semi-autonomous target selection, motor actuation and firing control/locking capabilities, and the like, need not be limited to the specific systems and hardware implementations described above in FIGS. 1-3, but may be performed within other motorized weapon systems and environments comprising other combinations of the hardware and software components described herein.
  • In step 401, the components of the motorized weapon system 200 may identify and verify one or more targets, using sensor units 240 and/or additional data sources. In some embodiments, the identification and/or verification of targets may be performed fully autonomously by the system 200. For example, image data from cameras and sensor data from other sensors 240 (e.g., range to target data, heat data, audio, etc.) may be used to identify one or more targets within the range and proximity of the weapon system 200. In some cases, data from additional sources may be used as well, including imagery or sensor data from remote sensor or imaging systems (e.g., other weapons systems 200, fixed cameras, drones, satellites, etc.). For example, if sensor unit 240 does not include a rangefinder and/or if exact range to target data is not available, the targeting/firing system 210 may be configured to calculate approximate range data using passive ranging techniques. For example, heights of known objects (or presumed heights) may be used to calculate the distance of those objects from the weapon system 200. Additional sources of target data also may be received via communication modules 212, which may include the GPS coordinates of targets, or bearing to targets, received from a command center. Such image data and other sensor data received from additional data sources may be used by the targeting/firing system 210 to triangulate or confirm a target's location, or verify the identity of a target, etc.
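  • As a concrete illustration of the passive ranging idea mentioned above, the toy sketch below estimates range from a presumed object height and the angle the object subtends in a camera image. The function name, parameters, and example values are assumptions for illustration, not the disclosed implementation.

```python
import math

def passive_range_m(known_height_m: float,
                    object_pixels: float,
                    image_height_pixels: int,
                    vertical_fov_deg: float) -> float:
    """Approximate distance to an object of known (or presumed) real-world height."""
    # Angle subtended by the object, assuming a linear pixel-to-angle mapping.
    subtended_rad = math.radians(vertical_fov_deg) * (object_pixels / image_height_pixels)
    return known_height_m / math.tan(subtended_rad)

# Example: a presumed 2.5 m tall truck spanning 60 px of a 1080 px frame,
# viewed through a 20-degree vertical field of view.
print(round(passive_range_m(2.5, 60, 1080, 20.0), 1))  # ~128.9 m
```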
  • As used herein, target identification and target verification refer to related but separate techniques. Target identification (or target detection) refers to the analysis of camera images, sensor data, etc., to detect objects and identify the detected objects as potential targets for the weapon system 200 (e.g., vehicles, structures, weapons, individuals, etc.), rather than as generally non-target objects such as rocks, trees, hills, shadows, and the like. Target verification (or target confirmation) refers to additional analyses of the same images/sensor data, and/or additional sources of images/sensor data, to determine whether or not the identified potential target should be selected for targeting by the weapon system 200. Target verification techniques may be based on the configuration of the system, the priorities of the particular mission, etc. For example, target verification techniques for vehicles may include identifying the size of a vehicle target (e.g., based on image analysis, target range, heat signatures from engines, etc.), the vehicle type (e.g., based on image analysis, and comparisons to a database 214 of target/non-target images), the presence of weapons on a target or proximate to a target, etc. For example, the size, shape, color, movement, audio, and heat signatures of a vehicle may be analyzed to determine whether that vehicle is a drone, helicopter, aircraft, boat, tank, truck, jeep, or car, whether the target is a military or civilian vehicle, the number of individuals and/or weapons on the vehicle, and the like, all of which may be used by a rules database 213 to determine whether the vehicle is a target or a non-target. Target verification also may include identifying particular insignia on targets, and for human targets, facial recognition and/or biometric recognition to confirm the identity of the target.
  • In some cases, both target identification and target verification in step 401 may be performed fully autonomously by the weapon system 200, using the techniques described above. In other cases, target identification and/or verification may include semi-autonomous or manual steps. For example, the rules of engagement for particular operations may require that each target be visually confirmed by a human operator. Such visual confirmation may be performed by the operator, as described in steps 406-407 below. Additionally or alternatively, the visual confirmation may be received from a different user, such as a commanding officer at a remote command center or other authorized user. In such cases, the weapon system 200 may be configured to transmit imagery and other sensor data to one or more remote locations, and then to receive the instructions identifying the potential target as a selected target or a non-target, from the remote authorized user/command center via a communication module 212. These remote visual confirmation techniques may be entirely transparent with respect to the operator of the weapon system 200 in some cases, that is, if a target is not selected/confirmed by a remote authorized user then that target might not ever be rendered or selected via the operator display device and/or might not be selectable by the operator during steps 406-407.
  • As noted above, both target identification and target selection in step 401 may be based on sets of rules received via a rules database 213 or other sources. Target selection rules may be based on target type (e.g., types of vehicles, individuals (if any), and structures, etc.), target size, target distance, the presence and types of weapons on a target, the uniform/insignia on a target, and the like. Additional rules may relate to the probability that the target has been accurately identified (e.g., level of confidence of facial recognition, vehicle type identification, insignia recognition, etc.), the probability that the weapon system 200 will be able to hit the selected target (e.g., based on target distance, target movement, weapon and ammunition type, wind and weather conditions, etc.), and/or the presence of potential collateral damage that may occur if the target is fired upon (e.g., based on detection of friendly and non-targets in the proximity of the identified target). Different sets of rules may be applied for different operators, different weapons 225 and ammunition types, different times, and/or different physical locations for the engagement. For instance, while one set of target identification, selection, and prioritization rules may be selected and applied by the targeting/firing system 210 for an engagement with a particular operator, at a particular date and time, using a particular weapon/ammunition type, in a particular country/region of the engagement, having particular lighting or weather conditions, and so on, an entirely different set of target identification, selection, and prioritization rules may be selected and applied by the targeting/firing system 210 if one or more of these variables (e.g., operator, time, weapon or ammunition type, engagement location or environmental conditions, etc.) changes.
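  • To illustrate how the applied rule set might change as these variables change, the hedged sketch below looks up the most specific rule set matching an engagement context, falling back to broader defaults. The lookup keys and fallback order are invented for illustration.

```python
def select_rule_set(rules_db: dict, context: dict) -> dict:
    """Pick the most specific rule set matching the engagement context."""
    # Try (operator, region, weapon) first, then progressively broader defaults.
    for key in [(context["operator"], context["region"], context["weapon"]),
                (None, context["region"], context["weapon"]),
                (None, None, context["weapon"]),
                (None, None, None)]:
        if key in rules_db:
            return rules_db[key]
    raise LookupError("no applicable rules of engagement found")

# Example usage with an invented rules database:
rules_db = {
    (None, None, None): {"min_hit_confidence": 0.90},
    (None, "region_a", "weapon_x"): {"min_hit_confidence": 0.80},
}
print(select_rule_set(rules_db, {"operator": "op1", "region": "region_a", "weapon": "weapon_x"}))
```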
  • In step 402, for scenarios in which multiple targets have been identified and selected in step 401, the targeting/firing system 210 of the motorized weapon system 200 may be configured to prioritize the multiple targets, thereby determining a firing order. As with the techniques for target identification and selection described above, target prioritization techniques similarly may be based on imagery and sensor data, as well as on sets of operational rules that may apply to operators, weapons, locations, etc. Examples of target prioritization rules may include, without limitation, rules that prioritize vehicles over human targets, certain types of vehicles over other types of vehicles, armored vehicles over non-armored vehicles, armed targets over non-armed targets, uniformed/insignia targets over non-uniformed or non-insignia targets, close targets over far targets, advancing targets over stationary or retreating targets, higher confidence targets (i.e., higher probability of the weapon being able to hit the target) over lower confidence targets, targets firing weapons over targets not firing weapons, and/or any combination of these criteria. In some examples, the targeting/firing system 210 may evaluate the current target distance and trajectory of all advancing and armed targets (e.g., missiles, drones, ground vehicles, and individuals, etc.), in order to prioritize the targets in the order in which they would first reach the current position (or future position) of the weapon system 200. These target prioritization rules also may include rules determining how particular types of targets may be targeted. For example, such rules may include the desired point of impact for a particular target type (e.g., the engine of a boat, the center of mass of an individual, etc.).
  • Additionally, different sets of rules or algorithms may be applied for prioritizing targets, depending on the current operator, current location, current date/time, and/or based on predefined operation-specific rules of engagement. Further, rules or algorithms for prioritization may be based on or adjusted in view of current conditions, such as the current amount of ammunition of the weapon system 200 (e.g., lower ammunition circumstances may cause prioritization of most valuable/important targets first), the current wind or weather conditions (e.g., in which closer and/or higher confidence targets may be prioritized), or based on nearby friendly or non-hostile targets (e.g., in which closer and/or higher confidence targets may be prioritized). Additionally, certain prioritizing algorithms may adjust the priorities of a set of targets to reduce and/or minimize the lag time between successive firings of the weapon, for instance, by prioritizing a set of nearby targets successively in the priority rank order, in order to reduce the firing latency time required to drive the weapon 225 through the sequence of targets.
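  • One simple way to realize prioritization rules like those above is a composite sort key, as in the sketch below. The Target fields and the particular rule ordering are illustrative assumptions, not the disclosed algorithm.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Target:
    ident: str
    is_vehicle: bool
    is_armed: bool
    is_advancing: bool
    range_m: float
    hit_confidence: float  # estimated probability that the weapon can hit it

def prioritize(targets: List[Target]) -> List[Target]:
    # Python sorts ascending and False < True, so "higher priority" attributes
    # are negated: armed before unarmed, advancing before stationary, vehicles
    # before individuals, then higher confidence and shorter range first.
    return sorted(targets, key=lambda t: (not t.is_armed,
                                          not t.is_advancing,
                                          not t.is_vehicle,
                                          -t.hit_confidence,
                                          t.range_m))
```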
  • In various embodiments, operators may be permitted to switch on-the-fly between different rules or algorithms for target selection and prioritization. Such switching capabilities may be based on the rank and/or authorization level of the operator, and in some cases may require that a request for approval be transmitted from the weapon system 200 to a higher-level user at a remote command center.
  • Referring briefly to FIG. 5, a display screen is shown displaying an example user interface 500 that may be generated by a motorized weapon system 200 during engagement of a set of targets. In this example, a plurality of targets have been identified and selected within the range and proximity of the weapon system 200. The targets have been prioritized to select a primary target 501, several secondary targets 502, and several non-targets 503 (e.g., friendly or non-hostile vehicles or individuals). In this example, the primary target 501 is indicated with a double dotted line, the secondary targets 502 are indicated with a single dotted line, and the non-targets have no lines. It should be understood that different types of user interface indicators may be used in other embodiments, such as a green border (or other color) for the primary target 501, and a different color for secondary targets 502. In some examples, the secondary targets 502 might not be indicated at all on the user interface 500, until a secondary target 502 becomes the primary target 501. In other examples, only N of the secondary targets 502 might be identified on user interface 500, such as only the next highest priority target 502, or the two next highest priority targets, etc. Additionally, non-targets 503 may be entirely obscured or blocked out, so as not to distract the operator. Crosshairs 505 are also displayed in this example, representing the point at which the weapon 225 of the weapon system 200 is currently aimed.
  • Finally, example user interface 500 includes two operator controls: a fire button 510 to allow the user to fire the weapon 225, and a next button 515 to allow the user to select the next target in the priority list. In this example, fire button 510 is shaded, indicating that the weapon 225 cannot currently be fired. As described below in more detail, this may represent a feature in which the operator's firing control mechanism 246 is disabled whenever the weapon 225 is not currently aimed at a selected target. However, it will be noted that the next button 515 is enabled in this example, indicating that the next mechanism 247, which allows the operator to change the primary target 501 to the next highest priority target 502 in the priority list, may be enabled even when the crosshairs 505 are not yet positioned on the primary target 501.
  • The kill chain sequence may continue by performing the functionality of steps 403-410 in a continuous loop for each of the targets selected in step 401, and in the priority order of the target prioritization performed in step 402. Therefore, the first iteration of steps 403-410 may be performed for the highest priority target, the second iteration of steps 403-410 may be performed for the second highest priority target, and so on.
  • In step 404, for the current highest priority target in the prioritization list, the targeting/firing system 210 may perform a dynamic tracking technique to determine a firing solution for that target. A firing solution refers to a precise firing position for the weapon (e.g., an azimuth/horizontal angle and altitude/elevation angle) and a precise firing time calculated by the targeting/firing system 210 to hit the primary target. For stationary targets, target tracking need not be performed, and the firing solution may be computed based on a number of factors, including the target distance and target bearing from the weapon 225, the muzzle velocity of the weapon 225, the aerodynamic drag of the projectile/ammunition to be fired, the wind and weather conditions, and gravity (any one of which may vary based on the current conditions).
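  • For a stationary target, a worked toy example of such a computation is shown below. It assumes ideal projectile motion on level terrain (no drag or wind); a real ballistics engine would also model the drag, wind, and weather factors listed above.

```python
import math

def elevation_angle_rad(range_m: float, muzzle_velocity_mps: float,
                        g: float = 9.81) -> float:
    """Barrel elevation needed to land an ideal (drag-free) projectile at the given range."""
    # Level-ground range equation R = v^2 * sin(2*theta) / g, solved for theta.
    x = g * range_m / muzzle_velocity_mps ** 2
    if x > 1.0:
        raise ValueError("target beyond maximum range for this muzzle velocity")
    return 0.5 * math.asin(x)

# Example: an 800 m target with an 850 m/s muzzle velocity needs ~0.31 degrees
# of elevation under these idealized assumptions.
print(math.degrees(elevation_angle_rad(800.0, 850.0)))
```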
  • When the target is moving and/or anticipated to be moving, dynamic target tracking may be required to generate a firing solution, introducing additional variables which may increase the complexity and uncertainty of the firing solution calculation. Initially, dynamic target tracking may involve calculating the anticipated direction and velocity of the target. In some embodiments, the targeting/firing system 210 may assume that the primary target will continue along its current course with the same velocity and direction. If the target is currently moving along a curved path, and/or is currently accelerating or decelerating, then the targeting/firing system 210 may assume the same curved path and/or the same acceleration/deceleration pattern, and may extrapolate out based on those variables. Further, in some embodiments, the targeting/firing system 210 may anticipate future changes in course or speed, based on factors such as upcoming obstructions in the target's path, curves in roads, previous flight patterns, etc.
  • In addition to dynamically tracking the target in order to anticipate the future position of the target, the determination of a firing solution for a moving target also may take into account the anticipated time to drive the motor 235 so that the weapon is positioned at the correct firing point, and the anticipated amount of time between the firing command and when the projectile/ammunition will reach the target. The time to drive the motor 235 may be calculated based on the distance the gun is to be driven, the speed of the motor and/or the weight of the weapon 225. The amount of time between receiving a firing command and when the projectile/ammunition will reach the target may be based on the muzzle velocity of the weapon 225, the aerodynamic drag of the projectile/ammunition to be fired, the wind and weather conditions, etc. Additionally, in some cases, an anticipated delay for operator reaction time (e.g., 0.5 seconds, 1 second) also may be included in the firing solution calculation.
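  • Putting these time components together, one hedged sketch of a lead calculation for a constant-velocity target is shown below, iterating until the predicted intercept point is self-consistent. The weapon is assumed to sit at the origin, drag is ignored, and the motor-drive and operator-delay terms are simple stand-in constants.

```python
def firing_solution(target_pos, target_vel, muzzle_velocity_mps,
                    motor_drive_s=0.5, operator_delay_s=0.5, iterations=10):
    """Predict a 2-D intercept point (x, y) for a constant-velocity target."""
    px, py = target_pos
    vx, vy = target_vel
    aim_x, aim_y = px, py
    for _ in range(iterations):
        # Total delay: motor repositioning + operator reaction + time of flight
        # to the current aim point (drag ignored in this toy model).
        distance = (aim_x ** 2 + aim_y ** 2) ** 0.5
        t = motor_drive_s + operator_delay_s + distance / muzzle_velocity_mps
        # Where the target will be after that total delay.
        aim_x, aim_y = px + vx * t, py + vy * t
    return aim_x, aim_y

# Example: target 600 m east, moving 15 m/s north, 850 m/s muzzle velocity.
print(firing_solution((600.0, 0.0), (0.0, 15.0), 850.0))  # roughly (600, 25.6)
```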
  • Referring briefly to FIG. 6, another example user interface 600 is shown that may be generated by a motorized weapon system 200 during engagement of one or more targets. In this example, only a single primary target 601 is shown, and the targeting/firing system 210 has assessed that the target 601 is moving toward the lower-right direction of the interface 600. Based on the factors discussed above, namely (a) the anticipated movement of the target 601, (b) the time required to engage the motor 235 and drive the weapon to the firing point, and (c) the time for the projectile/ammunition to be fired and reach the target, the targeting/firing system 210 may calculate the firing solution. In this example, the crosshairs 605 represent the point at which the weapon 225 is currently aimed, the point 606 represents the desired point of impact on the target 601, and point 607 represents the firing solution determined by the targeting/firing system 210. As shown in user interface 600, the motor 235 is currently re-positioning the weapon toward the firing solution point 607, and the firing solution computation has taken into account the time to reposition the weapon 225 and the projectile time-to-target. Potentially, the firing solution computation also may take into account a short time delay to fire the weapon, and/or an anticipated operator decision time delay.
  • Further, example interface 600 also includes three operator controls: a fire button 610, a next button 615, and a safe button 620. As discussed above, the fire button 610 allows the operator to fire the weapon 225, but in some cases might be enabled only after the weapon 225 has reached the firing solution point 607. The next button 615 allows the operator not to fire the weapon 225 at the primary target 601, but instead to re-select the next highest priority target in the priority list. In this example, the primary target 601 may be moved to the back of the priority list or elsewhere in the priority list, based on the operator's selection of the next control 615. Finally, the safe button 620 allows the operator to mark the currently selected primary target 601 as a friendly or non-target object, thereby removing it from the set of selected targets determined in step 401 and the priority list of step 402. Thus, after an operator has marked a target using the safe mechanism 620, it may not be selected again by the targeting/firing system 210, at least during the current engagement by the current weapon system 200. In some embodiments, the configuration settings of the targeting/firing system 210 may determine that a target marked as safe by an operator during one engagement might thereafter be excluded from target selection/prioritization in future engagements. Additionally or alternatively, weapon system 200 may transmit data identifying any targets marked as safe to other weapon systems 200 in the same general location, so that those other weapon systems 200 may automatically remove the target marked as safe from their target selection/prioritization lists as well.
  • Although step 404 was described above as performed for only a single target (i.e., the current highest priority target), in some embodiments, the targeting/firing system 210 may continuously perform dynamic tracking for all targets selected/prioritized in steps 401-402. In such cases, by performing dynamic tracking on the selected secondary target(s) before the completion of the firing sequence 403-410 for the primary target, the targeting/firing system 210 may more quickly and efficiently determine the firing solution for the next primary target as soon as the firing sequence 403-410 is completed for the first primary target. Additionally, while dynamically tracking a plurality of secondary targets, the targeting/firing system 210 may potentially re-order the prioritization sequence determined in step 402, for example, based on movement of the secondary targets and/or based on newly received data about one or more of the secondary targets (e.g., improved verification information, additional threat information, etc.).
  • In step 405, the targeting/firing system 210 may engage the motor 235 to drive the orientation of the weapon 225 toward the firing solution determined for the primary target in step 404. Thus, referring again to FIG. 6, the motor 235 may be engaged to aim the weapon 225 from its currently aimed position 605 to the determined firing solution point 607. It may be noted from this example that (a) the weapon 225 may be driven not toward the current position point of the target 606, but instead to the future position point 607, and (b) the motor 235 may be engaged and the weapon 225 may be driven to this point by the targeting/firing system 210 in a fully autonomous manner, before any action has been taken by the operator to view, select, mark, or engage this target.
  • In step 406, the targeting/firing system 210 may generate and transmit a user interface to be rendered for the operator via one or more display devices 250. As discussed above, the human operator may be located at the weapon system 200 or remote to the weapon system 200, in which case the user interface may be transmitted via the communication module 212 over one or more secure computer networks, wireless networks, satellite networks, etc. In various embodiments, the user interface provided in step 406 may correspond to user interfaces 500 and/or 600 discussed above, although several variations may be implemented in different embodiments. For instance, as noted above, the primary target 501 may be marked by a particular scheme that is different from the secondary targets and from non-targets. In some cases, the user interface may automatically zoom in on the primary target (as in screen 600) to allow the operator the best possible visual of the target. Additionally or alternatively, secondary targets and/or non-targets may be blocked out, hidden, or otherwise obscured to prevent confusion or distraction of the operator. Further, in different embodiments, each of the various different target points discussed above (e.g., crosshairs 605 representing the current weapon aiming point, the current target position point 606, and/or the firing solution target point 607) may or may not be rendered within the user interface, and/or may be shown in different colors, using different graphics and icons, etc. Finally, the user interface generated and rendered in step 406 may include additional components such as side menus, overlays, and the like, to convey any relevant sensor information about the target or the firing environment. Examples of such sensor information that may be included in the operator user interface include the target type, the verified target name/identifier (if known) and the confidence level of the verification, distance to target, current wind and weather conditions, current status of the weapon 225 and ammunition supply, number of other secondary targets, etc.
  • In step 407, the targeting/firing system 210 may receive engagement instructions from the operator, via operator controls 245. As illustrated in FIG. 5, in some embodiments, the operator controls might include only two buttons: a fire button and a next button. Or, as illustrated in FIG. 6, the operator controls might include only three buttons: a fire button, a next button, and a safe button. Although any number of different/additional operator controls may be included in other embodiments (e.g., mouse/joystick for aiming, manual override, target selection controls, etc.), there are certain technical advantages associated with a limited interface such as the two-button or three-button interfaces shown in 500-600, including simplification of the operator interface, reduction of real-time operator errors, increased speed to weapon firing, etc.
  • Additionally, as noted above during the discussion of dynamic target tracking, there may be a time delay between steps 406 and 407, for target analysis, evaluation, and decision-making by the operator. During this time delay, the dynamic tracking may continue for the primary target as well as the secondary targets selected by the targeting/firing system 210. Thus, while the operator deliberates on whether or not to fire on a target between steps 406 and 407, for moving targets and/or under other changing circumstances (e.g., a detected change in the wind), the firing solution may be updated during this time delay and the motor 235 may be continuously engaged so that the weapon 225 is continuously aimed at the most recent firing solution target point. Additionally, for excessive delays or deliberations between steps 406 and 407, the target identification, selection, and prioritization techniques discussed above in steps 401 and 402 may be updated, automatically and entirely transparently to the operator, to re-select and re-prioritize the targets based on new imagery, sensor data, and other relevant data received during the time delay between steps 406-407.
  • After receiving the firing/engagement instructions from the operator in step 407, the targeting/firing system 210 may perform the received instructions in steps 408-410. In this example, similar to that shown in FIG. 6, there are only three possible operator instructions with respect to the primary target shown in the user interface: fire on the target (step 408), do not fire on the target and proceed to the next target (step 409), and do not fire on the target and mark the target as a non-target (step 410). As discussed above, the fire command (408) is an operator instruction to fire the weapon 225, and in some cases might be enabled only after the weapon 225 has reached the firing solution target point. When the operator selects the fire button 246 (or other fire command) in step 408, the targeting/firing system 210 may initiate firing of the weapon 225, and then return to perform steps 403-410 for the next highest priority target. Additionally, in some embodiments, the targeting/firing system 210 may be configured to evaluate the accuracy of the projectile fired in step 408, and may perform a real-time automatic correction in the targeting algorithm based on the accuracy evaluation. For example, upon firing a shot in step 408, the targeting/firing system 210 may be configured to activate one or more cameras or sensors from sensor units 240 (which may be local or remote), to detect the landing time and location of the projectile. Additional sensors such as audio sensors, heat sensors, etc., also may be used to determine where the projectile hit/landed. The projectile landing/hit data may be compared to the firing solution/target point data that was determined by the targeting/firing system 210 prior to firing the projectile. If the shot was off target by an amount greater than a predetermined accuracy threshold, then the targeting/firing system 210 may be configured to adjust its targeting algorithm in real-time, so that the updated algorithm may be used in the next iteration of steps 403-410. Additionally, if the shot was off target by a sufficient amount that the target was missed, then the targeting/firing system 210 may be further configured to re-insert the previously fired upon target back into the priority list of selected targets.
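  • A minimal sketch of this accuracy-feedback idea follows, comparing each observed impact against the computed firing solution and folding a fraction of any large miss back into a running aim correction. The threshold, gain, and class structure are invented for illustration.

```python
class AimCorrector:
    """Toy real-time correction of a targeting bias from observed shot impacts."""

    def __init__(self, accuracy_threshold_m=1.0, gain=0.5):
        self.bias = (0.0, 0.0)                # running aim correction (x, y), meters
        self.accuracy_threshold_m = accuracy_threshold_m
        self.gain = gain                      # fraction of each miss folded into the bias

    def record_shot(self, predicted, observed):
        """Return True if the miss was large enough to re-queue the target."""
        ex, ey = observed[0] - predicted[0], observed[1] - predicted[1]
        miss = (ex ** 2 + ey ** 2) ** 0.5
        if miss > self.accuracy_threshold_m:
            bx, by = self.bias
            self.bias = (bx - self.gain * ex, by - self.gain * ey)
            return True                       # caller may re-insert target into the priority list
        return False
```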
  • The next command (step 409) is an operator instruction not to fire the weapon 225 at the target, but to retain the target within the set of selected targets/target priority list, and then to re-select the next highest priority target in the priority list. In various examples, a next command in step 409 may cause the target to be placed at the back of the priority list of selected targets, or may cause the target to be placed immediately after the next highest priority target in the priority list. Finally, a safe command (step 410) is an operator instruction to mark the target as a friendly or non-target object, thereby removing it from the set of selected targets and the target priority list. Thus, after step 410, the target may not be selected again by the targeting/firing system 210, during at least the current engagement by the current weapon system 200. As noted above, in some embodiments, a target marked as safe during step 410 during an engagement at one weapon system 200 also might be excluded from target selection in future engagements of the weapon system 200, and/or during current and future engagements at different weapon systems 200.
  • Thus, the various techniques discussed above with reference to FIG. 4, including without limitation: (a) autonomous target selection, prioritization, and re-selection by the targeting/firing system 210, (b) dynamic target tracking of both the primary target and secondary targets that takes into account target movement, weapon/projectile characteristics, etc., (c) autonomous actuation of the motor to automatically orient the weapon toward the primary target before receiving any operator input, (d) a simplified user interface and operator controls, and (e) enabling/disabling of the firing mechanism depending on the projected point of impact of the weapon, alone and in combination, provide increased system efficiency, increased rate of firing, improved weapon system accuracy, and reduced operator error, along with the other technical advantages described herein.
  • As mentioned above, certain aspects of the present disclosure relate to techniques for disabling and re-enabling an operator firing control (e.g., 246), during the period of time when the motor 235 of a motorized weapon system 200 is engaged and the weapon 225 is being positioned and oriented toward a determined target point for firing. The process of engaging the motor 235 of the weapon system 200 to position the weapon 225 to fire on a particular target point may take anywhere from a fraction of a second to several seconds, depending on factors such as the motor size and speed, gun size and weight, angular distance to be traveled, etc. During the time period when the motor 235 is engaged in positioning the weapon 225, the projected point of impact of a projectile fired from the weapon 225 may become closer and closer to the target point, and similarly, the likelihood of hitting the target may increase continuously until a maximum likelihood is reached when the projected point of impact of the weapon 225 (e.g., marked by crosshairs 505, 605, etc.) is directly on the determined firing solution target point. Because many unknown variables may exist during the weapon firing process (e.g., exact target distance and bearing, exact muzzle velocity and aerodynamic drag of the projectile, future target movement, exact wind and air pressure conditions, exact weapon vibration, and so on), the probability of hitting the target might never be 100%. However, when the likelihood of hitting the target is determined to be sufficiently high, e.g., above a predetermined likelihood threshold, then the targeting/firing system 210 may be configured to enable firing of the weapon 225 (and/or automatically fire the weapon 225).
  • Accordingly, in some embodiments, the targeting/firing system 210 may be configured to determine if/when the predetermined likelihood threshold for hitting the target is reached during the time period when the motor 235 is engaged in positioning the weapon 225, but before the crosshairs 505 are directly on the target (i.e., before the projected point of impact of the weapon 225 is directly on the determined firing solution target point). In such embodiments, the targeting/firing system 210 may be configured to disable the operator firing mechanism 246 when the current likelihood of hitting the target is below the predetermined likelihood threshold, based on the position/orientation of the weapon 225 and other factors. The operator firing mechanism 246 then may be re-enabled in response to the targeting/firing system 210 determining that the current likelihood of hitting the target is above the predetermined likelihood threshold. These aspects are described below in more detail with reference to FIGS. 7-8.
  • Referring now to FIG. 7, a flow diagram is shown illustrating a process of disabling and/or re-enabling the firing mechanism of a motorized weapon system while the motor is engaged to move the weapon to a target point. As described below, the steps in this process may be performed by one or more components in the example motorized weapon system 200 discussed above, such as targeting/firing system 210 and the subsystems thereof, in conjunction with the weapon/mount/motor components 225-235, one or more sensor units 240, operator interface components 245-250, and/or various remote and external systems. However, it should be understood that process steps described herein, such as determination of likelihood thresholds for hitting targets, and corresponding boundary areas for motorized weapons systems, need not be limited to the specific systems and hardware implementations described above in FIGS. 1-3, but may be performed within other motorized weapon systems and environments comprising other combinations of the hardware and software components described herein.
  • In step 701, a motorized weapon system 200 has identified and selected a particular target, and determines a firing solution and/or target point for the selected target. Thus, step 701 may be similar or identical to step 404 discussed above. As noted above, one or both of the target and the weapon system 200 may potentially be moving during this process. When both the target and the weapon 225 are stationary, target tracking need not be performed, and the firing solution target point may be computed based on factors including the target distance, target bearing from the weapon 225, muzzle velocity of the weapon 225, aerodynamic drag of the projectile/ammunition to be fired, the wind and weather conditions, and gravity (any one of which may vary based on the current conditions). However, when one or both of the selected target and the weapon 225 are moving and/or are anticipated to be moving, dynamic target tracking may be required to generate a firing solution, and additional variables may increase the complexity and uncertainty of the firing solution calculation. For example, dynamic target tracking may be used to determine the current velocity and direction of travel of both the weapon system 200 and the target, and that data may be used to calculate the anticipated velocity and direction of travel of both in the near future. In some cases, the targeting/firing system 210 may assume that both the weapon system 200 and the target will continue along their current course with the same velocity and direction, and if either is currently moving along a curved path and/or is currently accelerating/decelerating, then the targeting/firing system 210 may assume the same curved path and/or the same acceleration/deceleration in the near future. As noted above, when performing dynamic tracking on a moving target, the determination of a firing solution (e.g., predicted future coordinates at a future firing time) also may take into account the anticipated time to engage the motor 235 to position and orient the weapon at the correct firing point, as well as the anticipated time lag for the fired projectile to reach the target. Additionally, in some cases, the targeting/firing system 210 may build in an anticipated delay for operator reaction time (e.g., 0.5 seconds, 1 second) which may be included in the firing solution calculations for moving targets.
  • In step 702, the targeting/firing system 210 of the motorized weapon system 200 may determine a boundary area surrounding the target point determined in step 701. In some examples, the boundary area may be referred to as a “confidence lock” boundary, because as discussed below, the firing mechanism may be disabled when the projected point of impact of the weapon is outside of this area. From the perspective of the weapon system 200, the boundary area may be a circle or other two-dimensional closed shape surrounding the target point. A simple example of a circular boundary area 807 is shown in FIGS. 8A-8B, discussed in more detail below. The boundaries of the area may correspond to a predetermined likelihood threshold of hitting the target and need not be any particular shape. That is, when the projected point of impact of the weapon 225 is directly on any point of the boundary of the area, the likelihood of the weapon 225 hitting the target may be calculated as a probability P, which may be the same for every point on the boundary of the area and is also the same as a predetermined likelihood threshold set by the targeting/firing system 210. Thus, for any shot taken when the weapon crosshairs are outside of the boundary area, the likelihood of hitting the target is less than P, and for any shot taken when the weapon crosshairs are inside of the boundary area, the likelihood of hitting the target is greater than P.
  • In some embodiments, the boundary area may be circular, as shown in FIGS. 8A-8B. Circular boundaries may generally apply when the determined probability P is the probability of hitting the target point. However, if the determined probability P is the probability of hitting any point on the target, then the boundary area may be target-shaped (e.g., a larger vehicle-shaped boundary surrounding the target vehicle, a larger person-shaped boundary surrounding the target person, etc.). When either the target or the weapon system 200 is currently moving, the boundary area may assume a more elongated shape in the direction of the movement, to account for the additional targeting uncertainties caused by the movement of the weapon system 200 or target. For example, for a horizontally moving target vehicle and/or horizontally moving weapon system, the boundary area may be shaped like a horizontally-elongated circle (or horizontally-elongated vehicle shape). In any of these examples, the boundary area may be defined in terms of angular coordinates (e.g., azimuth and altitude) from the perspective of the weapon 225.
  • The size of the boundary area determined in step 702 may be based on any combination of factors that may introduce uncertainty in the point of impact calculation of the weapon 225 with respect to the target. For instance, the size of the boundary area (e.g., in terms of angular degrees or coordinates) may be based on one or more of the target size, the distance between the weapon 225 and the target, the general accuracy and precision data for the weapon type 225 and ammunition type, and other factors such as wind, the vibration level of the weapon 225 during movement by the motor, and current movement of the weapon system 200 and/or the target. In scenarios where there is a high degree of confidence in the predictive accuracy of the weapon's crosshairs, the boundary area may be relatively small. In contrast, in scenarios of greater uncertainty in the relevant variables, where the confidence level in the predictive accuracy of the weapon's crosshairs is lower, the boundary area may be relatively large.
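  • As one hedged, simplified model of how such a boundary size might be computed, the sketch below lumps independent uncertainty sources into a single circular-Gaussian angular dispersion and solves for the offset at which the modeled hit likelihood equals the threshold P. Both the Gaussian model and the dispersion figures are assumptions, not the disclosed method.

```python
import math

def combined_sigma_mrad(*components_mrad: float) -> float:
    """Combine independent dispersion sources (weapon, wind, vibration) in quadrature."""
    return math.sqrt(sum(c ** 2 for c in components_mrad))

def boundary_radius_mrad(sigma_mrad: float, p_threshold: float) -> float:
    """Angular radius at which the modeled hit likelihood exp(-d^2 / (2*sigma^2))
    drops to the predetermined threshold P."""
    return sigma_mrad * math.sqrt(2.0 * math.log(1.0 / p_threshold))

# Example with invented dispersion components (milliradians) for weapon
# precision, wind uncertainty, and vibration during motor movement.
sigma = combined_sigma_mrad(0.3, 0.2, 0.4)
print(round(boundary_radius_mrad(sigma, 0.90), 3))  # ~0.247 mrad boundary radius
```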
  • In step 703, the targeting/firing system 210 engages the motor 235 to position and orient the weapon 225 toward the target point identified in step 701. Thus, step 703 may be similar or identical to step 405, discussed above. For example, referring back to FIG. 6, if the target 601 is stationary, then the engagement of the motor 235 may drive the position and orientation of the weapon 225 to a predicted point of impact at the stationary target point 606. If the target 601 is moving, then the engagement of the motor 235 may drive the position and orientation of the weapon 225 to a separate predicted future target point (e.g., 607) determined by a firing solution calculation based on predicted target movement and anticipated time delays until firing and impact.
  • In step 704, at a particular point of time when the motor 235 is engaged and the weapon 225 is moving, the targeting/firing system 210 may compute the projected point of impact if a projectile were fired from the weapon 225 at that time. The projected point of impact corresponds to the calculation of the crosshairs (e.g., 505 and 605) discussed above and shown in FIGS. 5 and 6. The calculation of the projected point of impact may be based on the specifications of the weapon system 200 and/or collected sensor data, such as the current position and orientation of the gun, the distance to target and bearing of the target from the weapon 225, the muzzle velocity of the weapon 225, the aerodynamic drag of the projectile to be fired, the current wind and weather conditions, and gravity (which may vary based on the current elevation).
  • In step 705, the targeting/firing system 210 may compare the projected point of impact computed in step 704 to the "confidence lock" boundary area defined in step 702. This may be a straightforward comparison of angular coordinates from the perspective of the weapon 225. If the current point of impact of the weapon 225 is projected to fall outside of the defined boundary area (705:No), then in step 706 the targeting/firing system 210 may disable the operator firing mechanism 246, thereby preventing the weapon 225 from being fired. However, if the current point of impact of the weapon 225 is projected to fall within the defined boundary area (705:Yes), then in step 707 the targeting/firing system 210 may enable (or re-enable) the operator firing mechanism 246, thereby allowing the operator to fire the weapon 225.
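  • The following sketch ties steps 704-707 together as a simple polling loop that gates the operator's fire control on the boundary test. Everything other than the circular-boundary geometry is a placeholder callable standing in for the systems described above; the names and polling period are assumptions.

```python
import math
import time

def inside_boundary(impact_az, impact_el, target_az, target_el, radius_rad):
    """Step 705: is the projected point of impact within the circular boundary area?"""
    return math.hypot(impact_az - target_az, impact_el - target_el) <= radius_rad

def gate_fire_control(get_projected_impact, target_point, radius_rad,
                      set_fire_enabled, motor_is_engaged, period_s=0.1):
    """Run steps 704-707 on a fixed schedule while the motor is engaged."""
    while motor_is_engaged():
        az, el = get_projected_impact()      # step 704: current projected impact
        t_az, t_el = target_point
        enabled = inside_boundary(az, el, t_az, t_el, radius_rad)
        set_fire_enabled(enabled)            # step 707 if True, step 706 if False
        time.sleep(period_s)
```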
  • In some embodiments, after the operator firing mechanism 246 has been re-enabled in step 707, and the operator fires on the target, the targeting/firing system 210 may be configured to perform a rapid post-firing command movement of the weapon 225 in order to further improve shot confidence. For instance, after the operator pushes the enabled firing mechanism 246, rather than immediately firing the weapon 225, the targeting/firing system 210 in some cases may engage the motor 235 for a short amount of time (e.g., 50 ms, 100 ms, 200 ms, etc.), in response to a determination that the corresponding small weapon movement may significantly increase shot confidence. These short post-firing command movements may be performed in the case of moving targets and/or moving weapon systems 200, in the event of a sudden change in the trajectory of the target, to correct for a lag in operator reaction time, and/or as part of a firing burst to increase hit probability.
  • Referring briefly to FIGS. 8A and 8B, two example user interface screens 800 are shown, during a process of engaging the motor 235 of a motorized weapon system 200 to position and orient the weapon 225 at a selected target point 806. In these examples, a circular "confidence lock" boundary area 807 has been defined by the targeting/firing system 210, outside of which firing of the weapon 225 is to be disabled. As shown in FIG. 8A, when the projected point of impact 805 of the weapon 225 falls outside of the boundary area 807, the operator may be unable to fire the weapon 225 (as indicated by the shaded fire button 810). In FIG. 8B, the motor 235 has now oriented the weapon 225 closer to the target point 806, and the projected point of impact 805 now falls within the boundary area 807. Therefore, as shown in FIG. 8B, the fire button is now re-enabled, allowing the weapon 225 to be fired by the operator. It is further noted in this example that the next button 815 and the safe button 820, which are discussed above in reference to FIGS. 5-6, are active and enabled regardless of the current orientation of the weapon 225.
  • As further shown in FIG. 7, the functionality of steps 704-707 may be performed multiple times while the motor 235 is engaged and the weapon 225 is moving toward the target point. In some embodiments, targeting/firing system 210 may perform steps 704-707 in a continuous loop at all times while the motor 235 is engaged, or in some cases even when the motor 235 is not engaged. Additionally or alternatively, the targeting/firing system 210 may be configured to initiate an instance of steps 704-707 in accordance with a schedule (e.g., every 100 ms, 200 ms, 500 ms, etc.).
  • As mentioned above, these steps may be performed periodically or continuously even when the motor 235 is not moving and the crosshairs 805 are fixed on the target point 806. In these scenarios, a new event such as a change in movement of the target 801 or the weapon system 200, an object obscuring the target 801, and/or new sensor readings (e.g., a change in wind conditions) may temporarily cause the probability level of the weapon 225 hitting the target to drop below the predetermined likelihood threshold and out of the confidence lock boundary area 807, requiring a minor adjustment via the motor 235 or other corrective action by the weapon system 200.
  • Using techniques similar to those discussed above in reference to FIG. 7, certain embodiments of a motorized weapon system 200 may implement a minimum confidence threshold for target selection and/or prioritization. In some cases, this minimum confidence threshold may be a separate determination from the level of confidence computed by the system 200 for identifying or verifying a target. Rather, this minimum confidence threshold may refer to the level of confidence that the weapon system 200 is able to hit the identified target. For example, if an identified and verified target is too far away from the weapon system 200, is moving too fast or too erratically, is too small, or is not within a sufficiently direct line-of-sight of the weapon 225, then the targeting/firing system 210 may determine that the confidence level that the weapon system 200 will hit the target is not sufficiently high to fire the weapon 225. Environmental conditions such as wind or weather conditions, lighting conditions, and/or other objects potentially obscuring the target object also may lower the confidence level computed by the targeting/firing system 210 for hitting the target. In such embodiments, when the confidence level computed by the targeting/firing system 210 falls below the predetermined threshold for a target, that target may be automatically deprioritized so that it is not selectable by the operator (or is selectable only via manual override). However, the targeting/firing system 210 may continue to monitor and dynamically track the low-confidence target, and may re-enable target selection and firing capabilities on that target as soon as the confidence level of hitting the target returns to above the minimum confidence threshold. The minimum confidence threshold is another operation-specific variable that may be altered based on the operation, the particular operator, the location, and other factors.
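  • A short sketch of this gating might look as follows, splitting tracked targets into operator-selectable and deprioritized lists based on the operation's minimum confidence threshold. The field and function names are hypothetical.

```python
def selectable_targets(tracked, min_confidence, manual_override=False):
    """Split tracked targets into operator-selectable and deprioritized lists."""
    selectable, deprioritized = [], []
    for t in tracked:
        if t.hit_confidence >= min_confidence or manual_override:
            selectable.append(t)
        else:
            # Still monitored and tracked; moved back to the selectable list
            # as soon as its hit confidence recovers above the threshold.
            deprioritized.append(t)
    return selectable, deprioritized
```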
  • In some embodiments, over the course of a particular operation (or multiple operations at or near the same location), the firing/targeting system 210 may continuously assess and evaluate its targeting accuracy, which may result in the system 210 increasing or decreasing the confidence levels it had previously computed for one or more selected targets. As an example, if a first target is initially determined to be too small and too far away to have a sufficiently high confidence level for firing on the target, the firing/targeting system 210 may instead select a number of closer targets and may fire on those targets. Then, by analyzing the firing trajectories and the accuracy of hitting the closer targets, the firing/targeting system 210 may be better able to evaluate the range, lighting, wind conditions, and the like, so that the confidence level for hitting the first target may be increased based on the accuracy feedback from the closer targets.
  • As demonstrated in the above examples, a motorized weapon system 200 may be weapon-agnostic, in that a weapon system 200 may support many different types or models of weapons 225, including various firearms, large caliber rifles, machine guns, autocannons, grenade launchers, rockets, and/or directed energy weapons such as lasers, high-power microwave emitters, and other undisclosed devices. Further, the targeting/firing system 210 may maintain weapon profiles in data store 214 and/or weapon-specific rules in data store 213, which allow the weapon system 200 to perform the techniques discussed herein in a similar or identical manner regardless of the current weapon type. In some embodiments, the targeting/firing system 210, sensor units 240, and the operator interface 245-250 may function identically regardless of the type of motor 235, mount 230, and weapon 225 integrated into the system 200. Because systems 200 having different types of weapons 225, mounts 230, and/or motors 235 may perform differently in some respects (e.g., time required to re-position and re-orient the weapon 225, maximum range of the weapon, type, size, and speed of projectiles fired, etc.), the targeting/firing system 210 may be configured to initially determine these weapon-specific data factors, and adjust the techniques described herein to provide a uniform operator experience.
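  • For illustration, a weapon/ammunition profile of the kind data store 214 might hold could be a simple record such as the following. Every field name and value here is a made-up example rather than a specification of any real weapon.

```python
# Hypothetical profile records keyed by weapon identifier; the targeting/firing
# system could read these factors at startup and adjust its behavior so that
# the operator experience stays uniform across weapon types.
WEAPON_PROFILES = {
    "example_rifle": {
        "muzzle_velocity_mps": 850.0,
        "max_effective_range_m": 1500.0,
        "drag_coefficient": 0.29,       # ballistic drag model input
        "slew_rate_deg_per_s": 60.0,    # how fast the motor/mount can re-aim
        "fire_latency_s": 0.05,         # delay between fire command and shot
    },
}

def profile_for(weapon_id: str) -> dict:
    return WEAPON_PROFILES[weapon_id]
```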
  • For instance, the targeting/firing system 210 of a first weapon system 200 may automatically select targets based on the firing range of the weapon 225 installed on that system 200, whereas a different system 200 might select more or fewer targets because it has a weapon 225 with a different range. In another example, a first weapon system 200 may prioritize a set of selected targets taking into account the speed of the motor 235 on that system 200, whereas a different system 200 might prioritize the same set of targets differently as a result of having a different motor speed (see the prioritization sketch below). As yet another example, different sensor units 240 having different numbers, types, and/or qualities of cameras and other sensors may provide different sets of input to the targeting/firing systems 210. As a result, a first weapon system 200 may have sufficient data to select and verify a target with high confidence, while a second weapon system 200 with different cameras/sensors 240 would not select the same target because it could not verify it with a sufficient confidence level. In all of these examples, the different behaviors of the weapon systems 200 resulting from different weapons 225, mounts 230, motors 235, and/or sensor units 240 may be entirely transparent to the operator. In some cases, operators of weapon systems 200 need not ever know what weapon 225 they are firing, and the entire operator interface may function identically regardless of the particular weapon, motor, mount, or sensor unit. These similarities may apply to the operator interface with respect to the kill chain sequence described in reference to FIG. 4, the enabling/disabling of the operator's firing mechanism based on the confidence lock area boundary described in reference to FIG. 7, the related technique of enforcing a minimum confidence threshold for targeting/firing discussed above, and all other techniques described herein.
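Continuing the hypothetical WeaponProfile sketch above, motor-speed-dependent prioritization might reduce to ordering targets by slew time; the pairing of each target with its angular offset is an assumed input format, not a disclosed one.

```python
def time_to_engage(angular_offset_deg: float, profile: WeaponProfile) -> float:
    """Seconds the motor needs to slew the mount onto a given target."""
    return angular_offset_deg / profile.mount_slew_rate_dps

def prioritize(targets, profile: WeaponProfile):
    """Order (target, angular_offset_deg) pairs so the quickest engagements come
    first; two systems with different motors order the same set differently."""
    return sorted(targets, key=lambda pair: time_to_engage(pair[1], profile))
```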
  • Additional techniques applicable to the above examples include the implementation of operation-specific rules of engagement that may be retrieved/received and enforced by the targeting/firing system 210. As discussed above, specific rules of engagement and/or operational parameters for the motorized weapon system may include different requirements or parameters for target identification and selection, different minimum confidence thresholds for firing the weapon 225, different target prioritization algorithms, and so on. In some embodiments, the motorized weapon system 200 may be configured to receive a set of operation-specific rules of engagement from a remote command center via a secure communication channel, and to store and apply those operation-specific rules during the appropriate operation. As noted above, specific rules of engagement and/or sets of operational parameters may be associated with specific operators, operator rank, engagement location (e.g., country, region, etc.), and the like. In some embodiments, operators having sufficient rank and/or authorization levels may be permitted to manually override certain rules of engagement and/or operational parameters of the weapon system 200, and to apply the operator's own preferred rules/parameters in their place. Additionally or alternatively, such overrides may require outside approval, and thus upon receiving a rule/parameter override request from the operator, the weapon system may be configured to transmit a secure request for override approval to a remote command center, as in the sketch below.
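A possible shape for the rule set and the override round trip is sketched here; the field names, the channel's send()/receive() interface, and the reply format are assumptions for illustration rather than a disclosed protocol.

```python
from dataclasses import dataclass

@dataclass
class RulesOfEngagement:
    operation_id: str
    min_confidence: float        # minimum hit confidence to enable firing
    warning_shot_required: bool  # require a warning shot before engaging
    allowed_target_types: tuple  # e.g., ("vehicle", "vessel")

def request_override(channel, operator, rule_name: str, new_value) -> bool:
    """Transmit a secure override request and wait for command-center approval.

    `channel` is assumed to be an established secure channel exposing
    send()/receive(); sufficiently ranked operators might instead be
    authorized to override locally without this round trip.
    """
    channel.send({"type": "override_request",
                  "operator_id": operator.id,
                  "operator_rank": operator.rank,
                  "rule": rule_name,
                  "requested_value": new_value})
    reply = channel.receive()
    return reply.get("approved", False)
```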
  • In several examples above, the target points for selected targets, including stationary and moving targets, are computed based on a desired point of impact location on the target (e.g., an engine of a boat or vehicle, the center of mass of an individual, etc.). However, in some embodiments, the targeting/firing system 210 may be configured with warning shot capabilities in which the desired point of impact location is not on the target. For instance, the rules of engagement enforced by the targeting/firing system 210 for a particular operation may dictate that only warning shots are to be fired at a particular selected target. Alternatively, such rules may dictate that at least one initial warning shot is to be fired at a selected target before an attempt is made to hit the target. In some cases, the operator controls 245 also may include a warning shot mode that can be activated by the operator, independent of the rules of engagement of the operation, to allow the operator to independently fire one or more warning shots on any selected target.
  • When the targeting/firing system 210 is configured to operate in a warning shot mode, the firing solution may be adjusted to ensure that the projectiles fired by the weapon 225 will miss the target. In some embodiments, the targeting/firing system 210 may determine the preferred location of a desired warning shot based on the type and size of the target (e.g., the number and position of warning shots for human targets may be different than for vehicle targets), the orientation and/or the direction of movement of the target (e.g., it may be desirable to fire a warning shot directly in front of the target), and so on, as in the sketch below.
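As a purely geometric illustration of placing a warning-shot impact point ahead of, but not on, a moving target, consider the sketch below; the standoff distances, target-type table, and flat-ground coordinates are hypothetical, and an actual firing solution would additionally account for ballistics and safety margins.

```python
import math

# Hypothetical standoff distances, in meters, by target type.
WARNING_OFFSET_M = {"person": 5.0, "vehicle": 10.0, "vessel": 15.0}

def warning_shot_point(target_xy, heading_rad: float, target_type: str):
    """Place the desired point of impact ahead of the target, off the target.

    target_xy:   (x, y) ground position of the target in meters.
    heading_rad: direction of target motion; the shot lands in front of it.
    """
    d = WARNING_OFFSET_M[target_type]
    x, y = target_xy
    return (x + d * math.cos(heading_rad), y + d * math.sin(heading_rad))
```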
  • Implementation of the techniques, blocks, steps and means described above may be done in various ways. For example, these techniques, blocks, steps and means may be implemented in hardware, software, or a combination thereof. For a hardware implementation, the processing units may be implemented within one or more application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, other electronic units designed to perform the functions described above, and/or a combination thereof.
  • Furthermore, embodiments may be implemented by hardware, software, scripting languages, firmware, middleware, microcode, hardware description languages, and/or any combination thereof. When implemented in software, firmware, middleware, scripting language, and/or microcode, the program code or code segments to perform the necessary tasks may be stored in a machine readable medium such as a storage medium. A code segment or machine-executable instruction may represent a procedure, a function, a subprogram, a program, a routine, a subroutine, a module, a software package, a script, a class, or any combination of instructions, data structures, and/or program statements. A code segment may be coupled to another code segment or a hardware circuit by passing and/or receiving information, data, arguments, parameters, and/or memory contents. Information, arguments, parameters, data, etc. may be passed, forwarded, or transmitted via any suitable means including memory sharing, message passing, token passing, network transmission, etc.
  • For a firmware and/or software implementation, the methodologies may be implemented with modules (e.g., procedures, functions, and so on) that perform the functions described herein. Any machine-readable medium tangibly embodying instructions may be used in implementing the methodologies described herein. For example, software codes may be stored in a memory. Memory may be implemented within the processor or external to the processor. As used herein the term “memory” refers to any type of long term, short term, volatile, nonvolatile, or other storage medium and is not to be limited to any particular type of memory or number of memories, or type of media upon which memory is stored.
  • Moreover, as disclosed herein, the term “storage medium” may represent one or more memories for storing data, including read only memory (ROM), random access memory (RAM), magnetic RAM, core memory, magnetic disk storage mediums, optical storage mediums, flash memory devices and/or other machine readable mediums for storing information. The term “machine-readable medium” includes, but is not limited to, portable or fixed storage devices, optical storage devices, and/or various other storage mediums capable of storing, containing, or carrying instruction(s) and/or data.
  • A computer system as illustrated in FIG. 9 may be incorporated as part of the previously described systems, such as to execute the operator interface, perform the functionality of the targeting/firing system 210, etc. FIG. 9 provides a schematic illustration of one embodiment of a computer system 900 that can perform various steps of the methods provided by various embodiments. It should be noted that FIG. 9 is meant only to provide a generalized illustration of various components, any or all of which may be utilized as appropriate. FIG. 9, therefore, broadly illustrates how individual system elements may be implemented in a relatively separated or relatively more integrated manner.
  • The computer system 900 is shown comprising hardware elements that can be electrically coupled via a bus 905 (or may otherwise be in communication, as appropriate). The hardware elements may include one or more processors 910, including without limitation one or more general-purpose processors and/or one or more special-purpose processors (such as digital signal processing chips, graphics acceleration processors, video decoders, and/or the like); one or more input devices 915, which can include without limitation a mouse, a keyboard, remote control, and/or the like; and one or more output devices 920, which can include without limitation a display device, a printer, and/or the like.
  • The computer system 900 may further include (and/or be in communication with) one or more non-transitory storage devices 925, which can comprise, without limitation, local and/or network accessible storage, and/or can include, without limitation, a disk drive, a drive array, an optical storage device, a solid-state storage device, such as a random access memory (“RAM”), and/or a read-only memory (“ROM”), which can be programmable, flash-updateable and/or the like. Such storage devices may be configured to implement any appropriate data stores, including without limitation, various file systems, database structures, and/or the like.
  • The computer system 900 might also include a communications subsystem 930, which can include without limitation a modem, a network card (wireless or wired), an infrared communication device, a wireless communication device, and/or a chipset (such as a Bluetooth™ device, an 802.11 device, a WiFi device, a WiMax device, cellular communication device, etc.), and/or the like. The communications subsystem 930 may permit data to be exchanged with a network (such as the network described below, to name one example), other computer systems, and/or any other devices described herein. In many embodiments, the computer system 900 will further comprise a working memory 935, which can include a RAM or ROM device, as described above.
  • The computer system 900 also can comprise software elements, shown as being currently located within the working memory 935, including an operating system 940, device drivers, executable libraries, and/or other code, such as one or more application programs 945, which may comprise computer programs provided by various embodiments, and/or may be designed to implement methods, and/or configure systems, provided by other embodiments, as described herein. Merely by way of example, one or more procedures described with respect to the method(s) discussed above might be implemented as code and/or instructions executable by a computer (and/or a processor within a computer); in an aspect, then, such code and/or instructions can be used to configure and/or adapt a general purpose computer (or other device) to perform one or more operations in accordance with the described methods.
  • A set of these instructions and/or code might be stored on a non-transitory computer-readable storage medium, such as the non-transitory storage device(s) 925 described above. In some cases, the storage medium might be incorporated within a computer system, such as computer system 900. In other embodiments, the storage medium might be separate from a computer system (e.g., a removable medium, such as a compact disc), and/or provided in an installation package, such that the storage medium can be used to program, configure, and/or adapt a general purpose computer with the instructions/code stored thereon. These instructions might take the form of executable code, which is executable by the computer system 900 and/or might take the form of source and/or installable code, which, upon compilation and/or installation on the computer system 900 (e.g., using any of a variety of generally available compilers, installation programs, compression/decompression utilities, etc.), then takes the form of executable code.
  • It will be apparent to those skilled in the art that substantial variations may be made in accordance with specific requirements. For example, customized hardware might also be used, and/or particular elements might be implemented in hardware, software (including portable software, such as applets, etc.), or both. Further, connection to other computing devices such as network input/output devices may be employed.
  • As mentioned above, in one aspect, some embodiments may employ a computer system (such as the computer system 900) to perform methods in accordance with various embodiments of the invention. According to a set of embodiments, some or all of the procedures of such methods are performed by the computer system 900 in response to processor 910 executing one or more sequences of one or more instructions (which might be incorporated into the operating system 940 and/or other code, such as an application program 945) contained in the working memory 935. Such instructions may be read into the working memory 935 from another computer-readable medium, such as one or more of the non-transitory storage device(s) 925. Merely by way of example, execution of the sequences of instructions contained in the working memory 935 might cause the processor(s) 910 to perform one or more procedures of the methods described herein.
  • The terms “machine-readable medium,” “computer-readable storage medium” and “computer-readable medium,” as used herein, refer to any medium that participates in providing data that causes a machine to operate in a specific fashion. These media may be non-transitory. In an embodiment implemented using the computer system 900, various computer-readable media might be involved in providing instructions/code to processor(s) 910 for execution and/or might be used to store and/or carry such instructions/code. In many implementations, a computer-readable medium is a physical and/or tangible storage medium. Such a medium may take the form of non-volatile media or volatile media. Non-volatile media include, for example, optical and/or magnetic disks, such as the non-transitory storage device(s) 925. Volatile media include, without limitation, dynamic memory, such as the working memory 935.
  • Common forms of physical and/or tangible computer-readable media include, for example, a floppy disk, a flexible disk, hard disk, magnetic tape, or any other magnetic medium, a CD-ROM, any other optical medium, any other physical medium with patterns of marks, a RAM, a PROM, EPROM, a FLASH-EPROM, any other memory chip or cartridge, or any other medium from which a computer can read instructions and/or code.
  • Various forms of computer-readable media may be involved in carrying one or more sequences of one or more instructions to the processor(s) 910 for execution. Merely by way of example, the instructions may initially be carried on a magnetic disk and/or optical disc of a remote computer. A remote computer might load the instructions into its dynamic memory and send the instructions as signals over a transmission medium to be received and/or executed by the computer system 900.
  • The communications subsystem 930 (and/or components thereof) generally will receive signals, and the bus 905 then might carry the signals (and/or the data, instructions, etc. carried by the signals) to the working memory 935, from which the processor(s) 910 retrieves and executes the instructions. The instructions received by the working memory 935 may optionally be stored on a non-transitory storage device 925 either before or after execution by the processor(s) 910.
  • It should further be understood that the components of computer system 900 can be distributed across a network. For example, some processing may be performed in one location using a first processor while other processing may be performed by another processor remote from the first processor. Other components of computer system 900 may be similarly distributed. As such, computer system 900 may be interpreted as a distributed computing system that performs processing in multiple locations. In some instances, computer system 900 may be interpreted as a single computing device, such as a distinct laptop, desktop computer, or the like, depending on the context.
  • The methods, systems, and devices discussed above are examples. Various configurations may omit, substitute, or add various procedures or components as appropriate. For instance, in alternative configurations, the methods may be performed in an order different from that described, and/or various stages may be added, omitted, and/or combined. Also, features described with respect to certain configurations may be combined in various other configurations. Different aspects and elements of the configurations may be combined in a similar manner. Also, technology evolves and, thus, many of the elements are examples and do not limit the scope of the disclosure or claims.
  • Specific details are given in the description to provide a thorough understanding of example configurations (including implementations). However, configurations may be practiced without these specific details. For example, well-known circuits, processes, algorithms, structures, and techniques have been shown without unnecessary detail in order to avoid obscuring the configurations. This description provides example configurations only, and does not limit the scope, applicability, or configurations of the claims. Rather, the preceding description of the configurations will provide those skilled in the art with an enabling description for implementing described techniques. Various changes may be made in the function and arrangement of elements without departing from the spirit or scope of the disclosure.
  • Also, configurations may be described as a process which is depicted as a flow diagram or block diagram. Although each may describe the operations as a sequential process, many of the operations can be performed in parallel or concurrently. In addition, the order of the operations may be rearranged. A process may have additional steps not included in the figure. Furthermore, examples of the methods may be implemented by hardware, software, firmware, middleware, microcode, hardware description languages, or any combination thereof. When implemented in software, firmware, middleware, or microcode, the program code or code segments to perform the necessary tasks may be stored in a non-transitory computer-readable medium such as a storage medium. Processors may perform the described tasks.
  • Having described several example configurations, various modifications, alternative constructions, and equivalents may be used without departing from the spirit of the disclosure. For example, the above elements may be components of a larger system, wherein other rules may take precedence over or otherwise modify the application of the invention. Also, a number of steps may be undertaken before, during, or after the above elements are considered.

Claims (20)

What is claimed is:
1. A semi-autonomous motorized weapon system, comprising:
a weapon capable of firing munitions;
a two-axis or three-axis mount configured to support and position the weapon;
a motor coupled to the mount and configured to move the mount to specified positions, thereby controlling the direction to which the weapon is aimed;
a manual firing mechanism coupled to the weapon;
a processing unit comprising one or more processors; and
memory coupled with and readable by the processing unit and storing therein a set of instructions which, when executed by the processing unit, causes the semi-autonomous motorized weapon system to:
determine a target point associated with a target, at a remote location from the weapon system;
determine an area having a boundary surrounding the target point, wherein the boundary of the area is determined by comparing a likelihood of the weapon hitting the target when aimed at the boundary to a predetermined likelihood threshold, such that the weapon, when aimed at any point within the area, has a likelihood of hitting the target higher than the predetermined likelihood threshold;
after determining the target point, engage the motor with instructions to move the mount from an initial position to a target position at which the weapon is aimed at the target point;
during the engagement of the motor:
(1) periodically determine, during the movement of the mount toward the target position, whether the weapon is aimed at a position within the area surrounding the target point;
(2) in response to determining, during the movement of the mount toward the target position, that the weapon is not aimed at a position within the area surrounding the target point, disable the manual firing mechanism of the weapon system to prevent firing of the weapon; and
(3) in response to determining, during the movement of the mount toward the target position, that the weapon is aimed at a position within the area surrounding the target point, enable the manual firing mechanism to allow firing of the weapon by an operator;
receive a firing command from an operator, via the manual firing mechanism; and
in response to the firing command being received at a time when the manual firing mechanism is enabled, fire the weapon.
2. The semi-autonomous motorized weapon system of claim 1, wherein determining the area surrounding the target point comprises:
calculating an angular distance, from the perspective of the weapon system, between the target point and the boundary of the area surrounding the target point.
3. The semi-autonomous motorized weapon system of claim 2, wherein the angular distance is calculated based on a first determined distance between the weapon system and the target point, and a second spherical radius distance determined based on characteristics of the target.
4. The semi-autonomous motorized weapon system of claim 1, wherein determining the area surrounding the target point comprises:
calculating a distance between the target point and the boundary of the area surrounding the target point, wherein the distance is calculated based on the size of the target.
5. The semi-autonomous motorized weapon system of claim 1, wherein determining the area surrounding the target point comprises:
calculating a distance between the target point and the boundary of the area surrounding the target point, wherein the distance is calculated based on characteristics of at least one of: the weapon, the mount, or the motor of the weapon system.
6. The semi-autonomous motorized weapon system of claim 1, wherein determining the area surrounding the target point comprises:
calculating a distance between the target point and the boundary of the area surrounding the target point, wherein the distance is calculated based on detected movement of the target.
7. The semi-autonomous motorized weapon system of claim 1, wherein determining the area surrounding the target point comprises:
calculating a distance between the target point and the boundary of the area surrounding the target point, wherein the distance is calculated based on movement of the weapon system.
8. The semi-autonomous motorized weapon system of claim 1, further comprising a display screen, wherein the memory stores additional instructions which, when executed by the processing unit, further causes the semi-autonomous motorized weapon system to:
output an augmented reality user interface via a display screen, the augmented reality user interface displaying an image of the target, a computer-generated indication of the target point, and a computer-generated indication of the boundary of the area surrounding the target point.
9. A method of operating a motorized weapon system, the method comprising:
determining, by a targeting system of the motorized weapon system, a target point associated with a target, at a remote location from the motorized weapon system;
determining, by the targeting system of the motorized weapon system, an area having a boundary surrounding the target point, wherein the boundary of the area is determined by comparing a likelihood of a weapon of the motorized weapon system hitting the target when aimed at the boundary to a predetermined likelihood threshold, so that the weapon, when aimed at any point within the area, has a likelihood of hitting the target higher than the predetermined likelihood threshold;
initiating, by the targeting system, engagement of a motor of the motorized weapon system, to move a mount from an initial position to a target position at which the weapon is aimed at the target point;
during the engagement of the motor:
(1) periodically determining by the targeting system, during the movement of the mount toward the target position, whether the weapon is aimed at a position within the area surrounding the target point;
(2) in response to determining by the targeting system, during the movement of the mount toward the target position, that the weapon is not aimed at a position within the area surrounding the target point, disabling a firing mechanism of the weapon system to prevent firing of the weapon; and
(3) in response to determining by the targeting system, during the movement of the mount toward the target position, that the weapon is aimed at a position within the area surrounding the target point, enabling the firing mechanism to allow firing of the weapon;
receiving, by the motorized weapon system, a firing command via the firing mechanism; and
in response to the firing command being received when the firing mechanism is enabled, initiating firing of the weapon.
10. The method of operating a motorized weapon system of claim 9, wherein determining the area surrounding the target point comprises:
calculating an angular distance, from the perspective of the motorized weapon system, between the target point and the boundary of the area surrounding the target point.
11. The method of operating a motorized weapon system of claim 10, wherein the angular distance is calculated based on a first determined distance between the motorized weapon system and the target point, and a second spherical radius distance determined based on characteristics of the target.
12. The method of operating a motorized weapon system of claim 9, wherein determining the area surrounding the target point comprises:
calculating a distance between the target point and the boundary of the area surrounding the target point, wherein the distance is calculated based on the size of the target.
13. The method of operating a motorized weapon system of claim 9, wherein determining the area surrounding the target point comprises:
calculating a distance between the target point and the boundary of the area surrounding the target point, wherein the distance is calculated based on characteristics of at least one of: the weapon, the mount, or the motor of the motorized weapon system.
14. The method of operating a motorized weapon system of claim 9, wherein determining the area surrounding the target point comprises:
calculating a distance between the target point and the boundary of the area surrounding the target point, wherein the distance is calculated based on detected movement of the target.
15. The method of operating a motorized weapon system of claim 9, wherein determining the area surrounding the target point comprises:
calculating a distance between the target point and the boundary of the area surrounding the target point, wherein the distance is calculated based on movement of the weapon system.
16. The method of operating a motorized weapon system of claim 9, further comprising:
outputting an augmented reality user interface via a display screen, the augmented reality user interface displaying an image of the target, a computer-generated indication of the target point, and a computer-generated indication of the boundary of the area surrounding the target point.
17. One or more non-transitory computer-readable media, comprising computer-executable instructions, which when executed by one or more processors, perform actions including:
determining, by a targeting system of a motorized weapon system, a target point associated with a target, at a remote location from the motorized weapon system;
determining, by the targeting system of the motorized weapon system, an area having a boundary surrounding the target point, wherein the boundary of the area is determined by comparing a likelihood of a weapon of the motorized weapon system hitting the target when aimed at the boundary to a predetermined likelihood threshold, so that the weapon, when aimed at any point within the area, has a likelihood of hitting the target higher than the predetermined likelihood threshold;
initiating, by the targeting system, engagement of a motor of the motorized weapon system, to move a mount from an initial position to a target position at which the weapon is aimed at the target point;
during the engagement of the motor:
(1) periodically determining by the targeting system, during the movement of the mount toward the target position, whether the weapon is aimed at a position within the area surrounding the target point;
(2) in response to determining by the targeting system, during the movement of the mount toward the target position, that the weapon is not aimed at a position within the area surrounding the target point, disabling a firing mechanism of the motorized weapon system to prevent firing of the weapon; and
(3) in response to determining by the targeting system, during the movement of the mount toward the target position, that the weapon is aimed at a position within the area surrounding the target point, enabling the firing mechanism to allow firing of the weapon;
receiving, by the motorized weapon system, a firing command via the firing mechanism; and
in response to the firing command being received when the firing mechanism is enabled, initiating firing of the weapon.
18. The non-transitory computer-readable media of claim 17, wherein determining the area surrounding the target point comprises:
calculating a distance between the target point and the boundary of the area surrounding the target point, wherein the distance is calculated based on detected movement of the target.
19. The non-transitory computer-readable media of claim 17, wherein determining the area surrounding the target point comprises:
calculating a distance between the target point and the boundary of the area surrounding the target point, wherein the distance is calculated based on movement of the motorized weapon system.
20. The non-transitory computer-readable media of claim 17, wherein the computer-executable instructions, when executed by the one or more processors, perform further actions including:
outputting an augmented reality user interface via a display screen, the augmented reality user interface displaying an image of the target, a computer-generated indication of the target point, and a computer-generated indication of the boundary of the area surrounding the target point.
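As a non-normative illustration of the geometry and gating recited in claims 1-3, the sketch below computes the angular radius of the area surrounding the target point from the range to the target and a target-derived spherical radius, then enables the firing mechanism only while the current aim error falls within that boundary; the function names and the polling comment are assumptions for illustration.

```python
import math

def lock_area_radius_rad(range_to_target_m: float, target_radius_m: float) -> float:
    """Angular distance from the target point to the area boundary (claims 2-3):
    from the weapon system's perspective, atan(spherical radius / range)."""
    return math.atan2(target_radius_m, range_to_target_m)

def firing_mechanism_enabled(aim_error_rad: float,
                             range_to_target_m: float,
                             target_radius_m: float) -> bool:
    """Enable the manual firing mechanism only while the weapon is aimed at a
    position within the area surrounding the target point (claim 1, (2)-(3))."""
    return aim_error_rad <= lock_area_radius_rad(range_to_target_m, target_radius_m)

# While the motor moves the mount toward the target position, the system would
# poll periodically, e.g.:
#   enabled = firing_mechanism_enabled(current_aim_error_rad, range_m, radius_m)
```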
US16/181,153 2017-11-03 2018-11-05 Semi-autonomous motorized weapon systems Abandoned US20190137219A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US16/181,153 US20190137219A1 (en) 2017-11-03 2018-11-05 Semi-autonomous motorized weapon systems

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201762581280P 2017-11-03 2017-11-03
US16/181,153 US20190137219A1 (en) 2017-11-03 2018-11-05 Semi-autonomous motorized weapon systems

Publications (1)

Publication Number Publication Date
US20190137219A1 true US20190137219A1 (en) 2019-05-09

Family

ID=66328447

Family Applications (1)

Application Number Title Priority Date Filing Date
US16/181,153 Abandoned US20190137219A1 (en) 2017-11-03 2018-11-05 Semi-autonomous motorized weapon systems

Country Status (4)

Country Link
US (1) US20190137219A1 (en)
EP (1) EP3704437A4 (en)
AU (1) AU2018423158A1 (en)
WO (1) WO2019221782A2 (en)

Citations (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3618456A (en) * 1968-09-12 1971-11-09 Rheinmetall Gmbh Firing zone limiting apparatus
US5834677A (en) * 1995-07-20 1998-11-10 Giat Industries Stabilizing device for a small fire arm
US5974940A (en) * 1997-08-20 1999-11-02 Bei Sensors & Systems Company, Inc. Rifle stabilization system for erratic hand and mobile platform motion
US6269730B1 (en) * 1999-10-22 2001-08-07 Precision Remotes, Inc. Rapid aiming telepresent system
US7210392B2 (en) * 2000-10-17 2007-05-01 Electro Optic Systems Pty Limited Autonomous weapon system
US8601736B2 (en) * 2008-11-04 2013-12-10 Tommy Andersson Method and a device for stabilizing aiming direction for rifles and handguns and fire arm
US20140360072A1 (en) * 2013-06-07 2014-12-11 John Hancock Lupher Precision Guided Firearm Including an Optical Scope Configured to Determine Timing of Discharge
US20150101229A1 (en) * 2012-04-11 2015-04-16 Christopher J. Hall Automated fire control device
US9033232B2 (en) * 2010-08-20 2015-05-19 Rocksight Holdings, Llc Active stabilization targeting correction for handheld firearms
US20150211828A1 (en) * 2014-01-28 2015-07-30 Trackingpoint, Inc. Automatic Target Acquisition for a Firearm
US9435603B2 (en) * 2014-04-16 2016-09-06 Hanwha Techwin Co., Ltd. Remote weapon system and control method thereof
US9612088B2 (en) * 2014-05-06 2017-04-04 Raytheon Company Shooting system with aim assist
US9784529B1 (en) * 2015-04-07 2017-10-10 Matthew G. Angle Small arms stabilization system
US20190145738A1 (en) * 2017-11-10 2019-05-16 Hanwha Land Systems Co., Ltd. Remote weapon control device and method for targeting and shooting multiple objects
US20190310042A1 (en) * 2018-04-04 2019-10-10 Sed C. HIMMICH Weapon lock control system

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100819801B1 (en) * 2006-03-03 2008-04-07 삼성테크윈 주식회사 Automatic shooting mechanism and sentry robot having the same
IL206142A0 (en) * 2010-06-02 2011-02-28 Rafael Advanced Defense Sys Firing mechanism security apparatus for remotely controlled automatic machine gun
IL211966A (en) * 2011-03-28 2016-12-29 Smart Shooter Ltd Firearm, aiming system therefor, method of operating the firearm and method of reducing the probability of missing a target
US8833231B1 (en) * 2012-01-22 2014-09-16 Raytheon Company Unmanned range-programmable airburst weapon system for automated tracking and prosecution of close-in targets
SA115360300B1 (en) * 2014-02-14 2017-08-29 ميريل افياشين، انك. Modular weapon station system
IL232828A (en) * 2014-05-27 2015-06-30 Israel Weapon Ind I W I Ltd Apparatus and method for improving hit probability of a firearm
FR3026174B1 (en) * 2014-09-24 2018-03-02 Philippe Levilly TELEOPERATED SYSTEM FOR SELECTIVE TARGET PROCESSING
AU2018423158A1 (en) * 2017-11-03 2020-05-21 Aimlock Inc. Semi-autonomous motorized weapon systems

Cited By (29)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019221782A3 (en) * 2017-11-03 2019-12-26 Aimlock Inc. Semi-autonomous motorized weapon systems
US11441874B2 (en) * 2017-11-10 2022-09-13 Hanwha Defense Co., Ltd. Remote weapon control device and method for targeting and shooting multiple objects
US20200182576A1 (en) * 2018-12-09 2020-06-11 Israel Weapon Industries (I.W.I.) Ltd. Firearm controlled by user behavior
US10900733B2 (en) * 2018-12-09 2021-01-26 Israel Weapon Industries (I.W.I) Ltd. Firearm controlled by user behavior
US20220349677A1 (en) * 2019-03-12 2022-11-03 P2K Technologies LLC Device for locating, sharing, and engaging targets with firearms
US11307575B2 (en) * 2019-04-16 2022-04-19 The Boeing Company Autonomous ground attack system
US20210097340A1 (en) * 2019-09-30 2021-04-01 Suzuki Motor Corporation Teaching Data Creation Device and Image Classification Device
US11274904B2 (en) 2019-10-25 2022-03-15 Aimlock Inc. Remotely operable weapon mount
US11499791B2 (en) 2019-10-25 2022-11-15 Aimlock Inc. Trigger and safety actuating device and method therefor
US20210140733A1 (en) * 2019-11-11 2021-05-13 Israel Weapon Industries (I.W.I.) Ltd. Firearm with automatic target acquiring and shooting
EP4273495A3 (en) * 2019-11-14 2024-01-24 BAE SYSTEMS plc A weapon system
US20220404832A1 (en) * 2019-11-14 2022-12-22 Bae Systems Plc A weapon system
US11860632B2 (en) * 2019-11-14 2024-01-02 Bae Systems Plc Weapon system
CN115038928A (en) * 2020-02-03 2022-09-09 贝以系统哈格伦斯公司 Embedded target tracking training
CN111609753A (en) * 2020-06-01 2020-09-01 中光智控(北京)科技有限公司 Trigger control method and system
US20210389088A1 (en) * 2020-06-10 2021-12-16 Jacob W. Bilbrey Autonomous + Automated Weapon System for Drones with Additional Linked Weapons
US11231252B2 (en) * 2020-06-10 2022-01-25 Brett C. Bilbrey Method for automated weapon system with target selection of selected types of best shots
US20210389071A1 (en) * 2020-06-10 2021-12-16 David H. Sitrick Automatic Weapon Subsystem Selecting Target, ID Target, Fire
US11525649B1 (en) * 2020-07-15 2022-12-13 Flex Force Enterprises Inc. Weapon platform operable in remote control and crew-served operating modes
US20220038262A1 (en) * 2020-07-31 2022-02-03 The United State Of America As Represented By The Secretary Of The Army Secure cryptographic system for datalinks
US11606194B2 (en) * 2020-07-31 2023-03-14 United States Government As Represented By The Secretary Of The Army Secure cryptographic system for datalinks
US20220205762A1 (en) * 2020-12-31 2022-06-30 Smart Shooter Ltd. Dual Mode Weapon-Mounted Fire Control System
WO2022144845A1 (en) * 2020-12-31 2022-07-07 Smart Shooter Ltd. Dual mode weapon-mounted fire control system
US11713944B2 (en) * 2020-12-31 2023-08-01 Smart Shooter Ltd. Dual mode weapon-mounted fire control system
US20230037964A1 (en) * 2021-08-09 2023-02-09 Allan Mann Firearm Safety Control System
US11933559B2 (en) * 2021-08-09 2024-03-19 Allan Mann Firearm safety control system
WO2023023253A1 (en) * 2021-08-19 2023-02-23 Raytheon Company Firing cutout rapid generation aided by machine learning
US20230056472A1 (en) * 2021-08-19 2023-02-23 Raytheon Company Firing cutout rapid generation aided by machine learning
US20230243622A1 (en) * 2022-01-31 2023-08-03 Robo Duels Inc. Method of operation of a mounted weapon and system for weapon stabilization and target tracking

Also Published As

Publication number Publication date
WO2019221782A3 (en) 2019-12-26
WO2019221782A9 (en) 2020-02-20
WO2019221782A2 (en) 2019-11-21
EP3704437A4 (en) 2021-07-28
AU2018423158A1 (en) 2020-05-21
EP3704437A2 (en) 2020-09-09

Similar Documents

Publication Publication Date Title
US20190137219A1 (en) Semi-autonomous motorized weapon systems
US11867479B2 (en) Interactive weapon targeting system displaying remote sensed image of target area
US8833231B1 (en) Unmanned range-programmable airburst weapon system for automated tracking and prosecution of close-in targets
CA2457669C (en) Autonomous weapon system
US9074847B1 (en) Stabilized weapon platform with active sense and adaptive motion control
EP3546879A1 (en) Imaging seeker for a spin-stabilized projectile
US11486677B2 (en) Grenade launcher aiming control system
JP2020502465A (en) Guided ammunition system for detecting off-axis targets
US9279643B2 (en) Preemptive countermeasure management
KR102489644B1 (en) Apparatus and method for Calculating real-time fire control command for 30 mm gatling gun
US20230215185A1 (en) Unmanned combat vehicle and target detection method thereof

Legal Events

Date Code Title Description
STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

AS Assignment

Owner name: AIMLOCK INC., COLORADO

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:BOCKMON, BRYAN STERLING;JOHNSTON, CORBIN CHASE;GALLIA, JASON R.;AND OTHERS;REEL/FRAME:049119/0098

Effective date: 20190507

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: PRE-INTERVIEW COMMUNICATION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE AFTER FINAL ACTION FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: ADVISORY ACTION MAILED

STCV Information on status: appeal procedure

Free format text: NOTICE OF APPEAL FILED

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION