EP2950035A1 - Method and system for a tactical visualization - Google Patents

Method and system for a tactical visualization

Info

Publication number
EP2950035A1 (application EP14461534.1A)
Authority
EP
European Patent Office
Prior art keywords
objects, interest, camera, computer, data
Legal status
Withdrawn
Application number
EP14461534.1A
Other languages
German (de)
French (fr)
Inventor
Jacek Paczkowski
Current Assignee
Patents Factory Ltd Sp zoo
Original Assignee
Patents Factory Ltd Sp zoo
Priority date: 2014-05-28
Filing date: 2014-05-28
Publication date: 2015-12-02
Application filed by Patents Factory Ltd Sp zoo
Priority to EP14461534.1A
Publication of EP2950035A1

Classifications

    • F: MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F41: WEAPONS
    • F41G: WEAPON SIGHTS; AIMING
    • F41G 3/00: Aiming or laying means
    • F41G 3/04: Aiming or laying means for dispersing fire from a battery; for controlling spread of shots; for coordinating fire from spaced weapons
    • F41G 3/06: Aiming or laying means with rangefinder
    • F41G 3/14: Indirect aiming means
    • F41G 3/16: Sighting devices adapted for indirect laying of fire
    • F41G 3/165: Sighting devices adapted for indirect laying of fire using a TV-monitor

Abstract

A method for a tactical visualization, the method comprising the steps of: identifying objects and objects of interest on an image acquired by a camera; generating a descriptor for each identified object; transmitting the collected data to a command center; awaiting data from the command center and receiving the data; processing the received information and superimposing it on the image from the camera; and providing movement guidance on the composite image and displaying the composite image on a visualizing means.

Description

  • The present invention relates to a method and system for a tactical visualization. In particular the present invention relates to an on-helmet apparatus or an on-gun apparatus.
  • Prior art defines the so-called head-up display: any transparent display that presents data without requiring users to look away from their usual viewpoints. The name stems from a pilot being able to view information with the head positioned "up" and looking forward, instead of angled down at lower instruments (source: Wikipedia).
  • Although they were initially developed for military aviation, HUDs are now used in commercial aircraft, automobiles, computer gaming, and other applications.
  • It would be advantageous to provide tactical visualization for personnel, especially in support of team operations.
  • A publication of US 20130229716 A1, entitled "Tactical riflescope with smartphone dock", discloses optical systems and devices that enable a mobile device (e.g., smartphone or other mobile phone, personal media player, and/or other personal electronic device) to be coupled with an optical device (e.g., a riflescope, spotting scope, etc.) such that information shown on the display of the mobile device is viewable to a user looking into the eyepiece of the optical device. Additionally or alternatively, an image from the optical device can be communicated to the mobile device. A modular design can utilize an apparatus configured to encase a mobile device, which can be coupled with the optical device via an optical and/or electrical interface.
  • The aim of the development of the present invention is an improved and cost-effective method and system for a tactical visualization.
  • An object of the present invention is a system for a tactical visualization, the system comprising: a data bus communicatively coupled to a memory; a controller communicatively coupled to the system bus; the system further comprising: means for visualizing tactical situation; at least one camera; wherein the controller is configured to process data from the camera in order to detect objects and identify objects of interest and assign locations to these objects versus a reference point in space as well as determine probabilities of handling each object of interest; a radio transceiver configured to transmit the processed data to a command center and to receive data from the command center; wherein the controller is configured to execute the steps of the method according to the present invention.
  • Preferably, the means for visualizing is an on-helmet display or an on-gun display or optical signaling means.
  • Preferably, the camera is configured to sense visible light as well as infrared light and is supplemented with microwave radar configured to detect objects.
  • Preferably, the controller processes data based on object recognition or movement detection or detection of sources of heat.
  • Preferably, the controller is connected to a user control means configured to allow a person using the system to control its functions.
  • Preferably, the system comprises an orientation sensor, a geolocation system and an inertial navigation system.
  • Preferably, the visualizing means is configured to provide augmented reality capability wherein a picture from the camera is mixed with information received from the command center and/or the controller.
  • Preferably, the system is configured to place graphical indicators in proximity to objects and objects of interest.
  • Preferably, the graphical indicators also include a location of a reference point, location of each of the objects of interest with reference to the reference point, identifier of a team member assigned to a given object of interest and priority of each object of interest.
  • Another object of the present invention is a method for a tactical visualization, the method comprising the steps of: identifying objects and objects of interest on an image acquired by a camera; generating a descriptor for each identified object; transmitting the collected data to a command center; awaiting data from the command center and receiving the data; processing the received information and superimposing it on the image from the camera; and providing movement guidance on the composite image and displaying the composite image on a visualizing means.
  • Preferably, the descriptor comprises type of the object, its location with reference to the camera, information regarding location of the system and orientation of the system.
  • Another object of the present invention is a computer program comprising program code means for performing all the steps of the computer-implemented method according to the present invention when said program is run on a computer.
  • Another object of the present invention is a computer readable medium storing computer-executable instructions performing all the steps of the computer-implemented method according to the present invention when executed on a computer.
  • These and other objects of the invention presented herein are accomplished by providing a method and system for a tactical visualization. Further details and features of the present invention, its nature and various advantages will become more apparent from the following detailed description of the preferred embodiments shown in a drawing, in which:
    • Fig. 1 presents an on-helmet system according to the present invention;
    • Figs. 2 A-C present examples of graphical indicators;
    • Fig. 3 presents an embodiment of an on-helmet system according to the present invention;
    • Fig. 4 presents a method according to the present invention; and
    • Fig. 5 shows an exemplary content presented on a display.
    NOTATION AND NOMENCLATURE
  • Some portions of the detailed description which follows are presented in terms of data processing procedures, steps or other symbolic representations of operations on data bits that can be performed in computer memory. Executing such logical steps in a computer therefore requires physical manipulation of physical quantities.
  • Usually these quantities take the form of electrical or magnetic signals capable of being stored, transferred, combined, compared, and otherwise manipulated in a computer system. For reasons of common usage, these signals are referred to as bits, packets, messages, values, elements, symbols, characters, terms, numbers, or the like.
  • Additionally, all of these and similar terms are to be associated with the appropriate physical quantities and are merely convenient labels applied to these quantities. Terms such as "processing" or "creating" or "transferring" or "executing" or "determining" or "detecting" or "obtaining" or "selecting" or "calculating" or "generating" or the like, refer to the action and processes of a computer system that manipulates and transforms data represented as physical (electronic) quantities within the computer's registers and memories into other data similarly represented as physical quantities within the memories or registers or other such information storage.
  • According to the present invention, a tactical operation environment is, for example, a firefighting operation, a police operation, a tactical game such as paintball, or the like.
  • In a tactical operation environment it is crucial that every person sees the targets (objects of interest in general) that are within reach and is also aware of the probability of servicing or eliminating such targets.
  • In other words, a person needs to be presented with a tactical image of the location of operation with superimposed information on targets. For example, in the case of police or military use, such information may relate to the probability of eliminating a target, with identification of the target's hit difficulty/probability level.
  • For example, suppose there are three targets within the weapon's range: one much closer than the others, a second farther away but at the same elevation as the shooter, and a third much higher than the shooter. The easiest target to hit is the closest one; the second target is more difficult, and the third is the most difficult of the three.
  • The task of the electronic equipment of a gun or helmet is to identify the location of potential objects of interest versus the location of the person holding the gun (or wearing the helmet), and to calculate the distance and the angle of elevation of the barrel. In case there is more than one object of interest, the system calculates the probability of servicing (or hitting, eliminating, handling) each of the objects of interest. For example, the object of interest that is easiest to service may have the highest priority. The aforementioned calculations shall be executed taking into account the skills of the particular person operating the gun.
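  • The disclosure names these calculations without giving formulas. The following is a minimal editorial Python sketch (not part of the original disclosure) of the geometry step, with a placeholder difficulty heuristic; the scoring formula and the skill parameter are assumptions for illustration only.

```python
import math

def engagement_geometry(shooter, target):
    """Distance and barrel elevation from shooter to target.

    `shooter` and `target` are (x, y, z) positions in metres in a shared
    local frame, z being height. Returns (slant distance, elevation in rad).
    """
    dx, dy, dz = (t - s for s, t in zip(shooter, target))
    ground_range = math.hypot(dx, dy)          # horizontal distance
    distance = math.hypot(ground_range, dz)    # slant distance
    elevation = math.atan2(dz, ground_range)   # barrel elevation angle
    return distance, elevation

def difficulty_score(distance, elevation, skill=1.0):
    # Placeholder heuristic: difficulty grows with range and with the
    # magnitude of the elevation angle, and shrinks with shooter skill.
    return distance * (1.0 + abs(elevation)) / skill

# Three targets as in the example above: close, far but level, and high up;
# the printed scores increase in that order.
shooter = (0.0, 0.0, 0.0)
for t in [(100, 0, 0), (350, 0, 0), (300, 0, 80)]:
    print(t, difficulty_score(*engagement_geometry(shooter, t)))
```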
  • As there may be teams of persons working in tactical operations of firefighters, police or army, it is beneficial to assign objects of interest at the level of teams. Different team members may be assigned different objects of interest based on the locations and skills of the team members. For example, in case two soldiers are at different distances to a target, the soldier farther from the target may receive the task of engaging the target if he is the better shooter and a hit with a single shot is more probable than for the soldier who is closer to the target. The determination of probabilities and abilities may also take into account the kinds of weapon and kinds of ammunition suitable for a task. Team members may be presented with additional information regarding the selection of means for a task, for example ammunition.
  • The system operates based on cameras, rangefinders, radars and the like and includes analysis of locations of detected objects of interest as well as team members executing a tactical operation.
  • As a result, there may be created a list of objects of interest related to the assets assigned to servicing these objects of interest.
  • Fig. 1 presents an on-helmet system according to the present invention. The system is installed as helmet electronic equipment.
  • The system may be realized using dedicated components or custom-made FPGA or ASIC circuits. The system comprises a data bus 101 communicatively coupled to a memory 104. Additionally, other components of the system are communicatively coupled to the data bus 101 so that they may be managed by a controller 106.
  • The memory 104 may store computer program or programs executed by the controller 106 in order to execute steps of the method according to the present invention.
  • Each member of a tactical team may be equipped with means for visualizing 102 tactical situation. These means may be, for example: an on-helmet display (e.g. LCD, OLED), an on-gun display (e.g. LCD, OLED), or optical signalling means attached to a gun, signalling that an object of interest has been assigned as well as the direction in which the gun shall be pointed. The optical signalling means may comprise four LEDs positioned after a gunsight, wherein the LEDs form a cross (i.e. a top LED, a bottom LED, a left LED and a right LED).
  • The system comprises at least one camera 109. Preferably the camera senses visible light as well as infrared light. Optionally the camera may be supplemented with microwave radars for detecting objects as well as missiles, bullets etc. Further, the controller 106 processes data from the camera(s) and optionally radar(s) in order to detect objects, identify objects of interest, assign locations to these objects versus a reference point in space, and determine probabilities of servicing (e.g. hitting) each object of interest (optionally also by different team members).
  • The radio transceiver 105 is responsible for transmitting processed data to a command center that may process a full tactical view of a location of operation. The system may also receive information from the command center (such as assigned objects of interest and priorities of actions for each team member).
  • A method for detecting objects and identifying objects of interest is based on object recognition, movement detection or detection of sources of heat with an infrared camera. The optical system may also work based on reference pictures for detection of new objects in a scene.
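  • The disclosure lists movement detection among the detection methods but does not specify an algorithm. As a purely illustrative sketch (not the patented method), simple frame differencing can flag candidate regions, assuming 8-bit grayscale frames:

```python
import numpy as np

def moving_object_mask(prev_frame, frame, threshold=25):
    """Crude movement detection by frame differencing.

    Both frames are 2-D uint8 grayscale arrays of equal shape. Returns a
    boolean mask that is True where the scene changed by more than
    `threshold` grey levels: candidate object-of-interest regions.
    """
    diff = np.abs(frame.astype(np.int16) - prev_frame.astype(np.int16))
    return diff > threshold
```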
  • The controller 106 may be connected to a user control means such as buttons or a joystick allowing the person using the system to control its functions, preferably without moving a rifle away from an arm. Such functions may include setup of the system, switching on/off, switching between different viewing modes.
  • In order to support precise information on location and orientation, the system comprises an orientation sensor 108, a geolocation system 103 (e.g. GLONASS or GPS) and an inertial navigation system 107. By using these means, the system as well as the command center are aware of the location of team members and of the direction towards which each team member's system is aimed.
  • A version of the system wherein the visualizing means 102 is a display may provide augmented reality capability, wherein a picture from the camera 109 is mixed with information received from the command center and/or the controller 106. Owing to this, a person will see graphical indicators placed in proximity to objects and objects of interest. Such graphical indicators may identify object status (e.g. civilian, enemy, ally etc.) and/or priorities and/or probabilities of properly servicing an object. The graphical indicators may distinguish the user's own objects of interest from the objects of interest of other team members.
  • A version of the system equipped with the optical signaling means will signal only one object of interest.
  • The graphical indicators may also include a location of a reference point; location of each of the objects of interest (with reference to the reference point); identifier of a team member assigned to a given object of interest; priority of each object of interest.
  • There may be provided several modes of operation of the system. In the first, "manual" mode, the commander decides on the allocation of objects of interest to individual subordinates. In the second, "automatic" mode, a computer analyzing the tactical situation allocates objects of interest based on the probability of servicing a particular object of interest by individual team members. In the third, "semiautomatic" mode, the objects of interest are assigned by the computer and subsequently authorized by the commander. In case certain objects of interest remain unallocated, the procedure of assignment of unassigned objects of interest starts again.
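  • A minimal sketch of how the three modes could be dispatched; the callables stand in for the real commander interface and the automatic allocator, none of which the disclosure defines:

```python
from enum import Enum, auto

class AssignmentMode(Enum):
    MANUAL = auto()         # commander allocates objects of interest himself
    AUTOMATIC = auto()      # computer allocates by probability of servicing
    SEMIAUTOMATIC = auto()  # computer proposes, commander authorizes

def assign(mode, auto_allocate, commander_allocate, commander_authorize):
    """Return the allocation of objects of interest for the chosen mode."""
    if mode is AssignmentMode.MANUAL:
        return commander_allocate()
    proposal = auto_allocate()
    if mode is AssignmentMode.SEMIAUTOMATIC:
        proposal = commander_authorize(proposal)
    return proposal
```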
  • Figs. 2 A-C present examples of graphical indicators presented as superimposed on an image captured by the camera 109. The exemplary indicator of Fig. 2A identifies an object of interest that should be handled as a third object, after handling objects '+1' and '+2'. The exemplary indicator of Fig. 2B identifies an object of interest that should be handled as a fifth object by another team member. The '-' sign refers to another team member. The exemplary indicator of Fig. 2C identifies an object of interest that should not be handled.
  • The graphical indicators may be different for different types of recognized objects such as persons, small vehicles, large vehicles and may have different configuration (such as color) for objects of interest, team members, civilians etc.
  • Fig. 3 presents an embodiment of an on-helmet system according to the present invention. The helmet 301 comprises a camera 109 and a visualization means 102 as indicated in Fig. 1. The remaining elements of the electronic system are built into the helmet 301 itself.
  • Fig. 4 presents a process of assigning objects of interest to team members. First, the objects and objects of interest are identified 401 by a suitable system on an image acquired by a camera; such a system is, for example, a video object recognition system. Subsequently, at step 402, the system generates a descriptor for each object. Such a descriptor may include, besides the type of the object, its location with reference to the camera, information regarding the location of the team member (the personal system) and the orientation of the helmet or gun (the personal system). Next, at step 403, the collected data are transmitted to a command center. In a simpler form, only the image from the camera may be transmitted to the command center.
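  • The structure of the descriptor is not fixed by the disclosure; the following dataclass is one hypothetical encoding, with all field names being the editor's assumptions:

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class ObjectDescriptor:
    """One detected object, as sent to the command center (steps 402-403)."""
    object_type: str                                  # e.g. "person", "small vehicle"
    offset_from_camera_m: Tuple[float, float, float]  # object position relative to the camera
    system_lat_lon_alt: Tuple[float, float, float]    # location of the personal system
    system_heading_deg: float                         # orientation of the helmet or gun
    system_pitch_deg: float

# Example: a person 40 m ahead of and slightly below the camera
d = ObjectDescriptor("person", (40.0, 0.0, -1.5), (52.40, 16.92, 85.0), 270.0, -2.0)
```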
  • Subsequently, at step 404, the system awaits data from the command center, which processes data received preferably from at least two transmitters carried by team members.
  • At the command center, the probability of handling each object of interest by each of the team members is calculated. In applications where weapons are involved, such a calculation may be executed for each carried weapon and each carried ammunition type. This means that, in the case of shooters, several probability values may be output for different configurations of weapon and ammunition. The combination of the best weapon and ammunition is preferably selected for a given object of interest.
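  • Selecting the best weapon/ammunition combination for one (team member, object of interest) pair reduces to a maximum over the per-configuration probabilities. A sketch under the assumption that those probabilities are already computed; the data model is illustrative:

```python
def best_configuration(configs):
    """Return the (weapon, ammunition) pair with the highest hit probability.

    `configs` maps (weapon, ammunition) -> probability; the disclosure does
    not define a data model, so this one is hypothetical.
    """
    (weapon, ammo), p = max(configs.items(), key=lambda kv: kv[1])
    return weapon, ammo, p

print(best_configuration({("rifle", "ball"): 0.7, ("rifle", "AP"): 0.8}))
```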
  • There may be created a table wherein each object of interest (row) is associated with probability results (columns) obtained for different team members. The table may also comprise an additional column providing the value of the highest probability of handling the object of interest among the team members.
  • The table may be sorted ascending according to this additional column. For the object of interest in the first row, the team member with the highest probability of handling that object of interest is selected. After a team member has been associated with the object of interest, the row is removed from the table and the table is re-sorted in the same manner as previously. The process is repeated until all table rows have been processed.
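  • The table procedure above amounts to a greedy assignment. A minimal sketch, assuming the probability table has already been built; the disclosure does not state whether one member may receive several objects, so this sketch allows it:

```python
def assign_objects(prob):
    """Greedy assignment from the probability table.

    `prob` maps object-of-interest id -> {team member id: highest probability
    of that member handling the object with their best weapon/ammunition}.
    The row whose best achievable probability is lowest is processed first,
    so the hardest objects get the best-suited members.
    Returns {object id: team member id}.
    """
    assignment = {}
    remaining = dict(prob)
    while remaining:
        # Re-sort what is left by the "highest probability" column (ascending)
        obj = min(remaining, key=lambda o: max(remaining[o].values()))
        member = max(remaining[obj], key=remaining[obj].get)
        assignment[obj] = member
        del remaining[obj]          # remove the processed row, then repeat
    return assignment

# Toy example: T2 has the lowest best probability, so it is assigned first
table = {"T1": {"A": 0.9, "B": 0.6}, "T2": {"A": 0.4, "B": 0.3}}
print(assign_objects(table))        # {'T2': 'A', 'T1': 'A'}
```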
  • After the objects of interest have been assigned to team members, the command center transmits to each team member information regarding the objects of interest assigned to that particular team member. Each team member's electronic system receives, at step 404, information from the command center.
  • The received information is processed and superimposed 405 on the image from camera 109 mounted on a helmet or on a gun.
  • Subsequently, at step 406, movement guidance, and in particular aiming guidance, is provided as part of the composite image data. For example, in the case of a gun, suitable graphical indicators may be displayed, such as arrows indicating the direction in which the gun is to be moved or aimed. After the gun has been aimed at an object of interest, the controller superimposes graphical indicators related to objects of interest, their priorities and/or probabilities.
  • Optionally, after the gun has been aimed at an object of interest, guiding indications 406 may be given with respect to the optical gunsight. The guiding indications are calculated based on ballistic algorithms and the ballistic parameters of the gun and ammunition.
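  • The ballistic algorithms themselves are not disclosed. For a flavour of the elevation correction, a drag-free model gives range R = v^2 * sin(2*theta) / g, so the hold-over angle is theta = (1/2) * arcsin(g*R / v^2); real algorithms would add drag, wind and target elevation:

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def hold_over_elevation(range_m, muzzle_velocity_ms):
    """Barrel elevation (radians) to reach `range_m` on flat ground,
    using the drag-free model R = v^2 * sin(2*theta) / g."""
    s = G * range_m / muzzle_velocity_ms ** 2
    if s > 1.0:
        raise ValueError("target beyond maximum range of this model")
    return 0.5 * math.asin(s)

# e.g. 300 m at 900 m/s muzzle velocity -> about 1.8 milliradians
print(hold_over_elevation(300.0, 900.0))
```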
  • A version of the system equipped with the optical signaling means will signal only one object of interest; after that object has been handled, another object of interest is indicated according to a list of objects of interest. The LEDs indicate the gun position correct for handling the object of interest. For example, in case the barrel is too low, the upper LED is red and the lower LED is green. In case the barrel is too high, the upper LED is green and the lower LED is red. In case the barrel is too far left, the left LED is green and the right LED is red. In case the barrel is too far right, the left LED is red and the right LED is green. When the gun is properly aimed, all four LEDs are green. Optionally, there may be an additional diode signaling the type of an object. Depending on the number of colors or illumination patterns, one diode may signal more than two different types of object.
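  • The LED colour scheme above is mechanical enough to transcribe directly. A sketch, assuming the aiming error is available as two signed angles (the sign conventions are the editor's choice, not the disclosure's):

```python
def led_cross(elev_error_rad, azim_error_rad, tol=0.002):
    """Colours of the four aiming LEDs, following the scheme described above.

    Negative `elev_error_rad` means the barrel is too low, positive too high;
    negative `azim_error_rad` means too far left, positive too far right.
    All four LEDs are green when the gun is properly aimed.
    """
    leds = {"top": "green", "bottom": "green", "left": "green", "right": "green"}
    if elev_error_rad < -tol:     # barrel too low
        leds["top"], leds["bottom"] = "red", "green"
    elif elev_error_rad > tol:    # barrel too high
        leds["top"], leds["bottom"] = "green", "red"
    if azim_error_rad < -tol:     # barrel too far left
        leds["left"], leds["right"] = "green", "red"
    elif azim_error_rad > tol:    # barrel too far right
        leds["left"], leds["right"] = "red", "green"
    return leds
```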
  • The system may comprise a manipulator in the form of, for example, a joystick or a touchpad, preferably operated by the thumb of the hand that pulls the trigger. The manipulator is preferably positioned such that the shooter does not have to move a hand properly set up for a shot in order to operate it. The middle, ring and/or little finger may operate additional controls (e.g. buttons).
  • Fig. 5 shows an exemplary content presented on a display. There is presented image from a camera 109 and superimposed information next to each object of interest.
  • It can be easily recognized by one skilled in the art that the aforementioned method for a tactical visualization may be performed and/or controlled by one or more computer programs. Such computer programs are typically executed by utilizing the computing resources of a computing device. Applications are stored on a non-transitory medium. An example of a non-transitory medium is a nonvolatile memory, for example flash memory, or a volatile memory, for example RAM. The computer instructions are executed by a processor. These memories are exemplary recording media for storing computer programs comprising computer-executable instructions performing all the steps of the computer-implemented method according to the technical concept presented herein.
  • While the invention presented herein has been depicted, described, and defined with reference to particular preferred embodiments, such references and examples of implementation in the foregoing specification do not imply any limitation on the invention. It will, however, be evident that various modifications and changes may be made thereto without departing from the broader scope of the technical concept. The presented preferred embodiments are exemplary only and are not exhaustive of the scope of the technical concept presented herein.
  • Accordingly, the scope of protection is not limited to the preferred embodiments described in the specification, but is only limited by the claims that follow.

Claims (13)

  1. A system for a tactical visualization, the system comprising:
    • a data bus (101) communicatively coupled to a memory (104);
    • a controller (106) communicatively coupled to the system bus (101);
    the system being characterized in that it comprises:
    • means for visualizing (102) tactical situation;
    • at least one camera (109);
    • wherein the controller (106) is configured to process data from the camera (109) in order to detect objects and identify objects of interest and assign locations to these objects versus a reference point in space as well as determine probabilities of handling each object of interest;
    • a radio transceiver (105) configured to transmit the processed data to a command center and to receive data from the command center;
    • wherein the controller (106) is configured to execute the steps of the method according to claim 10.
  2. The system according to claim 1 characterized in that the means for visualizing (102) is an on-helmet display or an on-gun display or optical signaling means.
  3. The system according to claim 1 characterized in that the camera (109) is configured to sense visible light as well as infrared light and is supplemented with microwave radar configured to detect objects.
  4. The system according to claim 1 characterized in that the controller (106) processes data based on object recognition or movement detection or detection of sources of heat.
  5. The system according to claim 1 characterized in that the controller (106) is connected to a user control means configured to allow a person using the system to control its functions.
  6. The system according to claim 1 characterized in that the system comprises an orientation sensor (108), a geolocation system (103) and an inertial navigation system (107).
  7. The system according to claim 1 characterized in that the visualizing means (102) is configured to provide augmented reality capability wherein a picture from the camera (109) is mixed with information received from the command center and/or the controller (106).
  8. The system according to claim 7 characterized in that the system is configured to place graphical indicators in proximity to objects and objects of interest.
  9. The system according to claim 8 characterized in that the graphical indicators also include a location of a reference point, location of each of the objects of interest with reference to the reference point, identifier of a team member assigned to a given object of interest and priority of each object of interest.
  10. A method for a tactical visualization, the method being characterized in that it comprises the steps of:
    • identifying (401) objects and objects of interest on an image acquired by a camera (109);
    • generating (402) a descriptor for each identified object;
    • transmitting (403) the collected data to a command center;
    • awaiting (404) data from the command center and receiving the data;
    • processing (405) the received information and superimposing it on the image from the camera (109); and
    • providing movement guidance (406) on the composite image and displaying the composite image on a visualizing means (102).
  11. The method according to claim 10 characterized in that the descriptor comprises type of the object, its location with reference to the camera (109), information regarding location of the system and orientation of the system.
  12. A computer program comprising program code means for performing all the steps of the computer-implemented method according to claim 10 when said program is run on a computer.
  13. A computer readable medium storing computer-executable instructions performing all the steps of the computer-implemented method according to claim 10 when executed on a computer.
EP14461534.1A (priority and filing date 2014-05-28), Method and system for a tactical visualization, status: Withdrawn, published as EP2950035A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
EP14461534.1A EP2950035A1 (en) 2014-05-28 2014-05-28 Method and system for a tactical visualization

Publications (1)

Publication Number: EP2950035A1 (en)
Publication Date: 2015-12-02

Family ID: 50897522

Family Applications (1)

Application Number Title Priority Date Filing Date
EP14461534.1A Withdrawn EP2950035A1 (en) 2014-05-28 2014-05-28 Method and system for a tactical visualization

Country Status (1)

Country Link
EP (1) EP2950035A1 (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000234897A (en) * 1999-02-17 2000-08-29 Mitsubishi Electric Corp Shooting target selector
FR2883396A1 (en) * 2005-03-21 2006-09-22 Giat Ind Sa Tasks e.g. enemy target, distribution assisting and coordinating system for e.g. first aid worker team, has communication units permitting evaluation of task feasibility, where distribution coinciding with local information is determined
US20080204361A1 (en) * 2007-02-28 2008-08-28 Science Applications International Corporation System and Method for Video Image Registration and/or Providing Supplemental Data in a Heads Up Display
US20100196859A1 (en) * 2009-02-01 2010-08-05 John David Saugen Combat Information System
US20140110482A1 (en) * 2011-04-01 2014-04-24 Zrf, Llc System and method for automatically targeting a weapon
US20130229716A1 (en) 2012-03-01 2013-09-05 Cubic Corporation Tactical riflescope with smartphone dock

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3376152A1 (en) * 2017-03-13 2018-09-19 MBDA Deutschland GmbH Information processing system and information processing method
DE102018106731A1 (en) * 2018-03-21 2019-09-26 Rheinmetall Electronics Gmbh Military device and method for operating a military device


Legal Events

Date Code Title Description
17P Request for examination filed

Effective date: 20150309

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

18D Application deemed to be withdrawn

Effective date: 20161201

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN