EP2950035A1 - Method and system for tactical visualization - Google Patents

Method and system for tactical visualization

Info

Publication number
EP2950035A1
Authority
EP
European Patent Office
Prior art keywords
objects
interest
camera
computer
data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP14461534.1A
Other languages
German (de)
English (en)
Inventor
Jacek Paczkowski
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Patents Factory Ltd Sp zoo
Original Assignee
Patents Factory Ltd Sp zoo
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Patents Factory Ltd Sp zoo filed Critical Patents Factory Ltd Sp zoo
Priority to EP14461534.1A priority Critical patent/EP2950035A1/fr
Publication of EP2950035A1 publication Critical patent/EP2950035A1/fr
Withdrawn legal-status Critical Current

Links

Images

Classifications

    • F MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F41 WEAPONS
    • F41G WEAPON SIGHTS; AIMING
    • F41G3/00 Aiming or laying means
    • F41G3/04 Aiming or laying means for dispersing fire from a battery; for controlling spread of shots; for coordinating fire from spaced weapons
    • F MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F41 WEAPONS
    • F41G WEAPON SIGHTS; AIMING
    • F41G3/00 Aiming or laying means
    • F41G3/06 Aiming or laying means with rangefinder
    • F MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F41 WEAPONS
    • F41G WEAPON SIGHTS; AIMING
    • F41G3/00 Aiming or laying means
    • F41G3/14 Indirect aiming means
    • F41G3/16 Sighting devices adapted for indirect laying of fire
    • F41G3/165 Sighting devices adapted for indirect laying of fire using a TV-monitor

Definitions

  • the present invention relates to a method and system for a tactical visualization.
  • the present invention relates to an on-helmet apparatus or an on-gun apparatus.
  • head-up display is any transparent display that presents data without requiring users to look away from their usual viewpoints.
  • the origin of the name stems from a pilot being able to view information with the head positioned "up” and looking forward, instead of angled down looking at lower instruments (source: Wikipedia).
  • HUDs are now used in commercial aircraft, automobiles, computer gaming, and other applications.
  • a publication of US 20130229716 A1 entitled “Tactical riflescope with smartphone dock” discloses optical systems and devices that enable a mobile device (e.g., smartphone or other mobile phone, personal media player, and/or other personal electronic device) to be coupled with an optical device (e.g., a riflescope, spotting scope, etc.) such that information shown on the display of the mobile device is viewable to a user looking into the eyepiece of the optical device. Additionally or alternatively, an image from the optical device can be communicated to the mobile device.
  • a modular design can utilize an apparatus configured to encase a mobile device, which can be coupled with the optical device via an optical and/or electrical interface.
  • the aim of the development of the present invention is an improved and cost effective method and system for a tactical visualization.
  • An object of the present invention is a system for a tactical visualization, the system comprising: a data bus communicatively coupled to a memory; a controller communicatively coupled to the system bus; the system further comprising: means for visualizing tactical situation; at least one camera; wherein the controller is configured to process data from the camera in order to detect objects and identify objects of interest and assign locations to these objects versus a reference point in space as well as determine probabilities of handling each object of interest; a radio transceiver configured to transmit the processed data to a command center and to receive data from the command center; wherein the controller is configured to execute the steps of the method according to the present invention.
  • the means for visualizing is an on-helmet display or an on-gun display or optical signaling means.
  • the camera is configured to sense visible light as well as infrared light and is supplemented with microwave radar configured to detect objects.
  • the controller processes data based on object recognition or movement detection or detection of sources of heat.
  • the controller is connected to a user control means configured to allow a person using the system to control its functions.
  • the system comprises an orientation sensor, a geolocation system and an inertial navigation system.
  • the visualizing means is configured to provide augmented reality capability wherein a picture from the camera is mixed with information received from the command center and/or the controller.
  • the system is configured to place graphical indicators in proximity to objects and objects of interest.
  • the graphical indicators also include a location of a reference point, location of each of the objects of interest with reference to the reference point, identifier of a team member assigned to a given object of interest and priority of each object of interest.
  • Another object of the present invention is a method for a tactical visualization, the method comprising the steps of: identifying objects and objects of interest on an image acquired by a camera; generating a descriptor for each identified object; transmitting the collected data to a command center; awaiting data from the command center and receiving the data; processing the received information and superimposing it on the image from camera; and providing movement guidance on the composite image and displaying the composite image on a visualizing means.
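The claimed sequence of steps can be sketched as a single processing cycle. The sketch below is purely illustrative: every function name (`detect_objects`, `describe`, `superimpose`, and so on) and the data shapes are assumptions for demonstration, not taken from the patent text.

```python
# Illustrative sketch of the claimed cycle; all names and data shapes
# are assumptions, not part of the patent.

def detect_objects(frame):
    # Placeholder detector: treat each distinct non-zero value in the
    # "frame" as one detected object id (identification step 401).
    return sorted(set(frame) - {0})

def describe(obj_id, camera_pose):
    # Generate a descriptor for an identified object (step 402).
    return {"object_id": obj_id, "type": "unknown", "camera_pose": camera_pose}

def superimpose(frame, assignments):
    # Mix the received assignments with the camera image (step 405).
    return {"frame": frame, "overlays": list(assignments)}

def add_movement_guidance(composite, assignments):
    # Provide movement/aiming guidance on the composite image (step 406).
    composite["guidance"] = [
        "aim at object %d" % a["object_id"] for a in assignments
    ]
    return composite

def run_visualization_cycle(frame, camera_pose, send, receive):
    """One cycle: identify, describe, transmit, receive, compose, guide."""
    descriptors = [describe(o, camera_pose) for o in detect_objects(frame)]
    send(descriptors)          # transmit collected data to the command center
    assignments = receive()    # await and receive data (step 404)
    composite = superimpose(frame, assignments)
    return add_movement_guidance(composite, assignments)
```

In practice `send` and `receive` would wrap the radio transceiver; here they are injected as plain callables so the cycle itself stays testable.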
  • the descriptor comprises type of the object, its location with reference to the camera, information regarding location of the system and orientation of the system.
  • Another object of the present invention is a computer program comprising program code means for performing all the steps of the computer-implemented method according to the present invention when said program is run on a computer.
  • Another object of the present invention is a computer readable medium storing computer-executable instructions performing all the steps of the computer-implemented method according to the present invention when executed on a computer.
  • these quantities take the form of electrical or magnetic signals capable of being stored, transferred, combined, compared, and otherwise manipulated in a computer system.
  • these signals are referred to as bits, packets, messages, values, elements, symbols, characters, terms, numbers, or the like.
  • tactical operation environment is for example firefighting operations, police operations, tactical games such as paintball or the like.
  • targets refer to objects of interest in general.
  • a person needs to be presented with a tactical image of a location of operation with superimposed information on targets.
  • information may relate to a probability of eliminating a target, with identification of a target hit difficulty/probability level.
  • For example, consider three targets: the first is closest to the shooter, the second is further away but at the same elevation as the shooter, and the third is much higher than the shooter. The target easiest to hit is then the closest target. The second target is more difficult and the third target is the most difficult of the three.
  • the task of the electronic equipment of a gun or helmet is to identify the location of potential objects of interest versus the location of the person holding the gun (or wearing a helmet), and to calculate a distance and an angle of elevation of the barrel.
  • the system calculates a probability of servicing (or hitting, eliminating, handling) each of the objects of interest. For example, the object of interest easiest to be serviced may have the highest priority.
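The distance-and-elevation ordering in the three-target example can be illustrated with a toy scoring function. The `hit_difficulty` formula below is purely an assumption (the patent does not specify one); it merely grows with range and with the magnitude of the barrel elevation angle.

```python
import math

def elevation_angle(shooter_alt, target_alt, horizontal_dist):
    # Barrel elevation angle in radians, ignoring ballistics.
    return math.atan2(target_alt - shooter_alt, horizontal_dist)

def hit_difficulty(shooter_alt, target_alt, horizontal_dist):
    # Toy score: grows with range and with the elevation offset.
    angle = elevation_angle(shooter_alt, target_alt, horizontal_dist)
    return horizontal_dist * (1.0 + abs(angle))

# The three-target example: (target altitude in m, horizontal distance in m).
targets = {
    "closest":  (0.0, 100.0),   # close, same elevation as the shooter
    "further":  (0.0, 300.0),   # further, same elevation
    "elevated": (50.0, 300.0),  # further and much higher
}
ranked = sorted(targets, key=lambda name: hit_difficulty(0.0, *targets[name]))
# ranked orders the targets from easiest to hardest to hit
```

With any monotone score of this shape, the ordering reproduces the example: closest first, then the further target at the same elevation, then the elevated one.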
  • the aforementioned calculations shall be executed taking into account skills of a particular person operating the gun.
  • Team members may be assigned different objects of interest based on location and skills of the team members. For example in case two soldiers are at different distances to a target then the soldier further from the target may receive a task of engaging the target in case he is a better shooter and a hit with a single shot is more probable than in case of the soldier who is closer to the target.
  • the determination of probabilities and abilities may also take into account kinds of weapon and kinds of ammunition suitable for a task.
  • Team members may be presented with additional information regarding selection of means for a task, for example ammunition.
  • the system operates based on cameras, rangefinders, radars and the like and includes analysis of locations of detected objects of interest as well as team members executing a tactical operation.
  • Fig. 1 presents an on-helmet system according to the present invention.
  • the system is installed as helmet electronic equipment.
  • the system may be realized using dedicated components or custom made FPGA or ASIC circuits.
  • the system comprises a data bus 101 communicatively coupled to a memory 104. Additionally, other components of the system are communicatively coupled to the system bus 101 so that they may be managed by a controller 106.
  • the memory 104 may store computer program or programs executed by the controller 106 in order to execute steps of the method according to the present invention.
  • Each member of a tactical team may be equipped with means for visualizing 102 tactical situation.
  • These means may be, for example: an on-helmet display (e.g. LCD, OLED), or an on-gun display (e.g. LCD, OLED), or optical signalling means attached to a gun, signalling that an object of interest has been assigned as well as the direction in which the gun shall be pointed.
  • the optical signalling means may comprise four LEDs positioned after a gunsight, wherein the LEDs form a cross (i.e. a top LED, a bottom LED, a left LED and a right LED).
  • the system comprises at least one camera 109.
  • the camera senses visible light as well as infrared light.
  • the camera may be supplemented with microwave radars for detecting objects as well as missiles, bullets etc.
  • the controller 106 processes data from camera(s) and optionally radar(s) in order to detect objects and identify objects of interest and assigning locations to these objects versus a reference point in space as well as determining probabilities of servicing (eg. hitting) each object of interest (optionally also by different team members).
  • the radio transceiver 105 is responsible for transmitting processed data to a command center that may process a full tactical view of a location of operation.
  • the system may also receive information from the command center (such as assigned objects of interest and priorities of actions for each team member).
  • a method for detecting objects and identifying objects of interest is based on object recognition, movement detection or detection of sources of heat with an infrared camera.
  • the optical system may also work based on reference pictures for detection of new objects in a scene.
  • the controller 106 may be connected to a user control means such as buttons or a joystick allowing the person using the system to control its functions, preferably without moving a rifle away from an arm.
  • Such functions may include setup of the system, switching on/off, switching between different viewing modes.
  • the system comprises an orientation sensor 108, a geolocation system 103 (eg. GLONASS or GPS) and an inertial navigation system 107.
  • a version of the system wherein the visualizing means 102 is a display may provide augmented reality capability, wherein a picture from the camera 109 is mixed with information received from the command center and/or the controller 106. Owing to this, a person will see graphical indicators placed in proximity to objects and objects of interest. Such graphical indicators may identify object status (e.g. civilian, enemy or ally) and/or priorities and/or probabilities of properly servicing an object. The graphical indicators may distinguish own objects of interest from objects of interest of other team members.
  • a version of the system equipped with the optical signaling means will signal only one object of interest.
  • the graphical indicators may also include a location of a reference point; location of each of the objects of interest (with reference to the reference point); identifier of a team member assigned to a given object of interest; priority of each object of interest.
  • in the first mode, the commander decides on the allocation of objects of interest to individual subordinates.
  • in the second mode, "automatic", a computer analyzing the tactical situation allocates objects of interest based on the probability of servicing particular objects of interest by individual team members.
  • in the third mode, "semiautomatic", the objects of interest are assigned by the computer and subsequently authorized by the commander. In case certain objects of interest remain unallocated, the procedure of assignment of unassigned objects of interest starts again.
  • Figs. 2 A-C present examples of graphical indicators presented as superimposed on image captured by the camera 109.
  • the exemplary indicator of Fig. 2A identifies an object of interest that should be handled as a third object, after handling objects '+1' and '+2'.
  • the exemplary indicator of Fig. 2B identifies an object of interest that should be handled as a fifth object by another team member.
  • the '-' sign refers to another team member.
  • the exemplary indicator of Fig. 2C identifies an object of interest that should not be handled.
  • the graphical indicators may be different for different types of recognized objects such as persons, small vehicles, large vehicles and may have different configuration (such as color) for objects of interest, team members, civilians etc.
  • Fig. 3 presents an embodiment of an on-helmet system according to the present invention.
  • the helmet 301 comprises a camera 109 and a visualization means 102 as indicated in Fig. 1.
  • the remaining elements of the electronic system are built into the helmet 301 itself.
  • Fig. 4 presents a process of assigning objects of interest to team members.
  • the objects and objects of interest are identified 401 by a suitable system, for example a video object recognition system, on an image acquired by a camera.
  • the system generates a descriptor for each object.
  • Such a descriptor may include, besides the type of the object, its location with reference to the camera, information regarding the location of a team member (the personal system) and the orientation of a helmet or gun (the personal system).
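A descriptor carrying the fields listed above might be modeled as a simple record before transmission. All field names and example values below are illustrative assumptions, not specified in the patent.

```python
from dataclasses import dataclass, asdict

# Hypothetical descriptor layout; field names are assumptions.
@dataclass
class ObjectDescriptor:
    object_type: str        # e.g. "person", "small vehicle"
    bearing_deg: float      # object location relative to the camera
    range_m: float          # distance from the camera to the object
    member_lat: float       # location of the personal system (team member)
    member_lon: float
    heading_deg: float      # orientation of the helmet or gun

d = ObjectDescriptor("person", 12.5, 240.0, 52.40, 16.92, 87.0)
payload = asdict(d)  # plain dict, ready to serialize and transmit
```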
  • the collected data are transmitted to a command center. In a simpler form only image from the camera may be transmitted to the command center.
  • the system awaits data from the command center which processes data received preferably from at least two transmitters carried by team members.
  • each object of interest (row) is associated with probability results (columns) obtained for different team members.
  • the table may also comprise an additional column providing the value of the highest probability of handling the object of interest among the team members.
  • the table may be sorted according to this additional column, ascending. For the object of interest in the first row, the team member that has the highest probability of handling that object of interest is selected. After a team member has been associated with the object of interest, the row is removed from the table and the table is resorted in the same manner as previously. The process is repeated until all table rows have been processed.
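The table procedure described above amounts to a greedy assignment and can be sketched as follows. The dictionary-based table layout and the function name are assumptions for illustration.

```python
def assign_objects(prob_table):
    """Greedy assignment following the table procedure described above.

    prob_table maps object_id -> {member_id: probability_of_handling}.
    The row whose best probability is lowest is processed first; each
    object is assigned the member with the highest probability for it,
    the row is removed, and the process repeats until the table is empty.
    """
    remaining = {obj: dict(row) for obj, row in prob_table.items()}
    assignments = {}
    while remaining:
        # Sorting ascending by the "highest probability" column and taking
        # the first row == picking the object with the smallest best value.
        obj = min(remaining, key=lambda o: max(remaining[o].values()))
        member = max(remaining[obj], key=remaining[obj].get)
        assignments[obj] = member
        del remaining[obj]  # remove the processed row and resort
    return assignments

table = {
    "T1": {"alice": 0.9, "bob": 0.4},
    "T2": {"alice": 0.3, "bob": 0.6},
}
result = assign_objects(table)
# T2's best probability (0.6) is lower than T1's (0.9), so T2 is handled
# first and assigned to bob; T1 is then assigned to alice.
```

Note that this sketch lets one team member receive several objects, since only rows (objects), not columns (members), are removed, which matches the described procedure.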
  • the command center transmits, to each team member, information regarding the objects of interest assigned to that particular team member.
  • Each team member's electronic system receives, at step 404, information from the command center.
  • the received information is processed and superimposed 405 on the image from camera 109 mounted on a helmet or on a gun.
  • at step 406, movement guidance, and in particular aiming guidance, is provided as a part of the composite image data.
  • movement guidance may comprise suitable graphical indicators, such as arrows indicating the direction in which the gun is to be moved or aimed.
  • the controller will superimpose graphical indicators related to objects of interest, their priorities and/or probabilities.
  • guiding indications 406 may be presented with respect to an optical gunsight.
  • the guiding indications are calculated based on ballistic algorithms and ballistic parameters of a gun and ammunition.
  • a version of the system equipped with the optical signaling means will signal only one object of interest and after handling that object another object of interest is indicated according to a list of objects of interest.
  • the LEDs indicate position of the gun correct for handling the object of interest. For example in case the barrel is too low, the upper LED is red and the lower LED is green. In case the barrel is too high, the upper LED is green and the lower LED is red. In case the barrel is too far left, the left LED is green and the right LED is red. In case the barrel is too far right, the left LED is red and the right LED is green. When the gun is properly aimed all four LEDs are green.
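The four-LED scheme described above reduces to a small decision table: the red LED marks the direction in which the barrel must be moved, and all four LEDs are green when the gun is properly aimed. The sign conventions (azimuth increasing to the right) and the tolerance are assumptions for illustration.

```python
def led_cross(barrel_az, barrel_el, target_az, target_el, tol=0.5):
    """Four-LED aiming cross (angles in degrees, tolerance illustrative).

    The red LED marks the direction in which the barrel must be moved,
    matching the description; all four LEDs are green when aimed.
    """
    leds = {"top": "green", "bottom": "green", "left": "green", "right": "green"}
    if target_el - barrel_el > tol:    # barrel too low: upper LED red
        leds["top"] = "red"
    elif barrel_el - target_el > tol:  # barrel too high: lower LED red
        leds["bottom"] = "red"
    if target_az - barrel_az > tol:    # barrel too far left: right LED red
        leds["right"] = "red"
    elif barrel_az - target_az > tol:  # barrel too far right: left LED red
        leds["left"] = "red"
    return leds
```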
  • there may be an additional diode signaling the type of an object. Depending on the number of colors or illuminating patterns, one diode may signal more than two different types of an object.
  • the system may comprise a manipulator in a form of for example a joystick or a touchpad, preferably operated by a thumb of a hand that pulls a trigger.
  • the manipulator is preferably positioned such that in order to operate it a shooter does not have to move a hand properly set up for a shot.
  • the middle and/or ring and/or little finger may operate additional controls (eg. buttons).
  • Fig. 5 shows an exemplary content presented on a display. There is presented image from a camera 109 and superimposed information next to each object of interest.
  • the aforementioned method for a tactical visualization may be performed and/or controlled by one or more computer programs.
  • Such computer programs are typically executed by utilizing the computing resources in a computing device.
  • Applications are stored on a non-transitory medium.
  • An example of a non-transitory medium is a non-volatile memory, for example a flash memory, or a volatile memory, for example RAM.
  • the computer instructions are executed by a processor.
  • These memories are exemplary recording media for storing computer programs comprising computer-executable instructions performing all the steps of the computer-implemented method according to the technical concept presented herein.
EP14461534.1A 2014-05-28 2014-05-28 Procédé et système pour visualisation tactique Withdrawn EP2950035A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
EP14461534.1A EP2950035A1 (fr) 2014-05-28 2014-05-28 Procédé et système pour visualisation tactique

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
EP14461534.1A EP2950035A1 (fr) 2014-05-28 2014-05-28 Procédé et système pour visualisation tactique

Publications (1)

Publication Number Publication Date
EP2950035A1 true EP2950035A1 (fr) 2015-12-02

Family

ID=50897522

Family Applications (1)

Application Number Title Priority Date Filing Date
EP14461534.1A Withdrawn EP2950035A1 (fr) 2014-05-28 2014-05-28 Procédé et système pour visualisation tactique

Country Status (1)

Country Link
EP (1) EP2950035A1 (fr)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3376152A1 (fr) * 2017-03-13 2018-09-19 MBDA Deutschland GmbH Système de traitement d'informations et procédé de traitement d'informations
DE102018106731A1 (de) * 2018-03-21 2019-09-26 Rheinmetall Electronics Gmbh Militärisches Gerät und Verfahren zum Betreiben eines militärischen Gerätes

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000234897A (ja) * 1999-02-17 2000-08-29 Mitsubishi Electric Corp 射撃目標選定装置
FR2883396A1 (fr) * 2005-03-21 2006-09-22 Giat Ind Sa Procede de coordination et d'aide a la repartition de taches au sein d'une equipe d'operateurs
US20080204361A1 (en) * 2007-02-28 2008-08-28 Science Applications International Corporation System and Method for Video Image Registration and/or Providing Supplemental Data in a Heads Up Display
US20100196859A1 (en) * 2009-02-01 2010-08-05 John David Saugen Combat Information System
US20130229716A1 (en) 2012-03-01 2013-09-05 Cubic Corporation Tactical riflescope with smartphone dock
US20140110482A1 (en) * 2011-04-01 2014-04-24 Zrf, Llc System and method for automatically targeting a weapon



Similar Documents

Publication Publication Date Title
US20230152059A1 (en) Viewing optic with round counter system
EP4220069A1 (fr) Système d'affichage pour optique de visualisation
CA3181919A1 (fr) Optique de visualisation dotee d'une interface d'activateur
US20090205239A1 (en) System and Method for Determining Target Range and Coordinating Team Fire
CA3025778A1 (fr) Reticule configurable par motif
US20220326596A1 (en) Imaging system for firearm
US11662176B2 (en) Thermal gunsights
EP2950035A1 (fr) Procédé et système pour visualisation tactique
KR101386643B1 (ko) 무기 방렬 보조 장치 및 무기 방렬 보조 방법
US20240102773A1 (en) Imaging enabler for a viewing optic
US20240069323A1 (en) Power Pack for a Viewing Optic
US20240068776A1 (en) Systems and controls for an enabler of a viewing optic
RU2747740C1 (ru) Способ автоматизированного целеуказания на поле боя с доразведкой цели
WO2024040262A1 (fr) Optique de visualisation à énergie solaire ayant un système d'affichage intégré
WO2024020538A1 (fr) Additionneurs d'élévation pour une optique de visualisation avec un système d'affichage intégré

Legal Events

Date Code Title Description
17P Request for examination filed

Effective date: 20150309

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

18D Application deemed to be withdrawn

Effective date: 20161201

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN