WO2016184563A1 - Method for determining the flight trajectory of a foreign drone - Google Patents

Method for determining the flight trajectory of a foreign drone

Info

Publication number
WO2016184563A1
Authority
WO
WIPO (PCT)
Prior art keywords
drone
self
foreign
sensor
determined
Prior art date
Application number
PCT/EP2016/000808
Other languages
German (de)
English (en)
Inventor
Michael Tüchler
Markus Oberholzer
Original Assignee
Rheinmetall Air Defence Ag
Priority date
Filing date
Publication date
Application filed by Rheinmetall Air Defence Ag filed Critical Rheinmetall Air Defence Ag
Publication of WO2016184563A1

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B64: AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U: UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U10/00: Type of UAV
    • B64U10/10: Rotorcrafts
    • B64U10/13: Flying platforms
    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05D: SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00: Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/0094: Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots, involving pointing a payload, e.g. camera, weapon, sensor, towards a fixed or moving target
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B64: AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U: UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U2101/00: UAVs specially adapted for particular uses or applications
    • B64U2101/30: UAVs specially adapted for particular uses or applications for imaging, photography or videography
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B64: AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U: UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U2201/00: UAVs characterised by their flight controls
    • B64U2201/10: UAVs characterised by their flight controls, autonomous, i.e. by navigating independently from ground or air stations, e.g. by using inertial navigation systems [INS]

Definitions

  • The invention relates to a method for determining the trajectory of an externally controlled drone, a so-called foreign drone, with the features of claim 1.
  • LSS flying objects are "low, slow, small" flying objects, that is to say low-flying, slow-flying and small.
  • LSS flying objects can be remote-controlled floating platforms, so-called multicopters with multiple propellers or rotors. There are floating platforms with, for example, four, six, eight or twelve propellers.
  • LSS flying objects can also be designed as remote-controlled helicopters. LSS flying objects can be moved at low speed near the ground, along facades or near trees, making them difficult to detect.
  • Foreign drones can penetrate into sensitive areas and cause disruptions there.
  • Foreign drones could carry out flights over government buildings, power plants, infrastructure or military facilities.
  • Foreign drones, or drones in general, can carry data recording devices within their payload, e.g. optical sensors such as cameras, so that in an unauthorized overflight of sensitive areas the foreign drones can perform reconnaissance.
  • Foreign drones can also cause damage, be it by dangerous payloads or by a collision within the sensitive area. It is therefore of interest to prevent, or at least to detect, the unauthorized intrusion of foreign drones into sensitive areas in order to be able to initiate countermeasures.
  • From the prior art, a method for the detection of foreign drones is known.
  • The method is used in securing buildings to detect drones by means of different types of sensor devices; this is intended to reduce false alarms.
  • A space to be monitored is observed by means of at least one omnidirectional sensor device of the first type, the omnidirectional sensor device being sensitive in the optical frequency range.
  • The sensor device of the first type has a panorama sensor with a curved mirror surface and a CCD sensor.
  • The space is further monitored by means of at least one omnidirectional sensor device of the second type.
  • The omnidirectional sensor device of the second type is sensitive in the radio-wave and/or microwave range.
  • A radio sensor is provided which is sensitive in the radio frequency range.
  • The optical sensor devices are used for spatial location.
  • The space is monitored with two sensor devices each of the first and second type, the location of the signal origin in a measurement plane or in three-dimensional space being determined from the received signals for each type of sensor device. This determines the position of the drone.
  • The sensor devices of the second type determine radio control data of the drone. It is checked whether the received signals are, at least in part, drone-typical signals or signal sequences. Based on a temporal sequence of the received signals of an object, a movement pattern of the object is created, and it is checked whether this is a movement pattern typical for a drone. It is also checked whether the signals received by the sensor devices of the first and second type correlate with respect to the place of origin. When the signals correlate, an alarm is triggered.
  • The sensor devices are arranged stationary and are, for example, mounted on a mast or arranged on existing infrastructure of a building.
  • This method is not yet optimally designed.
  • The disadvantage is that a foreign drone is only detected once it has penetrated into the specific space to be monitored.
  • Outside this space, the sensor devices can no longer provide data regarding the trajectory of the foreign drone.
  • Once the foreign drone has left the sensitive area, i.e. the area to be monitored, after a reconnaissance mission, it is no longer detectable due to the limited sensor range. It is therefore no longer possible to detect a landing location of the foreign drone that lies outside the surveillance area.
  • It is of interest to capture foreign drones because, on the one hand, they themselves represent a certain value and, on the other hand, important data is often stored on a storage medium of the foreign drone. A radio transmission from the foreign drone often does not take place, so that the foreign drone is not unmasked during its reconnaissance mission by the sensor devices of the second type.
  • From the prior art, a drone (UAV) is also known by means of which a target in the airspace, on the ground or on water can be monitored and tracked.
  • The system determines the best flight path to enable the best possible camera shots while minimizing the risk of being discovered.
  • The drone can be operated in a stalker mode. Using a stalker system, the distance to the target is kept between a minimum distance and a maximum distance.
  • The drone has a video camera and other sensors.
  • The video signal of the drone is compared with previous recordings to locate possible targets. These sensors include, among others, a position sensor, a speed sensor, an acceleration sensor and other kinematic sensors.
  • There is also a command module containing an environment model with mission environment data, such as topographical maps of the surroundings, country histories, and the like.
  • The invention is therefore based on the object of designing and further developing the aforementioned method so that the determination of the trajectory and/or the determination of the landing location of the foreign drone is improved.
  • Particularly predestined landing sites are already stored in advance in an environment model.
  • Protected landing sites, for example below bridges, can be stored in the environment model.
  • This selection of the most suitable landing sites can already be made in advance and does not have to wait until the foreign drone has intruded.
  • The particularly predestined landing locations can also be determined by means of the image recognition of the self-drone.
  • The predestined landing locations can be determined during the pursuit or during a previous reconnaissance flight. This increases the chances that the actual landing location of the foreign drone, and thus also the further trajectory, can be determined more accurately.
  • A self-drone is used to determine the position of the foreign drone.
  • The self-drone is equipped with a position sensor and with a sensor arrangement.
  • The position of the self-drone is determined by means of the position sensor.
  • The foreign drone is detected by the sensor arrangement. This can be done, in particular, by adapting the trajectory of the self-drone, and in particular also the orientation of the sensor, to the position of the foreign drone.
  • The current position of the foreign drone is determined.
  • The foreign drone is tracked with the self-drone.
  • Because the foreign drone is tracked with the self-drone, the area to be monitored can be significantly expanded. This reduces the risk that the foreign drone leaves the area to be monitored, as the monitored area is widened by the pursuit.
  • The determination of the trajectory, and in particular the determination of the landing location of the foreign drone, is improved. It is conceivable that the self-drone permanently monitors a sensitive area from the air, or rises only when necessary or after an advance warning in order to protect the sensitive area.
  • At least one stationary sensor can be provided, by means of which a primary area is monitored. As soon as a foreign drone has been detected in this primary area by the at least one stationary sensor, or it has been determined that a foreign drone has penetrated into the primary area with a certain probability, the self-drone is started; the self-drone then follows the foreign drone, and an extended secondary area is monitored during the pursuit. It is conceivable that, in addition to the self-drone, several sensors, in particular stationary sensors, are provided which monitor this first or primary area.
  • The self-drone further preferably has a communication module, at least the position of the self-drone being transmitted to a ground station by means of the communication module.
  • In particular, the position of the self-drone is determined by the position sensor and sent via the communication module to a ground station.
  • The ground station can track the changes in position of the self-drone, and thus of the foreign drone, in order to determine the landing site.
  • The position of the self-drone is here used to determine the approximate position of the foreign drone.
  • The self-drone can follow the foreign drone at a fixed safety distance.
  • A certain safety distance is thereby maintained.
  • The position of the foreign drone can therefore be approximated by a circle around the position of the self-drone, with a radius equal to the safety distance.
  • The smaller the safety distance, the more precisely the position of the foreign drone can be bounded.
  • The foreign drone can be tracked until the landing site is detected.
  • The landing site is often identical to the starting location of the foreign drone. Therefore, it is possible, where appropriate, to approach the operating personnel of the foreign drone, or to obtain further information about the operating personnel, once the landing site and starting location have been determined.
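The circle approximation described above can be sketched in a few lines of Python. This is an illustrative sketch only; the coordinates and function names are assumptions, not part of the patent.

```python
import math

def foreign_drone_bound(self_pos, safety_distance_m):
    """Bound the foreign drone's position: a circle centred on the
    self-drone's own (position-sensor) position, with a radius equal
    to the safety distance kept during the pursuit."""
    return {"center": self_pos, "radius_m": safety_distance_m}

def uncertainty_area_m2(safety_distance_m):
    """Area of that circle: the smaller the safety distance, the more
    precisely the foreign drone's position is bounded."""
    return math.pi * safety_distance_m ** 2

# Example: self-drone at some (lat, lon), pursuing at 20 m safety distance.
bound = foreign_drone_bound((47.05, 8.30), 20.0)
```

A 20 m safety distance confines the foreign drone to a far smaller disc than a 50 m distance, which is why a tighter pursuit yields a better position estimate.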
  • Alternatively, a position of the foreign drone differing from the position of the self-drone can be transmitted.
  • This more accurate position specification of the foreign drone is determined on the basis of the sensor data. It is also possible, in particular, for sensor data recorded by the sensor arrangement to be transmitted to the ground station.
  • The position sensor can be designed, for example, as a sensor for a global navigation satellite system such as GPS or Galileo.
  • The position sensor can also be designed as a mobile radio device and use radio mast bearings.
  • The sensor arrangement preferably has a plurality of sensors, each having a different field of view in a different spatial orientation.
  • The sensor arrangement preferably has at least one optical sensor.
  • This optical sensor can be designed as a camera, in particular as a video camera or TV camera.
  • The corresponding camera may include a CCD sensor or the like.
  • The sensor arrangement can in particular have a camera system with at least one camera, preferably with several cameras.
  • The sensor arrangement can be configured such that at least one of the sensors, preferably a plurality of sensors, can be moved about at least one spatial axis in order to increase the field of view. The movement may be a pitching, rotating, pivoting or scanning movement, or the like.
  • An image recognition is used, the sensor data of the optical sensor being compared by means of the image recognition with a previously stored environment model.
  • If the sensor data can be assigned to a corresponding object in the environment model, the position of the self-drone and/or the position of the foreign drone can be determined.
  • The distance of the objects to the self-drone can be estimated.
  • A determination of the position of the self-drone can take place once several different objects have been detected by means of the image recognition or image processing.
  • If the image recognition recognizes that the foreign drone at least partially obscures a certain object, then it is known that the foreign drone is located between the object and the position of the self-drone.
  • The position of the foreign drone can be estimated from this.
  • The distance between the foreign drone and the self-drone can likewise be estimated.
  • The sensor arrangement of the respective self-drone further comprises a radar transmitter and a radar receiver.
  • By means of this radar sensor, it is possible to gain additional, accurate distance information.
  • Speeds of the foreign drone can be calculated by coupling this distance information to time data.
  • Detected foreign drones can, for example, also be categorized and evaluated with respect to their flight altitude over ground and their speed.
  • The environment model of the area to be monitored is stored in a data processing system of the self-drone.
  • Alternatively or additionally, the environment model is stored at the location of the ground station.
  • The type and/or the design of the foreign drone is preferably determined. For this purpose, information regarding several types of drones is preferably stored in a memory of the self-drone and/or in a suitable memory at the ground station. This information may in particular include information regarding the range and/or the maximum flight duration of the foreign drones. This makes it possible to determine a maximum flight range of the foreign drone. This maximum flight range can be referred to as the action zone and is formed by the zone maximally reachable by the foreign drone. The landing site can only lie within the action zone or at its edge.
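A minimal Python sketch of such a type lookup follows. The type names and performance figures are invented for illustration and are not taken from the patent.

```python
# Hypothetical drone-type database: names and figures are assumptions.
DRONE_TYPES = {
    "quadcopter-small": {"range_km": 2.0, "max_flight_min": 25.0},
    "hexacopter":       {"range_km": 5.0, "max_flight_min": 40.0},
}

def action_zone_radius_km(drone_type, drone_types=DRONE_TYPES):
    """Return the maximum flight range of the identified type; this radius
    bounds the action zone, and the landing site can only lie within the
    zone or at its edge."""
    info = drone_types.get(drone_type)
    if info is None:
        # Unknown type: conservatively assume the largest known range.
        return max(t["range_km"] for t in drone_types.values())
    return info["range_km"]
```

Falling back to the largest known range for an unrecognized type keeps the action zone conservative, so no possible landing site is wrongly excluded.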
  • Weather data are taken into account in the calculation of the action zone of the foreign drone.
  • These weather data can be determined by weather sensors.
  • These weather sensors can be assigned to the self-drone.
  • Alternatively, these weather sensors may be arranged stationary, the weather data being either processed by the ground station or, in a preferred embodiment, transmitted to the self-drone.
  • The current and/or predicted wind direction and/or wind speed is taken into account in the determination of the maximum flight range.
  • Foreign drones, especially those of vertical take-off design, are strongly influenced by the wind. It is more likely that the landing point lies leeward, i.e. downwind, than windward, i.e. upwind. The action zone is therefore larger on the leeward side than on the windward side.
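The wind-skewed action zone can be sketched as a direction-dependent radius. The linear scaling with wind speed and the sensitivity factor `k` are illustrative assumptions, not the patent's formula.

```python
import math

def wind_adjusted_radius_km(base_radius_km, wind_speed_ms,
                            bearing_deg, wind_to_deg, k=0.05):
    """Scale the action-zone radius along a given bearing: larger leeward
    (downwind), smaller windward (upwind). bearing_deg is the direction
    considered from the start point; wind_to_deg is the direction the
    wind blows towards. k is an assumed sensitivity per m/s of wind."""
    diff = math.radians(bearing_deg - wind_to_deg)
    return base_radius_km * (1.0 + k * wind_speed_ms * math.cos(diff))
```

With an 8 m/s wind blowing towards 90 degrees, the downwind radius grows and the upwind radius shrinks relative to the still-air range, reproducing the leeward/windward asymmetry described above.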
  • The respective current positions of the self-drone and the foreign drone can also be displayed on a conventional display device, such as a screen, in order to be able to direct connected emergency services.
  • Suitable landing sites can likewise be displayed. It is conceivable that the action zone is also displayed. It is now possible, with the self-drone or with other self-drones, to fly to these predestined landing places and, in particular, to monitor them.
  • With its sensors, the self-drone captures, completely or in sections, the area to be monitored, i.e. the corresponding sensitive area and its edge areas, and continuously determines its own position. A foreign drone intruding into the sensitive area is detected by the sensors of the self-drone. The self-drone determines the current position of the foreign drone and approaches the foreign drone, in particular up to a safety distance.
  • The control of the self-drone takes place autonomously, in particular during the tracking.
  • The self-drone is equipped with a data processing system.
  • The self-drone is controlled by this data processing system.
  • The self-drone control is performed by a control program executed by a data processing unit in the self-drone.
  • This control program contains the instruction to follow the foreign drone up to the safety distance.
  • The safety distance can be, for example, 20 meters.
  • The self-drone meanwhile transmits the position of the foreign drone to the ground station. This transmission to the ground station takes place until the foreign drone has reached the landing site.
  • Alternatively, the data processing system controlling the self-drone may be designed as part of a ground station, with control commands being transmitted from the ground station to the self-drone.
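One step of such a pursuit (hold the foreign drone between a 20 m safety distance and a maximum distance) can be sketched in 2D as follows. The step size and the simple proportional logic are assumptions for illustration, not the patent's control law.

```python
import math

def pursuit_step(self_pos, foreign_pos, min_dist=20.0, max_dist=100.0, step=5.0):
    """One control step: move towards the foreign drone when farther than
    max_dist, back off when closer than min_dist (the safety distance),
    otherwise hold position. Positions are (x, y) in meters."""
    dx = foreign_pos[0] - self_pos[0]
    dy = foreign_pos[1] - self_pos[1]
    dist = math.hypot(dx, dy)
    if dist == 0.0:
        return self_pos
    ux, uy = dx / dist, dy / dist      # unit vector towards the foreign drone
    if dist > max_dist:
        move = min(step, dist - max_dist)    # close the gap
    elif dist < min_dist:
        move = -min(step, min_dist - dist)   # retreat to the safety distance
    else:
        move = 0.0                           # inside the allowed band: hold
    return (self_pos[0] + ux * move, self_pos[1] + uy * move)
```

Called once per control cycle with the latest position estimates, this keeps the self-drone inside the distance band while the foreign drone moves.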
  • Fig. 1 shows, in a schematic representation, an environment to be monitored with a foreign drone and with a self-drone.
  • The invention now relates to a method for determining the trajectory 5 of a foreign drone 6.
  • By means of a sensor arrangement, in particular by means of at least one optical sensor, the foreign drone 6 is detected.
  • The optical sensor, or the sensor arrangement, is indicated here schematically by a detection area 7.
  • A self-drone 8 is used to determine the position of the foreign drone 6. The self-drone 8 is equipped with a sensor arrangement, in particular with at least one optical sensor, and with a position sensor. The position of the self-drone 8 is determined with the position sensor, and the foreign drone 6 is detected by the sensor arrangement. On the basis of the determined position of the self-drone 8 and/or the sensor data of the sensor arrangement, the current position of the foreign drone 6 is determined, and the foreign drone 6 is followed by the self-drone 8.
  • This has the advantage that the trajectory 5 can be better determined. In particular, a landing location 9 can be better determined.
  • The self-drone 8 is preferably designed as an LSS flying object.
  • The foreign drone 6 may in particular also be designed as an LSS flying object.
  • Such LSS flying objects can be remote-controlled floating platforms, so-called multicopters with multiple propellers or rotors. Furthermore, it is conceivable to use a helicopter as the LSS flying object.
  • An environment model is preferably stored in advance.
  • The environment model contains in particular information about existing buildings 1, 2, 3, the trees 4 and the like.
  • The self-drone 8 detects the foreign drone 6 and follows it.
  • The control of the self-drone 8 is preferably carried out autonomously.
  • The self-drone 8 preferably has a data processing system.
  • The flight route of the self-drone 8 is determined by means of the data processing system.
  • The self-drone 8 flies autonomously.
  • The flight route of the self-drone 8 is determined as a function of the trajectory 5 of the foreign drone 6.
  • The flight route of the self-drone 8 is determined such that the distance between the self-drone 8 and the foreign drone 6 does not exceed a maximum distance.
  • The flight route of the self-drone 8 is also determined such that a minimum distance, i.e. a corresponding safety distance, is not fallen below.
  • The safety distance can be, for example, 20 meters.
  • The maximum distance may be, for example, more than 20 meters, in particular 50, 70, 80 or 100 meters.
  • The speed of the self-drone 8 is preferably adapted to the speed of the foreign drone 6.
  • The direction of flight of the self-drone 8 is preferably adapted to the direction of flight of the foreign drone 6.
  • The sensor arrangement of the self-drone 8 preferably has at least one optical sensor.
  • A plurality of sensors can be provided.
  • The sensors may have different detection areas 7.
  • The at least one optical sensor can be designed as a camera, in particular as a video camera or TV camera.
  • The optical sensor may comprise a CCD sensor or the like, or be designed as a CCD sensor.
  • The sensors can be formed by a camera system with at least one camera, in particular with a plurality of cameras. It is conceivable that the sensors can be moved about a spatial axis in order to change the detection area 7.
  • The detection area 7 may be changed by a pitching motion, a rotation, a pivoting movement, a scan or the like. As a result, the detection of the foreign drone 6 is improved.
  • The position of the self-drone 8 is determined by a position sensor.
  • The position sensor can use GPS data, Galileo, a GSM bearing or a radio mast bearing.
  • The buildings 1, 2 may, for example, lie in a sensitive area.
  • The buildings 1, 2 may, for example, be government buildings, or be formed by power station and infrastructure facilities or by military facilities.
  • The method according to the invention increases the chances that access to the foreign drone 6 is obtained.
  • The type of the foreign drone 6 is first determined by means of the data processing system of the self-drone 8.
  • A database with information about drone types is stored in the data processing system.
  • This database may contain information on size, drive, range, maximum altitude, maximum flight duration, and the like.
  • The sensor data determined by means of the sensor arrangement are evaluated by means of an image recognition.
  • The evaluated information is compared with the information stored in the database so that the foreign drone 6 can be assigned a drone type, or at least a group of drone types. Based on this assignment, a maximum range of the foreign drone 6 can now be determined.
  • This maximum range is marked as an action zone in the environment model. It is now compared, in particular, which of the suitable landing places 9, 10, 11, 12 lie within the action zone. For example, it is conceivable that the landing places 9, 11 and 12 are within the action zone, but the landing place 10 to the left of the building 1 is outside the action zone. Therefore, the landing place 10 can in principle be excluded as an actually possible or suitable landing place.
  • The landing locations 9, 10, 11 and 12 can each be assigned different probabilities.
  • The probability of the landing place 10 would be, for example, 0 percent.
  • The assignment of the probabilities can depend on the altitude and the distance of the foreign drone 6 to the landing sites 9, 11, 12, and on the speed of the foreign drone 6. For example, a small distance of the foreign drone 6 to the landing site 9 and a low altitude can lead to a higher probability for the landing location 9 compared to the landing locations 11 and 12.
  • The probabilities of the landing places 9, 10, 11, 12 can be continuously updated.
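One way to realize such a probability assignment is sketched below in Python. The inverse-distance and altitude weighting is an illustrative assumption, not the patent's exact rule; the keys 9 to 12 follow the reference numerals of the example above.

```python
import math

def landing_site_probabilities(foreign_pos, altitude_m, sites):
    """sites maps a reference numeral to ((x, y), in_action_zone).
    Sites outside the action zone get probability 0 (like landing place 10);
    inside the zone, a smaller distance and a lower altitude raise the
    weight. Weights are normalized to sum to 1."""
    weights = {}
    for name, (pos, in_zone) in sites.items():
        if not in_zone:
            weights[name] = 0.0
            continue
        d = math.hypot(pos[0] - foreign_pos[0], pos[1] - foreign_pos[1])
        weights[name] = 1.0 / (1.0 + d) / (1.0 + altitude_m / 100.0)
    total = sum(weights.values())
    return {n: (w / total if total else 0.0) for n, w in weights.items()}

# Example: site 10 lies outside the action zone, site 9 is nearest.
probs = landing_site_probabilities(
    foreign_pos=(0.0, 0.0), altitude_m=30.0,
    sites={9: ((50.0, 0.0), True), 10: ((-300.0, 0.0), False),
           11: ((200.0, 0.0), True), 12: ((400.0, 0.0), True)})
```

Re-running this function with each new position estimate yields the continuous update of the probabilities mentioned above.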
  • The landing locations 9, 10, 11, 12 are displayed at a ground station. It is conceivable to also display the magnitude of the probabilities.
  • The position determination of the foreign drone 6 is also carried out by means of the at least one optical sensor; by means of the image recognition, it is evaluated in which area of the environment model the foreign drone 6 is currently located and/or at what distance from the self-drone 8 the foreign drone 6 moves.
  • Weather data are taken into account in the calculation or determination of the action zone.
  • The weather data can be determined by sensors of the self-drone 8 itself or provided by stationary sensors.
  • The weather data include at least the wind speed and the wind direction.
  • The environment model contains in particular 3D information about the environment to be overflown.
  • The 3D environment model contains elevation information.
  • The self-drone 8 is in contact with a ground station via a communication module.
  • The communication module allows a radio connection to the ground station.
  • At least the position of the self-drone 8 and/or the position of the foreign drone 6 is transmitted to the ground station.
  • The action zone and/or the landing locations 9, 10, 11, 12 are transmitted to the ground station.
  • Alternatively, the action zone and/or the landing locations 9, 10, 11, 12 are determined by the ground station.
  • The transmission of the positions of the self-drone 8 and/or the foreign drone 6 is preferably carried out continuously, in particular periodically.
  • The detected flight duration of the foreign drone 6 can be included in the calculation of the action zone.
  • The current flying altitude of the foreign drone 6 can also be included in the determination of the action zone.
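The influence of the observed flight duration can be sketched as follows. The cruise speed is an assumed parameter here, since the patent only states that the flight duration (and altitude) can be included in the action-zone calculation.

```python
def remaining_range_km(max_flight_min, observed_flight_min, cruise_speed_kmh):
    """Remaining endurance (the type's maximum flight duration minus the
    flight time already observed) times an assumed cruise speed bounds how
    far the foreign drone can still fly, shrinking the action zone."""
    remaining_min = max(0.0, max_flight_min - observed_flight_min)
    return cruise_speed_kmh * remaining_min / 60.0
```

A drone with 40 minutes of endurance that has already been airborne for 10 minutes can, at an assumed 40 km/h cruise speed, reach at most 20 km from its current position.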

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Mechanical Engineering (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The invention relates to a method for determining the flight trajectory (5) of a foreign drone (6), the foreign drone (6) being detected by means of at least one sensor arrangement, and a self-drone (8) being used to determine the position of the foreign drone (6), the self-drone (8) being equipped with a sensor arrangement and with a position sensor. The position of the self-drone (8) is determined by the position sensor, the foreign drone (6) being detected by the sensor arrangement, the position of the foreign drone (6) being determined on the basis of the determined position of the self-drone (8) and/or on the basis of the sensor data of the sensor arrangement, the foreign drone (6) being followed by the self-drone (8), and an environment model being stored. The determination of the flight trajectory and/or the determination of the landing location of the foreign drone (6) is improved in that the environment model comprises suitable landing locations (9, 10, 11, 12).
PCT/EP2016/000808 2015-05-18 2016-05-17 Procédé de détermination de la trajectoire en vol d'un drone ennemi WO2016184563A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102015006233.6A DE102015006233B4 (de) 2015-05-18 2015-05-18 Verfahren zur Bestimmung der Flugbahn einer Fremddrohne
DE102015006233.6 2015-05-18

Publications (1)

Publication Number Publication Date
WO2016184563A1 true WO2016184563A1 (fr) 2016-11-24

Family

ID=56008571

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2016/000808 WO2016184563A1 (fr) 2015-05-18 2016-05-17 Procédé de détermination de la trajectoire en vol d'un drone ennemi

Country Status (2)

Country Link
DE (1) DE102015006233B4 (fr)
WO (1) WO2016184563A1 (fr)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11064363B2 (en) 2016-10-11 2021-07-13 Whitefox Defense Technologies, Inc. Systems and methods for cyber-physical vehicle management, detection and control
US11134380B2 (en) 2016-10-11 2021-09-28 Whitefox Defense Technologies, Inc. Systems and methods for cyber-physical vehicle management, detection and control
US11487017B2 (en) * 2019-03-21 2022-11-01 Alberta Centre For Advanced Mnt Products Drone detection using multi-sensory arrays
US11558743B2 (en) 2018-09-05 2023-01-17 Whitefox Defense Technologies, Inc. Integrated secure device manager systems and methods for cyber-physical vehicles

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11017680B2 (en) 2015-09-30 2021-05-25 Alarm.Com Incorporated Drone detection systems
DE102019104866A1 (de) * 2019-02-26 2020-08-27 Rheinmetall Waffe Munition Gmbh Drohne sowie Verfahren zur Zielbekämpfung

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2007091992A2 (fr) * 2005-01-21 2007-08-16 The Boeing Company Affichage de conscience situationnelle
US20090157233A1 (en) * 2007-12-14 2009-06-18 Kokkeby Kristen L System and methods for autonomous tracking and surveillance
DE102007062603A1 (de) * 2007-12-22 2009-07-02 Diehl Bgt Defence Gmbh & Co. Kg Verfahren und Vorrichtung zur Detektion Daten sendender Fahrzeuge
EP3012659A2 (fr) * 2014-10-22 2016-04-27 Honeywell International Inc. Zones d'arpentage au moyen d'un système radar et véhicule aérien télépiloté

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107577247B (zh) * 2014-07-30 2021-06-25 深圳市大疆创新科技有限公司 目标追踪系统及方法


Also Published As

Publication number Publication date
DE102015006233B4 (de) 2020-12-03
DE102015006233A1 (de) 2016-11-24

Similar Documents

Publication Publication Date Title
DE102015006233B4 (de) Verfahren zur Bestimmung der Flugbahn einer Fremddrohne
DE102009009896B4 (de) Verfahren und Vorrichtung zur Erfassung von Zielobjekten
US11827352B2 (en) Visual observer for unmanned aerial vehicles
EP3493107A1 (fr) Système de reconnaissance et d'observation étendues optique mobile à détection d'objets automatique et procédé de reconnaissance et d'observation étendues optiques mobiles à détection d'objets automatique
EP2702382A2 (fr) Procédé et système pour examiner une surface sous le rapport des défauts de matière
EP3819659A1 (fr) Module de communication pour composants de systèmes de défense aérienne tactique
EP3139125A1 (fr) Systeme de defense et installation de defense par drone contre les drones etrangers
DE102016008553B4 (de) System umfassend ein Kraftfahrzeug und ein unbemanntes Luftfahrzeug und Verfahren zum Erfassen von Gegebenheiten in einer Umgebung eines Kraftfahrzeugs
WO2019170649A1 (fr) Système de gestion d'informations de situation aérienne et de trafic pour véhicules aériens sans pilote et pilotés
DE102007062603B4 (de) Verfahren und Vorrichtung zur Detektion Daten sendender Fahrzeuge
DE102020112415A1 (de) Zonenbasierte landesysteme und -verfahren für unbemannte luftfahrzeuge
DE102019109127B4 (de) Drohnenbasiertes Luft- und Kollisionsüberwachungssystem
DE102014224884A1 (de) Verfahren und System zum Überwachen von Logistikeinrichtungen
DE102018008282A1 (de) Vorrichtung und Verfahren zum Erfassen von Flugobjekten
EP3373092B1 (fr) Procédé de localisation d'une défaillance d'un système
WO2011157723A1 (fr) Système et procédé d'évitement de collisions
DE102019211048A1 (de) Verfahren und Einrichtung zur automatisierbaren Personenzugangskontrolle zu einem im Freien befindlichen Anlagengelände
EP3486404A2 (fr) Système de reconnaissance et d'observation étendues optique mobile et procédé de reconnaissance et d'observation étendues optiques mobiles
DE102021110647A1 (de) Verfahren, Abfangdrohne und Abfangsystem zur Abwehr einer unerwünschten Fremddrohne
EP1732349A2 (fr) Procédé et dispositif destinés à la télélecture de données
DE102016110477A1 (de) Verfahren zum Positionieren eines insbesondere unbemannten Luftfahrzeuges mit Hilfe einer aktiven statischen Bodenstation sowie Luftfahrzeug und Bodenstation zur Durchführung des Verfahrens
EP3404443B1 (fr) Dispositif et procédé de détection d'un usager de la route
DE102022120476A1 (de) Verfahren und Aufklärungssystem zur Aufklärung eines unerlaubten Einflugs einer Fremddrohne in einen definierten Luftbereich
DE102022200355A1 (de) Verfahren, Abwehrdrohne und Abwehrsystem zur Abwehr einer Fremddrohne
DE19818426C2 (de) Verfahren zur Fernaufklärung und Zielortung

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16723019

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16723019

Country of ref document: EP

Kind code of ref document: A1