WO2010067091A1 - Remote control system and method

Remote control system and method

Info

Publication number
WO2010067091A1
Authority
WO
WIPO (PCT)
Prior art keywords
object
sensors
volume
central processor
location
Application number
PCT/GB2009/051578
Other languages
French (fr)
Inventor
John Joseph Spicer
Original Assignee
Roke Manor Research Limited
Priority to GB0822387A, granted as GB2466039B (en)
Priority to GB0822387.7
Application filed by Roke Manor Research Limited filed Critical Roke Manor Research Limited
Publication of WO2010067091A1


Classifications

    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G 5/00 Traffic control systems for aircraft, e.g. air-traffic control [ATC]
    • G08G 5/0047 Navigation or guidance aids for a single aircraft
    • G08G 5/0069 Navigation or guidance aids for a single aircraft specially adapted for an unmanned aircraft
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 1/00 Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D 1/0011 Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot associated with a remote control arrangement
    • G05D 1/0033 Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot associated with a remote control arrangement by having the operator tracking the vehicle either by direct line of sight or via one or more cameras located remotely from the vehicle
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G 5/00 Traffic control systems for aircraft, e.g. air-traffic control [ATC]
    • G08G 5/02 Automatic approach or landing aids, i.e. systems in which flight data of incoming planes are processed to provide landing data
    • G08G 5/025 Navigation or guidance aids

Abstract

An object remote control system comprises a remote imaging system having a plurality of sensors (2); a central processor (9); a user display (11); an object locator; and communication links between the sensors, the object locator and the central processor. The sensors (2) are deployed in a predetermined volume (1), and the central processor (9) derives a three dimensional representation of the volume from data received from the sensors. The object locator determines the location of an object (7) within the volume (1), and a view as seen from the determined location of the object is derived in the central processor from the three dimensional representation and shown on the display. The system further comprises a user input (12) to the central processor and a communication link (16) between the central processor and the object; commands entered at the user input are communicated to the object to move it within the volume. A corresponding method of operation is also provided.

Description

REMOTE CONTROL SYSTEM AND METHOD

This invention relates to a remote object imaging and control system, and to a method of deriving a view from an object at a location within a volume and controlling that object. The system and method have particular application to the remote control of vehicles, such as aircraft, ships or other vessels, but may also be used to control other objects remotely.

There are occasions when it is convenient to be able to see a view as seen by an object, without actually being in the same position as the object. Although this has been done in the past, for example in remote surgery using robots, such systems rely on a camera positioned on the robot, with the image transmitted, for example by a broadband link, to a location where the surgeon can move controls to direct the corresponding movement of the robot. However, there are cases where it is not practical to have cables attached to a remote object, such as undersea applications, and there are other situations where insufficient spectrum is available in public areas to send video images back wirelessly, such as in communication with aircraft or ships, or when carrying out underwater maintenance or survey work.

In accordance with a first aspect of the present invention, a method of remotely controlling an object comprises deriving a view from an object at a location within a volume by deploying a plurality of sensors in a predetermined volume; receiving sensor data from the plurality of sensors at a central processor; processing the received sensor data to generate a three dimensional representation of the volume; determining a location of an object within the volume; deriving a view of the volume as seen from the location of the object; and displaying the view; wherein the view is displayed to a remote controller; and wherein commands are sent from the remote controller to the object to move the object within the volume in response to the displayed view.

The present invention derives a three dimensional representation of a volume, determines the location of an object, and uses these in combination to display a view as seen from the location of the object. Having obtained a view as seen from the object, this can be used as the basis for controlling that object: the displayed image enables a controller to move the object remotely, for example to avoid any obstacles shown in the view. Preferably, the step of deriving the location of the object includes deriving one or more of position, velocity and attitude of the object. Preferably, the object is a vehicle.

This invention is most suited to controlling movement of vehicles, although it could be used for controlling other objects. Preferably, the vehicle is an aircraft.

This invention is particularly applicable to aircraft, but can also be used on other types of vehicle operating in constrained areas, such as for docking ships, for carrying out underwater surveys, or for maintaining underwater structures, such as oil rigs.

The location of the vehicle may be determined by conventional ground based tracking systems, such as radar, but preferably, the location of the vehicle is determined in the vehicle and communicated to the central processor.

An example of this is using data from a global navigation satellite system (GNSS) terminal on the vehicle, or a long range aid to navigation (LORAN) terminal. Preferably, the volume is an airfield and associated airspace. In order to reduce issues with spectrum allocation, preferably, the sensors communicate with the central processor via fixed links.

The sensors may be installed at fixed locations, or alternatively, the sensors are mobile sensors adapted to be redeployed within the volume. These may be combined with a communication infrastructure, allowing the mobile sensors to be plugged into a wired network.

In the case of mobile sensors, preferably, the sensors send both location and sensor data to the central processor. Preferably, real time data from the sensors is superimposed upon a previously generated representation of the volume.

Any suitable sensor may be used, but preferably, the sensors comprise video cameras, infra-red imagers, sonar or radar transceivers.

In accordance with a second aspect of the present invention, an object remote control system comprises a remote imaging system comprising a plurality of sensors; a central processor; a user display; an object locator and communication links between the sensors, the object locator and the central processor; wherein the sensors are deployed in a predetermined volume; wherein the central processor derives a three dimensional representation of the volume from data received from the sensors; wherein the object locator determines the location of an object within the volume; and wherein a view as seen from the determined location of the object is derived in the central processor from the three dimensional representation and displayed on the display; wherein the system further comprises a user input to the central processor; and a communication link between the central processor and the object; and wherein commands are input to the user input and communicated to the object, to move the object within the volume.

Preferably, the object locator derives one or more of position, velocity and attitude of the object.

Preferably, the object locator is installed on the object. Preferably, the object is a vehicle. Preferably, the vehicle is an aircraft.

Preferably, the volume is an airfield and associated airspace. Preferably, the sensors are mobile sensors.

Preferably, the mobile sensors further comprise location devices. Preferably, the sensors are omnidirectional.

Preferably, the sensors comprise video cameras, infra-red imagers, sonar sensors, or radar transceivers.

An example of a method and system according to the present invention will now be described with reference to the accompanying drawings, in which:

Figure 1 illustrates a first example of deriving a view as seen from an object in accordance with the present invention;

Figure 2 shows a control centre for use in the systems of Figs. 1 and 3;

Figure 3 illustrates a first example of a vehicle remote control system and method according to the present invention, applied to aircraft, deriving the view in accordance with the example of Fig. 1;

Figure 4 is a flow diagram illustrating the steps involved in the example of Fig. 3; and

Figure 5 illustrates an example of using the present invention for undersea surveying.

Fig. 1 illustrates a method of deriving a view in accordance with the present invention. A volume 1 is defined, for which a remote imaging system is provided. The remote imaging system comprises a plurality of sensors 2, mounted on poles 3, arranged along sides 4, 5 forming a perimeter of the volume 1. In this particular example the sensors are fixed sensors, mounted on poles along the perimeter of the volume, but different arrangements of mounting and location are equally possible. The sensors may be mobile sensors located at any position within the volume, not just along the perimeter, or, with suitable construction, the sensors may be suspended from a structure above the volume, hanging down into it.

Data received from the sensors 2 is communicated to a control centre 6, shown in more detail in Fig. 2, typically via fixed links to a data input 8. This data is processed by processor 9 and used to generate a three dimensional representation of the volume, which may be stored in a store 10. A location of an object 7 within the volume 1 is determined from an object locator, and the view as seen from the object at the determined location is extracted from the 3-D model derived from the sensor data and displayed on a display 11. The object locator may be remote from, or mounted on, the object. For example, the location of the object may be determined using additional sensors in the system which detect the object and report its position, or the object itself may include some kind of location device, such as a GNSS receiver or, for above-water applications, a LORAN terminal, and transmit its location at intervals to the control centre 6, e.g. via a wireless link.
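By way of illustration only, the following Python sketch shows how such a view might be extracted from a stored 3-D representation given the object's reported position and attitude. The point-cloud data layout (`points` as an (N, 3) float array, `colours` as an (N, 3) uint8 array), the function name and the pinhole camera model are assumptions made for this example, not details taken from the specification.

```python
import numpy as np

def view_from_object(points, colours, position, yaw_pitch_roll,
                     f=500.0, width=640, height=480):
    """Render the stored 3-D representation as seen from the object's
    location, using a simple pinhole model (x forward, y left, z up)."""
    yaw, pitch, roll = yaw_pitch_roll
    cz, sz = np.cos(yaw), np.sin(yaw)
    cy, sy = np.cos(pitch), np.sin(pitch)
    cx, sx = np.cos(roll), np.sin(roll)
    # Body-to-world rotation built from the attitude reported by the locator.
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    R = Rz @ Ry @ Rx
    # Transform world points into the object's body frame (rows @ R == R.T @ p).
    cam = (points - np.asarray(position, dtype=float)) @ R
    in_front = cam[:, 0] > 0.1                 # keep points ahead of the object
    cam, col = cam[in_front], colours[in_front]
    order = np.argsort(-cam[:, 0])             # painter's algorithm: far points first
    cam, col = cam[order], col[order]
    u = (width / 2 - f * cam[:, 1] / cam[:, 0]).astype(int)
    v = (height / 2 - f * cam[:, 2] / cam[:, 0]).astype(int)
    image = np.zeros((height, width, 3), dtype=np.uint8)
    ok = (u >= 0) & (u < width) & (v >= 0) & (v < height)
    image[v[ok], u[ok]] = col[ok]
    return image
```

Any representation that can be queried from an arbitrary viewpoint (voxel grid, mesh, point cloud) would serve equally well; a point cloud keeps the sketch short.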

Having derived the view as seen from the object, without actually having to put sensors on the object and transmit the sensor data over a wireless link from the object, it is then possible to use the displayed view as part of a remote control system. In the particular example of aircraft, such as cargo, traffic or shipping surveillance, or crop monitoring aircraft, ground based pilots are likely to be required for regulatory approval of remote controlled aircraft, whether unmanned air vehicles, unmanned autonomous systems, or piloted vehicles requiring outside assistance, when these vehicles use commercial airfields for take off and landing. Although current remotely operated vehicles can be controlled by a ground based pilot, this usually involves satellite communication links and equipment on the vehicle, which is very expensive. Even if local terrestrial radio communications were used, a problem arises: if the ground based pilot were to use video images transmitted from the aircraft, the large radio bandwidth needed to send the video back to the ground imposes a very high spectrum requirement, limiting the number of aircraft which can be controlled at any one time. An example of applying the method of the present invention is described with respect to controlling a remotely piloted aircraft, as shown in Fig. 3; however, the principles described herein may be applied to the remote control of any vehicle within a constrained volume. The aircraft may be an unmanned aircraft, or a manned aircraft where the pilot requires external assistance, such as a remote pilot taking control of landing when the aircraft pilot does not have suitable instruments for landing at night or in restricted visibility.

The defined volume 1 includes a runway 13 and airspace 14 surrounding an airfield 15. Along the perimeter of the volume, a number of sensors 2 are provided. In this example the airfield is provided with the sensors mounted on poles 3 to monitor the runway and other parts of the airfield. However, these sensors need not be only at the perimeter of the volume 1, nor mounted on fixed mountings, provided that their location can be determined and associated with any data generated by those sensors, e.g. using a co-located global positioning system receiver. Mobile sensors may be put in place for a fixed period of time to derive an initial 3-D representation, then moved within the volume to positions previously determined to be most likely to have changing circumstances requiring regular updates, so that an accurate real time view can be generated. A fixed network may be installed with connections at the desired locations to enable mobile sensors to send data back over fixed links.

The control centre 6 may be within or outside the volume 1 and is connected to the plurality of sensors 2 within the volume. The type of sensor depends upon the desired output for the remote pilot in the control centre, and a number of different types of sensor can be used. For example, where a video image is required, the sensors may be cameras mounted on the poles 3 at intervals along opposite edges 4 of the perimeter of the volume. Multiple cameras facing up, down and in different horizontal directions may be used to cover as large a part of the volume as possible, or omni-directional sensors may be used. Fixed parts of the image, such as the runway, grass and buildings, may be pre-stored, and images from the video cameras then superimposed over the stored image to show any change in the view from a particular location. Another set of cameras, or other types of sensor such as radar, may be provided to observe an aircraft 17 and to determine its position, velocity and attitude. Alternatively, the aircraft can transmit the necessary information to the control centre from its own navigation systems.
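As an illustrative sketch of superimposing live camera data over the pre-stored static parts of the view, the following assumes registered frames of equal size (H x W x 3 uint8 arrays); the array names and the change threshold are hypothetical, chosen for the example.

```python
import numpy as np

def composite_view(stored: np.ndarray, live: np.ndarray,
                   background: np.ndarray, threshold: int = 30) -> np.ndarray:
    """Overlay pixels that differ from the known background (e.g. a vehicle
    crossing the runway) onto the pre-stored representation."""
    # Per-pixel change mask against the reference background frame.
    diff = np.abs(live.astype(np.int16) - background.astype(np.int16))
    changed = diff.max(axis=2) > threshold
    out = stored.copy()
    out[changed] = live[changed]   # only moving or new objects are pasted in
    return out
```

The same masking idea extends to other modalities, for example superimposing radar or infra-red detections on a stored optical view.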

The video feed from the ground based sensors is reconstructed to generate the 3-D representation. Although video is particularly useful, other sources of an image or view may be used, such as high frequency radio, for example radar imaging, or infra-red. This may be essential in conditions of restricted visibility, such as fog or at night. The reconstruction uses data from the airfield cameras and sensors and may also use stored data. Data from each sensor is related to the sensor's position within the volume, and a full representation of the volume is built up. Where there are gaps in the basic data, for example due to a lack of sensor coverage, these can be completed by extrapolation, or from a pre-stored representation. Either the full representation may be pre-stored, or just those features which do not change frequently, such as buildings. When in use, any pre-stored representation is enhanced by real time data, so that the remote pilot can take suitable action if an obstruction appears in an area that needs to be clear for the aircraft to pass safely, such as a vehicle crossing the runway. Having determined the location and direction of travel of the object of interest, the system derives the view showing what would have been seen by a camera mounted on the aircraft, given its current position, velocity and attitude.
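The following sketch illustrates, under stated assumptions, one way of building up the representation from position-tagged sensor reports and filling coverage gaps from a pre-stored model. The voxel grid, the `prestored` array and the report format are invented for the example; the patent does not specify a data structure.

```python
import numpy as np

def fuse_volume(sensor_reports, prestored, grid_shape=(100, 100, 20)):
    """Combine position-tagged observations into one occupancy grid;
    cells with no sensor coverage fall back to the pre-stored model."""
    occupancy = np.full(grid_shape, -1, dtype=np.int8)   # -1 = no coverage yet
    for report in sensor_reports:
        # Each report's points are registered into world coordinates using
        # the sensor's own reported location, essential for mobile sensors.
        idx = np.floor(report["points"]).astype(int)     # (M, 3) voxel indices
        valid = np.all((idx >= 0) & (idx < np.array(grid_shape)), axis=1)
        occupancy[tuple(idx[valid].T)] = 1               # observed as occupied
    gaps = occupancy == -1                               # lack of sensor coverage
    occupancy[gaps] = prestored[gaps]                    # fill from stored model
    return occupancy
```

In use, the pre-stored grid would hold the infrequently changing features (buildings, runway), while each fusion pass refreshes the cells the sensors can currently see.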

In the control centre 6, as shown in Fig. 2, the received sensor data is processed in the processor 9 and displayed on a display 11 to the remote pilot. The pilot enters instructions via the user input 12 to the processor, and these are transmitted from transceiver 16 to an aircraft 17. The user input 12 may take any convenient form, such as a joystick, to which the processor responds by generating instructions to the aircraft which cause corresponding movement of the aircraft control surfaces. Alternatively, the remote pilot may enter discrete instructions to change altitude or speed, or to move onto a new heading. The aircraft may have knowledge of its own position, such as from an internal navigation system, which it transmits to the control centre 6 on the ground; alternatively, communication from the aircraft may be avoided completely by using ground based air traffic control radar to determine its location. Attitude information enables the view generated at a particular location to be the one seen in the direction of travel of the aircraft. In this example, the ground based pilot communicates with the aircraft via wireless communication to send control instructions, but wireless transmission of control data uses far less bandwidth than video, so this usage is not a significant factor in determining the spectrum requirements for the system. All data heavy communication can be done via fixed links on the ground from each camera or sensor.
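To illustrate why the control uplink is cheap in spectrum terms, a minimal sketch of a discrete command message is shown below. The 14-byte layout and field names are assumptions for the example, not a protocol from the specification; the point is simply that a command is a handful of bytes where a video downlink would need megabits per second.

```python
import struct

def encode_command(heading_deg: float, altitude_m: float,
                   speed_mps: float, aircraft_id: int) -> bytes:
    """Pack one discrete pilot command into a fixed little-endian layout."""
    return struct.pack("<Hfff", aircraft_id, heading_deg, altitude_m, speed_mps)

cmd = encode_command(heading_deg=270.0, altitude_m=450.0,
                     speed_mps=70.0, aircraft_id=17)
assert len(cmd) == 14   # 2 + 3 * 4 bytes: a tiny spectrum requirement
```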

The location of the aircraft 17 may be determined using known ground based methods, such as multilateration as described in GB2250154. The image generation from the sensor data may be carried out using various techniques, such as those applied in virtual studios, as described in GB2413720A or GB2400513B.
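As a hedged illustration of ground based multilateration, the sketch below solves for a position from time-difference-of-arrival (TDOA) measurements at known receiver sites by Gauss-Newton iteration. The receiver layout, function names and choice of solver are assumptions for the example, not details of GB2250154.

```python
import numpy as np

C = 299_792_458.0  # signal propagation speed (m/s)

def multilaterate(receivers, tdoas, guess, iterations=20):
    """Gauss-Newton solution of the TDOA equations. `receivers` is (N, 3)
    with N >= 4 sites in general position; tdoas[i] is the arrival-time
    difference between site i+1 and reference site 0."""
    x = np.asarray(guess, dtype=float)
    tdoas = np.asarray(tdoas, dtype=float)
    for _ in range(iterations):
        d = np.linalg.norm(receivers - x, axis=1)   # range to each site
        residual = (d[1:] - d[0]) - C * tdoas
        unit = (x - receivers) / d[:, None]         # d(range)/d(position)
        J = unit[1:] - unit[0]                      # Jacobian of range differences
        step, *_ = np.linalg.lstsq(J, -residual, rcond=None)
        x = x + step
    return x

# Synthetic check: recover a known emitter position from exact TDOAs.
rx = np.array([[0, 0, 0], [1000, 0, 0], [0, 1000, 0],
               [0, 0, 500], [800, 900, 100]], dtype=float)
truth = np.array([400.0, 300.0, 200.0])
d = np.linalg.norm(rx - truth, axis=1)
est = multilaterate(rx, (d[1:] - d[0]) / C, guess=[500, 500, 100])
assert np.allclose(est, truth, atol=1e-3)
```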

Fig. 4 is a flow chart of the stages involved in the method of the present invention applied to remote pilotage of a vehicle. As a vehicle comes within range of a volume defining a control area where remote pilotage is available, this is detected 20 and a signal to the control centre 6 indicates 21 whether or not the vehicle requires remote pilotage. This may occur where the vehicle is arriving from elsewhere, e.g. an aircraft waiting to land or a ship coming into dock, or where the vehicle is at a parking bay or berth, waiting to move off. If no remote pilotage is required, the control centre continues to monitor 22 for other vehicles which may require the service. If remote pilotage is required, then the control centre sets up communication 23 with the vehicle and allocates a pilot 24. If no pilot is available, the vehicle either remains in its parking bay or, if in flight, can be directed into an automated circling routine 25 outside the controlled volume, using the same sense and avoid systems which brought it to its current position.

When a pilot has been allocated 24, the pilot may send a test transmission 26 to ensure that he is able to communicate with the vehicle, and then takes over control as the vehicle reaches the volume edge. The pilot uses the display 11 in the control centre 6 to determine what obstructions may be in the way of the vehicle he is remotely piloting and provides suitable inputs 27 to move the vehicle to avoid them, whether by changing height, speed, direction or any other suitable option. The display may be of an image, taking data from cameras, or a representation, such as a radar display showing other radar targets to be avoided. Infra-red imaging may be superimposed to show otherwise invisible targets, such as deer or birds, when operating in poor visibility. At intervals, a check 28 may be made to see whether the vehicle has exited the control area. If not, the remote pilotage continues 29; once the vehicle has exited the defined control area, the pilot provides a termination communication 30 to allow the vehicle to take over control of its movement from that point.

Although most useful for aircraft, the invention may also be applied to docking ships in busy harbours, increasing the number of vessels that a pilot can deal with by avoiding the need for the pilot to join and leave each individual vessel. The steps involved are very similar to those described with respect to Fig. 4. The control commands may be sent using wireless communications without taking up too much of the available spectrum.
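The workflow of Fig. 4 can be summarised, purely as an illustrative sketch, by the small state machine below. The state names and transition flags are inventions for the example, keyed to the numbered steps in the text.

```python
from enum import Enum, auto

class PilotageState(Enum):
    MONITORING = auto()     # watching for vehicles entering the volume (20, 22)
    COMMS_SETUP = auto()    # remote pilotage requested, link being set up (21, 23)
    HOLDING = auto()        # no pilot free: parked, or circling outside (25)
    UNDER_CONTROL = auto()  # pilot allocated, test sent, inputs flowing (24, 26, 27)
    TERMINATED = auto()     # vehicle has exited; control handed back (28, 30)

def next_state(state, *, requested=False, pilot_free=False, exited=False):
    """Advance one step of the Fig. 4 workflow given the current triggers."""
    if state is PilotageState.MONITORING and requested:
        return PilotageState.COMMS_SETUP
    if state is PilotageState.COMMS_SETUP:
        return PilotageState.UNDER_CONTROL if pilot_free else PilotageState.HOLDING
    if state is PilotageState.HOLDING and pilot_free:
        return PilotageState.UNDER_CONTROL
    if state is PilotageState.UNDER_CONTROL and exited:
        return PilotageState.TERMINATED
    return state            # otherwise remain in the current stage (29)
```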

The invention may also be used for recreational remote controlled aircraft, for which the basic principles are the same, though possibly over a smaller area. In this case, the control centre 6 may superimpose specific virtual objects which the pilot must avoid whilst flying within the volume 1, as well as, or instead of, using real time images on top of the base view created from the sensor data.

Another application is the use of the system in underwater surveys, as shown in Fig. 5. The plurality of sensors may be provided as towed arrays 31 on multiple vessels 32, 33, each vessel's towed array 31 positioned in a coordinated manner, so that an image of the underwater volume which the arrays define can be built up. An untethered remote controlled underwater vehicle 34 may be operated within the volume, making use of the remotely generated view as seen from its location to enable the vehicle to be directed to move safely within the volume. Alternatively, to cover a smaller area, a single vessel may have a rig with sensors at predetermined locations which can be lowered into an area of interest; the untethered vehicle is then controlled remotely using the remotely generated view as the basis for instructions to move the vehicle and avoid objects, or reach a desired location.

Claims

1. A method of remotely controlling an object, the method comprising deriving a view from an object at a location within a volume by deploying a plurality of sensors in a predetermined volume; receiving sensor data from the plurality of sensors at a central processor; processing the received sensor data to generate a three dimensional representation of the volume; determining a location of an object within the volume; deriving a view of the volume as seen from the location of the object; and displaying the view; wherein the view is displayed to a remote controller; and wherein commands are sent from the remote controller to the object to move the object within the volume in response to the displayed view.
2. A method according to claim 1, wherein the step of deriving the location of the object includes deriving one or more of position, velocity and attitude of the object.
3. A method according to claim 1 or claim 2, wherein the object is a vehicle.
4. A method according to claim 3, wherein the vehicle is an aircraft.
5. A method according to claim 3 or claim 4, wherein the location of the vehicle is determined by the vehicle and communicated to the central processor.
6. A method according to any preceding claim, wherein the volume is an airfield and associated airspace.
7. A method according to any preceding claim, wherein the sensors communicate with the central processor via fixed links.
8. A method according to any of claims 1 to 6, wherein the sensors are mobile sensors adapted to be redeployed within the volume.
9. A method according to claim 8, wherein the sensors send both location and sensor data to the central processor.
10. A method according to any preceding claim, wherein real time data from the sensors is superimposed upon a previously generated representation of the volume.
11. A method according to any preceding claim, wherein the sensors comprise video cameras, infra-red imagers, sonar or radar transceivers.
12. An object remote control system, the system comprising a remote imaging system comprising a plurality of sensors; a central processor; a user display; an object locator and communication links between the sensors, the object locator and the central processor; wherein the sensors are deployed in a predetermined volume; wherein the central processor derives a three dimensional representation of the volume from data received from the sensors; wherein the object locator determines the location of an object within the volume; and wherein a view as seen from the determined location of the object is derived in the central processor from the three dimensional representation and displayed on the display; wherein the system further comprises a user input to the central processor; and a communication link between the central processor and the object; and wherein commands are input to the user input and communicated to the object, to move the object within the volume.
13. A system according to claim 12, wherein the object locator derives one or more of position, velocity and attitude of the object.
14. A system according to claim 12 or claim 13, wherein the object locator is installed on the object.
15. A system according to any of claims 12 to 14, wherein the object is a vehicle.
16. A system according to claim 15, wherein the vehicle is an aircraft.
17. A system according to any of claims 12 to 16, wherein the volume is an airfield and associated airspace.
18. A system according to any of claims 12 to 17, wherein the sensors are mobile sensors.
19. A system according to claim 18, wherein the mobile sensors further comprise location data sources.
20. A system according to any of claims 12 to 19, wherein the sensors are omnidirectional.
21. A system according to any of claims 12 to 20, wherein the sensors comprise video cameras, infra-red imagers, sonar sensors, or radar transceivers.
PCT/GB2009/051578, priority date 2008-12-09, filed 2009-11-20: Remote control system and method, WO2010067091A1 (en)

Priority Applications (2)

GB0822387A, priority date 2008-12-09, filed 2008-12-09: Imaging system and method (granted as GB2466039B (en))
GB0822387.7, priority date 2008-12-09

Publications (1)

WO2010067091A1 (en)

Family

ID=40289687

Family Applications (1)

PCT/GB2009/051578 (published as WO2010067091A1 (en)), priority date 2008-12-09, filed 2009-11-20: Remote control system and method

Country Status (2)

Country Link
GB (1) GB2466039B (en)
WO (1) WO2010067091A1 (en)


Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3138264B2 (en) * 1988-06-21 2001-02-26 Sony Corporation Image processing method and apparatus
GB9706839D0 (en) * 1997-04-04 1997-05-21 Orad Hi Tec Systems Ltd Graphical video systems
US7099752B1 (en) * 2003-10-27 2006-08-29 Leslie Jae Lenell Safelander
US7840317B2 (en) * 2004-08-16 2010-11-23 Matos Jeffrey A Method and system for controlling a hijacked aircraft

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1798691A2 (en) * 2003-03-14 2007-06-20 British Broadcasting Corporation Method and apparatus for generating a desired view of a scene from a selected viewpoint
US20080180523A1 (en) * 2007-01-31 2008-07-31 Stratton Kenneth L Simulation system implementing real-time machine data
WO2008112148A1 (en) * 2007-03-08 2008-09-18 Itt Manufacturing Enterprises, Inc. Augmented reality-based system and method providing status and control of unmanned vehicles

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2489829A (en) * 2011-04-08 2012-10-10 Lfk Lenkflugka Rpersysteme Gmbh Use of image processing to ascertain three-dimensional deviations of an aircraft from a trajectory towards a target
GB2489829B (en) * 2011-04-08 2015-04-01 Lfk Gmbh Process for guiding the flight of an aircraft to a predetermined target object, and flight-guidance system
US9322148B2 (en) 2014-06-16 2016-04-26 Caterpillar Inc. System and method for terrain mapping

Also Published As

Publication number Publication date
GB0822387D0 (en) 2009-01-14
GB2466039A (en) 2010-06-16
GB2466039B (en) 2013-06-12

Similar Documents

Publication Publication Date Title
US9791866B2 (en) Autonomous cargo delivery system
US10060746B2 (en) Methods and systems for determining a state of an unmanned aerial vehicle
US10496088B2 (en) System, apparatus, and method for the measurement, collection, and analysis of radio signals utilizing unmanned aerial vehicles
US20180046177A1 (en) Motion Sensing Flight Control System Based on Smart Terminal and Terminal Equipment
US9533760B1 (en) Image monitoring and display from unmanned vehicle
US20180342169A1 (en) Autonomous drone service system
US9513635B1 (en) Unmanned aerial vehicle inspection system
US9875657B2 (en) Automated un-manned air traffic control system
US9604723B2 (en) Context-based flight mode selection
US10408936B2 (en) LIDAR light fence to cue long range LIDAR of target drone
Shakhatreh et al. Unmanned aerial vehicles (UAVs): A survey on civil applications and key research challenges
US20170301242A1 (en) Flight control for flight-restricted regions
US10453348B2 (en) Unmanned aerial vehicle management
JP6235716B2 (en) Method for controlling an unmanned aerial vehicle in an environment and system for controlling an unmanned aerial vehicle in an environment
US20170336806A1 (en) Unmanned aerial vehicle electromagnetic avoidance and utilization system
CN102566581B Track-based sensing and avoidance
US20180188721A1 (en) Unmanned aerial vehicle inspection system
US20160275801A1 (en) Unmanned Aerial Systems Traffic Management
ES2688233T3 (en) Near Field Navigation System
EP3032368B1 (en) Unmanned aerial vehicle control handover planning
WO2016065623A1 (en) Systems and methods for surveillance with visual marker
US9253453B2 (en) Automatic video surveillance system and method
EP2933663A2 (en) Weather data dissemination
US8244469B2 (en) Collaborative engagement for target identification and tracking
US20180025473A1 (en) Unmanned aerial vehicle privacy controls

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 09796783

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase in:

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 09796783

Country of ref document: EP

Kind code of ref document: A1