GB2466039A - Deriving A View From An Object At A Location Within A Volume - Google Patents

Deriving A View From An Object At A Location Within A Volume

Info

Publication number
GB2466039A
Authority
GB
United Kingdom
Prior art keywords
sensors
volume
location
central processor
view
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
GB0822387A
Other versions
GB2466039B (en)
GB0822387D0 (en)
Inventor
John Joseph Spicer
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Roke Manor Research Ltd
Original Assignee
Roke Manor Research Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Roke Manor Research Ltd filed Critical Roke Manor Research Ltd
Priority to GB0822387.7A (GB2466039B)
Publication of GB0822387D0
Priority to PCT/GB2009/051578 (WO2010067091A1)
Publication of GB2466039A
Application granted
Publication of GB2466039B
Expired - Fee Related
Anticipated expiration

Classifications

    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G5/00 Traffic control systems for aircraft, e.g. air-traffic control [ATC]
    • G08G5/0047 Navigation or guidance aids for a single aircraft
    • G08G5/0069 Navigation or guidance aids for a single aircraft specially adapted for an unmanned aircraft
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 3D [Three Dimensional] image rendering
    • G06T15/10 Geometric effects
    • G06T15/20 Perspective computation
    • G06T15/205 Image-based rendering
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/0011 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement
    • G05D1/0033 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement by having the operator tracking the vehicle either by direct line of sight or via one or more cameras located remotely from the vehicle
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G5/00 Traffic control systems for aircraft, e.g. air-traffic control [ATC]
    • G08G5/02 Automatic approach or landing aids, i.e. systems in which flight data of incoming planes are processed to provide landing data
    • G08G5/025 Navigation or guidance aids

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Geometry (AREA)
  • Computing Systems (AREA)
  • Computer Graphics (AREA)
  • Automation & Control Theory (AREA)
  • Traffic Control Systems (AREA)
  • Radar Systems Or Details Thereof (AREA)

Abstract

A method of deriving a view from an object at a location within a volume 1 comprises deploying a plurality of sensors in a predetermined volume 1, receiving sensor data from the plurality of sensors 2 at a central processor 9 and processing the received sensor data to generate a three dimensional representation of the volume 1. A location of an object within the volume 1 is determined. A view of the volume as seen from the location of the object is derived and the view is displayed. The method may be applied to remote control of an aircraft during landing.

Description

IMAGING SYSTEM AND METHOD
This invention relates to a remote object imaging system and a method of deriving a view from an object at a location within a volume. The system and method have particular application for use in remote control of vehicles, such as aircraft, ships, or other vessels.
There are occasions when it is convenient to be able to see a view as seen by an object, without actually being in the same position as the object. Although this has been done in the past, for example with remote surgery using robots, such systems rely on a camera being positioned on the robot and the image being transmitted, for example by a broadband link, to a location where the surgeon is able to move controls to direct the corresponding movement in the robot. However, there are cases where it is not practical to have cables attached to a remote object, such as undersea applications, and there are other situations where there is not sufficient spectrum available in public areas to use wireless communication for sending back video images, such as in communication with aircraft or ships, or when carrying out underwater maintenance or survey works.
In accordance with a first aspect of the present invention, a method of deriving a view from an object at a location within a volume comprises deploying a plurality of sensors in a predetermined volume; receiving sensor data from the plurality of sensors at a central processor; processing the received sensor data to generate a three dimensional representation of the volume; determining a location of an object within the volume; deriving a view of the volume as seen from the location of the object; and displaying the view.
The present invention derives a three dimensional representation of a volume, determines the location of an object and uses these in combination to display a view as seen from the location of the object.
Having obtained a view, as seen from the object, this can be used as the basis for controlling that object.
In accordance with a second aspect of the present invention, a method of remotely controlling an object comprises carrying out the step of deriving a view in accordance with the method of the first aspect, displaying the view to a remote controller; and sending commands from the remote controller to the object to move the object within the volume in response to the displayed view.
The invention enables a controller to use the displayed image to move the object remotely and to avoid any obstacles shown in the view.
Preferably, the step of deriving the location of the object includes deriving one or more of position, velocity and attitude of the object.
Preferably, the object is a vehicle.
This invention is most suited to controlling movement of vehicles, although it could be used for controlling other objects.
Preferably, the vehicle is an aircraft.
This invention is particularly applicable to aircraft, but can also be used on other types of vehicle operating in constrained areas, such as for docking ships, for carrying out underwater surveys, or for maintaining underwater structures, such as oil rigs.
The location of the vehicle may be determined by conventional ground based tracking systems, such as radar, but preferably, the location of the vehicle is determined in the vehicle and communicated to the central processor.
An example of this is using data from a global navigation satellite system (GNSS) terminal on the vehicle, or a long range aid to navigation (LORAN) terminal.
Preferably, the volume is an airfield and associated airspace.
In order to reduce issues with spectrum allocation, preferably, the sensors communicate with the central processor via fixed links.
The sensors may be installed at fixed locations, or alternatively, the sensors are mobile sensors adapted to be redeployed within the volume. These may be combined with a communication infrastructure, allowing the mobile sensors to be plugged into a fixed network at the desired locations. In the case of mobile sensors, preferably, the sensors send both location and sensor data to the central processor.
Preferably, real time data from the sensors is superimposed upon a previously generated representation of the volume.
Any suitable sensor may be used, but preferably, the sensors comprise video cameras, infra-red imagers, sonar or radar transceivers.
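For illustration only, a report from such a sensor might combine location and sensor data as described above. The following is a minimal Python sketch; all field names and types are assumptions, not part of the claimed system:

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class SensorReport:
    """Hypothetical report sent by a (possibly mobile) sensor.

    Sending the sensor's own location with its data lets the central
    processor place the data correctly within the volume.
    """
    sensor_id: str
    sensor_type: str                       # e.g. "video", "infra-red", "sonar", "radar"
    position: Tuple[float, float, float]   # sensor location within the volume (x, y, z)
    timestamp: float                       # acquisition time in seconds
    payload: bytes                         # raw frame or return data for processing
```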
In accordance with a third aspect of the present invention, a remote imaging system comprises a plurality of sensors; a central processor; a user display; an object locator and communication links between the sensors, the object locator and the central processor; wherein the sensors are deployed in a predetermined volume; wherein the central processor derives a three dimensional representation of the volume from data received from the sensors; wherein the object locator determines the location of an object within the volume; and wherein a view as seen from the determined location of the object is derived in the central processor from the three dimensional representation and displayed on the display.
In accordance with a fourth aspect of the present invention, an object remote control system comprises a remote imaging system according to the third aspect; a user input to the central processor; and a communication link between the central processor and the object; wherein commands are input to the user input and communicated to the object, to move the object within the volume.
Preferably, the object locator derives one or more of position, velocity and attitude of the object.
Preferably, the object locator is installed on the object.
Preferably, the object is a vehicle.
Preferably, the vehicle is an aircraft.
Preferably, the volume is an airfield and associated airspace.
Preferably, the sensors are mobile sensors.
Preferably, the mobile sensors further comprise location devices.
Preferably, the sensors are omnidirectional.
Preferably, the sensors comprise video cameras, infra-red imagers, sonar sensors, or radar transceivers.
An example of a method and system according to the present invention will now be described with reference to the accompanying drawings, in which:
Figure 1 illustrates a first example of deriving a view as seen from an object in accordance with the present invention;
Figure 2 shows a control centre for use in the systems of Figs. 1 and 3;
Figure 3 illustrates a first example of a vehicle remote control system and method according to the present invention, applied to aircraft, deriving the view in accordance with the example of Fig. 1;
Figure 4 is a flow diagram illustrating the steps involved in using the example of Fig. 3; and
Figure 5 illustrates an example of using the present invention for undersea surveying.
Fig. 1 illustrates a method of deriving a view in accordance with the present invention. A volume 1 is defined, for which a remote imaging system is provided. The remote imaging system comprises a plurality of sensors 2, mounted on poles 3, arranged along sides 4, 5 forming a perimeter of the volume 1.
In this particular example, the sensors are fixed sensors, mounted on poles, along the perimeter of the volume, but different arrangements of mounting and location are equally possible. The sensors may be mobile sensors and they may be located at any position within the volume, not just along the perimeter, or with suitable construction, the sensors may be suspended from a structure above the volume, hanging down into the volume.
Data received from the sensors 2 is communicated to a control centre 6, shown in more detail in Fig. 2, typically via fixed links to a data input 8. This data is processed by processor 9 and used to generate a three dimensional representation of the volume. This representation may be stored in a store 10. A location of an object 7 within the volume 1 is determined and the view as seen from the object at the determined location is extracted from the 3-D model derived from the sensor data and displayed on a display 11. The location of the object may be determined using additional sensors in the system which detect the object and report its position, or the object itself may include some kind of location device, such as GNSS technology or, for above-water applications, LORAN, and transmit its location at intervals to the control centre 6, e.g. via a wireless link.
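The loop just described can be sketched end to end. All five callables here are hypothetical stand-ins (nothing in the patent names them); the point is only the order of operations:

```python
from typing import Any, Callable, Iterable

def remote_view_step(
    sensors: Iterable[Any],          # deployed sensor handles (hypothetical interface)
    build_representation: Callable,  # fuses sensor reports into a 3-D model
    locate_object: Callable,         # reports the object's location in the volume
    render_view: Callable,           # extracts the view seen from that location
    show: Callable,                  # sends the derived view to the display
) -> None:
    """One pass of the central-processor pipeline of Figs. 1 and 2 (a sketch)."""
    reports = [s.read() for s in sensors]   # sensor data arriving over fixed links
    model = build_representation(reports)   # three dimensional representation of the volume
    location = locate_object()              # e.g. a GNSS fix transmitted by the object
    show(render_view(model, location))      # view as seen from the object's location
```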
Having derived the view as seen from the object, without actually having to put sensors on the object and transmit the sensor data over wireless communication from the object, it is then possible to use the displayed view as part of a remote control system. In the particular example of aircraft, such as cargo, traffic or shipping surveillance, or crop monitoring aircraft, in order to obtain regulatory approval, ground based pilots are likely to be required for remote controlled aircraft, whether unmanned air vehicles, unmanned autonomous systems, or piloted vehicles requiring outside assistance, when these vehicles use commercial airfields for take-off and landing.
Although current remotely operated vehicles can be controlled by a ground based pilot, this usually involves the use of satellite communication links and equipment on the vehicle, which is very expensive. Even if local terrestrial radio communications were used, a problem arises: if the ground based pilot were to use video images transmitted from the aircraft, the large radio bandwidth needed to transmit the video back to the ground creates a very high spectrum requirement, which limits the number of aircraft that can be controlled at any one time.
An example of applying the method of the present invention is described with respect to controlling a remotely piloted aircraft, as shown in Fig. 3; however, the principles described herein may be applied to the remote control of any vehicle within a constrained volume. The aircraft may be an unmanned aircraft, or a manned aircraft where the pilot requires external assistance, such as needing a remote pilot to take control of landing when the aircraft pilot does not have suitable instruments for landing at night or in restricted visibility.
The defined volume 1 includes a runway 13 and airspace 14 surrounding an airfield 15. Along the perimeter of the volume, a number of sensors 2 are provided. In this example the airfield is provided with the sensors mounted on poles 3 to monitor the runway and other parts of the airfield. However, these sensors do not need to be only at the perimeter of the volume 1, nor do they need to be mounted on fixed mountings, provided that their location can be determined and associated with any data generated by those sensors, e.g. using a co-located global positioning system. Mobile sensors may be put in place for a fixed period of time to derive an initial 3-D representation, then moved within the volume to be positioned at locations previously determined to be most likely to have changing circumstances that require regular updates to enable an accurate real time view to be generated. A fixed network may be installed with connections at the desired locations to enable mobile sensors to send data back over fixed links.
The control centre 6 may be within, or outside, the volume 1 and is connected to the plurality of sensors 2 within the volume. The type of sensor depends upon the desired output for the remote pilot in the control centre and a number of different types of sensor can be used. For example, where a video image is required, the sensors may be cameras, mounted on the poles 3 at intervals along opposite edges 4 of the perimeter of the volume. Multiple cameras facing up, down and in different horizontal directions may be used to cover as large a part of the volume as possible, or omni-directional sensors may be used. Fixed parts of the image, such as the runway, grass, buildings etc., may be pre-stored and images from the video cameras then superimposed over the stored image to show any change in the view from a particular location. Another set of cameras, or other types of sensors, e.g. radar, may be provided to observe an aircraft 17 and to determine its position, velocity and attitude. Alternatively, the aircraft can transmit the necessary information to the control centre from the aircraft's own navigation systems.
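By way of a sketch, superimposing live camera data over the pre-stored fixed parts of the scene could be done with a simple per-pixel change mask. The numpy representation (H x W x 3 uint8 frames) and the threshold are illustrative assumptions:

```python
import numpy as np

def superimpose(stored: np.ndarray, live: np.ndarray, threshold: int = 25) -> np.ndarray:
    """Overlay live pixels onto a pre-stored background where they differ.

    `stored` holds the fixed parts of the scene (runway, grass, buildings);
    `live` is the current frame from the same viewpoint. Pixels whose
    difference exceeds `threshold` are treated as changes, e.g. a vehicle
    crossing the runway, and replace the stored pixels in the output.
    """
    diff = np.abs(live.astype(np.int16) - stored.astype(np.int16))
    changed = diff.max(axis=-1) > threshold   # per-pixel change mask (H x W)
    out = stored.copy()
    out[changed] = live[changed]              # superimpose only the changed regions
    return out
```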
The video feed from the ground based sensors is reconstructed to generate the 3-D representation. Although video is particularly useful, other sources of an image, or view, may be used, such as high frequency radio, for example radar imaging, or infra-red. This might be essential in conditions of restricted visibility, such as in fog or at night. The reconstruction uses data from the airfield cameras and sensors and may also use stored data. Data from each sensor is related to that sensor's position within the volume and a full representation of the volume is built up. Where there are gaps in the basic data, for example due to a lack of sensor coverage, these can be completed by extrapolation, or from a pre-stored representation. Either the full representation may be pre-stored, or just those features which do not change frequently, such as buildings. When in use, any pre-stored representation is enhanced by real time data, so that the remote pilot can take suitable action if an obstruction appears in an area that needs to be clear for the aircraft to pass safely, such as a vehicle crossing the runway. Having determined the location and direction of travel of the object of interest, the view is derived showing what would have been seen by a camera mounted on the aircraft, given its current position, velocity and attitude.
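As a sketch of that final step: placing a synthetic camera at the object's position with its attitude gives a world-to-camera transform from which the view can be rendered out of the 3-D representation. The yaw-pitch-roll convention below is an assumption for illustration; the patent does not prescribe one:

```python
import numpy as np

def view_matrix(position, yaw, pitch, roll):
    """4x4 world-to-camera transform for a camera 'mounted on' the aircraft.

    `position` is the object's location in the volume; yaw, pitch and roll
    (radians) encode its attitude. Points of the 3-D representation can then
    be transformed by this matrix and projected to synthesise the view.
    """
    cy, sy = np.cos(yaw), np.sin(yaw)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cr, sr = np.cos(roll), np.sin(roll)
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])   # yaw about z
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])   # pitch about y
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])   # roll about x
    R = Rz @ Ry @ Rx                       # body-to-world rotation
    t = np.asarray(position, dtype=float)
    M = np.eye(4)
    M[:3, :3] = R.T                        # invert the rigid transform:
    M[:3, 3] = -R.T @ t                    # world point -> camera coordinates
    return M
```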
In the control centre 6, as shown in Fig. 2, the received sensor data is processed in the processor 9 and displayed on a display 11 to the remote pilot. Based on the pilot's skill, he is able to enter instructions, via the user input 12 to the processor, which are transmitted from transceiver 16 to an aircraft 17. The user input 12 may take any convenient form, such as a joystick, to which the processor responds by generating instructions to the aircraft which will cause corresponding movement of the aircraft control surfaces. Alternatively, the remote pilot may operate by entering discrete instructions to change altitude or speed or to move onto a new heading.
The aircraft may have knowledge of its own position which it transmits to the control centre 6 on the ground, such as from an internal navigation system, or communication with the ground may be avoided completely by using ground based air traffic control radar to determine the location of the aircraft. Attitude information enables the view which is generated at a particular location to be the one as seen in the direction of travel of the aircraft. In this example, the ground based pilot communicates with the aircraft via wireless communication to send control instructions, but wireless transmission of control data uses far less bandwidth than video, so this usage is not a significant factor in determining the spectrum requirements for the system. All data-heavy communication can be done via fixed links on the ground from each camera or sensor.
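To make that asymmetry concrete, a back-of-envelope comparison with purely illustrative rates (assumed figures, not measurements from the patent):

```python
# Assumed figures for illustration only.
video_bps = 2_000_000    # ~2 Mbit/s: one modest compressed video downlink per aircraft
control_bps = 2_000      # ~2 kbit/s: periodic discrete control instructions

ratio = video_bps / control_bps
print(f"a control link needs roughly 1/{ratio:.0f} of the video bandwidth")
# -> a control link needs roughly 1/1000 of the video bandwidth
```

On such figures, moving the imagery onto ground-based fixed links leaves the radio spectrum carrying only the low-rate command traffic, which is why this usage is not a significant factor in the system's spectrum requirements.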
The location of the aircraft 17 may be determined using known ground based methods, such as multilateration as described in GB2250154. The image generation from the sensor data may be carried out using various techniques, such as those applied in virtual studios, as described in GB2413720 A or GB2400513 B.

Fig. 4 is a flow chart of the stages involved in the method of the present invention applied to remote pilotage of a vehicle. As a vehicle comes within range of a volume defining a control area where remote pilotage is available, this is detected 20 and a signal to the control centre 6 indicates 21 whether or not the vehicle requires remote pilotage. This may occur where the vehicle is arriving from elsewhere, e.g. an aircraft waiting to land, or a ship coming into dock, or where the vehicle is at a parking bay, or berth, waiting to move off. If no remote pilotage is required, the control centre continues to monitor 22 for other vehicles which may require the service. If remote pilotage is required, then the control centre sets up communication 23 with the vehicle and allocates a pilot 24. If for some reason no pilot is available, then the vehicle either remains in its parking bay, or if in flight, can be directed into an automated circling routine 25, outside the controlled volume, using the same sense and avoid systems which have brought it to its current position.
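The stages just described can be paraphrased as a small state machine. The state names and transition conditions below are an illustrative reading of steps 20 to 30, not code from the patent:

```python
from enum import Enum, auto

class PilotageState(Enum):
    MONITORING = auto()   # watching for vehicles entering the control area (20, 22)
    REQUESTED = auto()    # vehicle has signalled that it needs remote pilotage (21)
    HOLDING = auto()      # no pilot free: parked, or circling outside the volume (25)
    CONTROLLED = auto()   # pilot allocated, comms tested, inputs flowing (23, 24, 26, 27)
    RELEASED = auto()     # vehicle has exited; control handed back (28, 29, 30)

def next_state(state, *, needs_pilotage=False, pilot_free=False, exited=False):
    """Illustrative transition function for the remote pilotage flow of Fig. 4."""
    if state is PilotageState.MONITORING and needs_pilotage:
        return PilotageState.REQUESTED
    if state is PilotageState.REQUESTED:
        return PilotageState.CONTROLLED if pilot_free else PilotageState.HOLDING
    if state is PilotageState.HOLDING and pilot_free:
        return PilotageState.CONTROLLED
    if state is PilotageState.CONTROLLED and exited:
        return PilotageState.RELEASED
    return state
```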
When a pilot has been allocated 24, the pilot may send a test transmission 26 to ensure that he is able to communicate with the vehicle and then takes over control as the vehicle reaches the volume edge. The pilot uses the display 11 in the control centre 6 to determine what obstructions may be in the way of the vehicle that he is remotely piloting and provides suitable inputs 27 to move the vehicle to avoid these, whether by changing height, speed, direction or any other suitable option. The display may be of an image, taking data from cameras, or a representation, such as a radar display showing other radar targets to be avoided. Infra-red imaging may be superimposed to show otherwise invisible targets such as deer or birds, when operating in poor visibility.
At intervals, a check 28 may be made to see if the vehicle has exited the control area.
If not, then the remote pilotage continues 29, but once the vehicle has exited the defined control area, then the pilot provides a termination communication 30 to allow the vehicle to take over control of its movement from that point.
Although most useful in dealing with aircraft, the invention may also be applied to docking ships in busy harbours, so increasing the number of vessels that a pilot can deal with, by avoiding the pilot having to join and leave each individual one. The steps involved are very similar to those described with respect to Fig.4. The control commands may be sent using wireless communications, without taking up too much of the available spectrum.
The invention may also be used for recreational remote controlled aircraft, for which the basic principles are the same, though possibly over a smaller area. In this case, the control centre 6 may superimpose specific virtual objects which the pilot must avoid, whilst flying within the volume 1, as well as, or instead of, using real time images on top of the base view created from the sensor data.
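A sketch of that virtual overlay, assuming the derived view is an H x W x 3 array and each virtual object has already been rendered to a boolean visibility mask for the current viewpoint (both assumptions for illustration):

```python
import numpy as np

def add_virtual_objects(view: np.ndarray, masks, colour=(255, 0, 0)) -> np.ndarray:
    """Paint virtual objects over a derived view so the pilot can avoid them.

    `masks` is an iterable of boolean H x W arrays, one per virtual object,
    marking the pixels that object would occupy from the current viewpoint.
    """
    out = view.copy()
    for mask in masks:
        out[mask] = colour   # draw the virtual object into the displayed view
    return out
```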
Another application is the use of the system in underwater surveys, such as shown in Fig. 5. The plurality of sensors may be provided as towed arrays 31, with multiple vessels 32, 33, each with its own towed array 31, positioned in a co-ordinated manner, so that an image of the underwater volume which the arrays define can be built up. An untethered remote controlled underwater vehicle 34 may be operated within the volume, making use of the remotely generated view as seen from its location to enable the vehicle to be directed to move safely within the volume.
Alternatively, to cover a smaller area, a single vessel may have a rig with sensors at predetermined locations which can be lowered into an area of interest; the untethered vehicle is then controlled remotely using the remotely generated view as the basis for instructions to move the vehicle and avoid objects, or reach a desired location.

Claims (23)

  1. A method of deriving a view from an object at a location within a volume, the method comprising deploying a plurality of sensors in a predetermined volume; receiving sensor data from the plurality of sensors at a central processor; processing the received sensor data to generate a three dimensional representation of the volume; determining a location of an object within the volume; deriving a view of the volume as seen from the location of the object; and displaying the view.
  2. A method of remotely controlling an object, the method comprising carrying out the step of deriving a view in accordance with claim 1, displaying the view to a remote controller; and sending commands from the remote controller to the object to move the object within the volume in response to the displayed view.
  3. A method according to claim 1 or claim 2, wherein the step of deriving the location of the object includes deriving one or more of position, velocity and attitude of the object.
  4. A method according to any preceding claim, wherein the object is a vehicle.
  5. A method according to claim 4, wherein the vehicle is an aircraft.
  6. A method according to claim 4 or claim 5, wherein the location of the vehicle is determined by the vehicle and communicated to the central processor.
  7. A method according to any preceding claim, wherein the volume is an airfield and associated airspace.
  8. A method according to any preceding claim, wherein the sensors communicate with the central processor via fixed links.
  9. A method according to any of claims 1 to 7, wherein the sensors are mobile sensors adapted to be redeployed within the volume.
  10. A method according to claim 9, wherein the sensors send both location and sensor data to the central processor.
  11. A method according to any preceding claim, wherein real time data from the sensors is superimposed upon a previously generated representation of the volume.
  12. A method according to any preceding claim, wherein the sensors comprise video cameras, infra-red imagers, sonar or radar transceivers.
  13. A remote imaging system, the system comprising a plurality of sensors; a central processor; a user display; an object locator and communication links between the sensors, the object locator and the central processor; wherein the sensors are deployed in a predetermined volume; wherein the central processor derives a three dimensional representation of the volume from data received from the sensors; wherein the object locator determines the location of an object within the volume; and wherein a view as seen from the determined location of the object is derived in the central processor from the three dimensional representation and displayed on the display.
  14. An object remote control system, the system comprising a remote imaging system according to claim 13; a user input to the central processor; and a communication link between the central processor and the object; wherein commands are input to the user input and communicated to the object, to move the object within the volume.
  15. A system according to claim 13 or claim 14, wherein the object locator derives one or more of position, velocity and attitude of the object.
  16. A system according to any of claims 13 to 15, wherein the object locator is installed on the object.
  17. A system according to any of claims 13 to 16, wherein the object is a vehicle.
  18. A system according to any of claims 13 to 17, wherein the vehicle is an aircraft.
  19. A system according to any of claims 13 to 18, wherein the volume is an airfield and associated airspace.
  20. A system according to any of claims 13 to 19, wherein the sensors are mobile sensors.
  21. A system according to claim 20, wherein the mobile sensors further comprise location data sources.
  22. A system according to any of claims 13 to 21, wherein the sensors are omnidirectional.
  23. A system according to any of claims 13 to 22, wherein the sensors comprise video cameras, infra-red imagers, sonar sensors, or radar transceivers.

Amendments to the claims have been filed as follows

CLAIMS

  1. A method of remotely controlling an object, the method comprising deriving a view from an object at a location within a volume by deploying a plurality of sensors in a predetermined volume; receiving sensor data from the plurality of sensors at a central processor; processing the received sensor data to generate a three dimensional representation of the volume; determining a location of an object within the volume; deriving a view of the volume as seen from the location of the object; and displaying the view; wherein the view is displayed to a remote controller; and wherein commands are sent from the remote controller to the object to move the object within the volume in response to the displayed view.
  2. A method according to claim 1, wherein the step of deriving the location of the object includes deriving one or more of position, velocity and attitude of the object.
  3. A method according to claim 1 or claim 2, wherein the object is a vehicle.
  4. A method according to claim 3, wherein the vehicle is an aircraft.
  5. A method according to claim 3 or claim 4, wherein the location of the vehicle is determined by the vehicle and communicated to the central processor.
  6. A method according to any preceding claim, wherein the volume is an airfield and associated airspace.
  7. A method according to any preceding claim, wherein the sensors communicate with the central processor via fixed links.
  8. A method according to any of claims 1 to 6, wherein the sensors are mobile sensors adapted to be redeployed within the volume.
  9. A method according to claim 8, wherein the sensors send both location and sensor data to the central processor.
  10. A method according to any preceding claim, wherein real time data from the sensors is superimposed upon a previously generated representation of the volume.
  11. A method according to any preceding claim, wherein the sensors comprise video cameras, infra-red imagers, sonar or radar transceivers.
  12. An object remote control system, the system comprising a remote imaging system comprising a plurality of sensors; a central processor; a user display; an object locator and communication links between the sensors, the object locator and the central processor; wherein the sensors are deployed in a predetermined volume; wherein the central processor derives a three dimensional representation of the volume from data received from the sensors; wherein the object locator determines the location of an object within the volume; and wherein a view as seen from the determined location of the object is derived in the central processor from the three dimensional representation and displayed on the display; wherein the system further comprises a user input to the central processor; and a communication link between the central processor and the object; and wherein commands are input to the user input and communicated to the object, to move the object within the volume.
  13. A system according to claim 12, wherein the object locator derives one or more of position, velocity and attitude of the object.
  14. A system according to claim 12 or claim 13, wherein the object locator is installed on the object.
  15. A system according to any of claims 12 to 14, wherein the object is a vehicle.
  16. A system according to any of claims 12 to 15, wherein the vehicle is an aircraft.
  17. A system according to any of claims 12 to 16, wherein the volume is an airfield and associated airspace.
  18. A system according to any of claims 12 to 17, wherein the sensors are mobile sensors.
  19. A system according to claim 18, wherein the mobile sensors further comprise location data sources.
  20. A system according to any of claims 12 to 19, wherein the sensors are omnidirectional.
  21. A system according to any of claims 12 to 20, wherein the sensors comprise video cameras, infra-red imagers, sonar sensors, or radar transceivers.
GB0822387.7A 2008-12-09 2008-12-09 Imaging system and method Expired - Fee Related GB2466039B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
GB0822387.7A GB2466039B (en) 2008-12-09 2008-12-09 Imaging system and method
PCT/GB2009/051578 WO2010067091A1 (en) 2008-12-09 2009-11-20 Remote control system and method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB0822387.7A GB2466039B (en) 2008-12-09 2008-12-09 Imaging system and method

Publications (3)

Publication Number Publication Date
GB0822387D0 GB0822387D0 (en) 2009-01-14
GB2466039A true GB2466039A (en) 2010-06-16
GB2466039B GB2466039B (en) 2013-06-12

Family

ID=40289687

Family Applications (1)

Application Number Title Priority Date Filing Date
GB0822387.7A Expired - Fee Related GB2466039B (en) 2008-12-09 2008-12-09 Imaging system and method

Country Status (2)

Country Link
GB (1) GB2466039B (en)
WO (1) WO2010067091A1 (en)


Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102011016521B4 (en) * 2011-04-08 2020-10-15 Mbda Deutschland Gmbh Method for flight guidance of an aircraft to a specified target object and flight guidance system
US9322148B2 (en) 2014-06-16 2016-04-26 Caterpillar Inc. System and method for terrain mapping
WO2021251971A1 (en) * 2020-06-11 2021-12-16 Darmo Technologies Corp Surrogate pilot and salvage master

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2221118A (en) * 1988-06-21 1990-01-24 Sony Corp Image transformation apparatus
WO1998046029A1 (en) * 1997-04-04 1998-10-15 Orad Hi-Tec Systems Limited Graphical video systems
GB2413720A (en) * 2003-03-14 2005-11-02 British Broadcasting Corp Modifying estimated positions in images of real scenes
US20060032978A1 (en) * 2004-08-16 2006-02-16 Matos Jeffrey A Method and system for controlling a hijacked aircraft

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7099752B1 (en) * 2003-10-27 2006-08-29 Leslie Jae Lenell Safelander
US7920071B2 (en) * 2006-05-26 2011-04-05 Itt Manufacturing Enterprises, Inc. Augmented reality-based system and method providing status and control of unmanned vehicles
US8139108B2 (en) * 2007-01-31 2012-03-20 Caterpillar Inc. Simulation system implementing real-time machine data

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2221118A (en) * 1988-06-21 1990-01-24 Sony Corp Image transformation apparatus
WO1998046029A1 (en) * 1997-04-04 1998-10-15 Orad Hi-Tec Systems Limited Graphical video systems
GB2413720A (en) * 2003-03-14 2005-11-02 British Broadcasting Corp Modifying estimated positions in images of real scenes
EP1798691A2 (en) * 2003-03-14 2007-06-20 British Broadcasting Corporation Method and apparatus for generating a desired view of a scene from a selected viewpoint
US20060032978A1 (en) * 2004-08-16 2006-02-16 Matos Jeffrey A Method and system for controlling a hijacked aircraft

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11431990B2 (en) 2015-06-04 2022-08-30 Thales Holdings Uk Plc Video compression with increased fidelity near horizon

Also Published As

Publication number Publication date
GB2466039B (en) 2013-06-12
WO2010067091A1 (en) 2010-06-17
GB0822387D0 (en) 2009-01-14

Similar Documents

Publication Publication Date Title
US11834173B2 (en) Anchored aerial countermeasures for rapid deployment and neutralizing of target aerial vehicles
CN109923492B (en) Flight path determination
US11415689B2 (en) Search and rescue UAV system and method
EP2177966B1 (en) Systems and methods for unmanned aerial vehicle navigation
CN112335190B (en) Radio link coverage map and impairment system and method
US20150204974A1 (en) System for mapping and tracking ground targets
US10228691B1 (en) Augmented radar camera view for remotely operated aerial vehicles
EP3333591B1 (en) Aircraft radar system for bird and bat strike avoidance
EP3398093A1 (en) Construction and update of elevation maps
US11741702B2 (en) Automatic safe-landing-site selection for unmanned aerial systems
WO2010067091A1 (en) Remote control system and method
Mitchell et al. Testing and Evaluation of UTM Systems in a BVLOS Environment
CN111819610A (en) Air situation information and traffic management system for unmanned aerial vehicles and manned aircraft
Minwalla et al. Experimental evaluation of PICAS: An electro-optical array for non-cooperative collision sensing on unmanned aircraft systems
KR102183415B1 (en) System for landing indoor precision of drone and method thereof
Geister et al. Flight testing of optimal remotely-piloted-aircraft-system scan patterns
AU2001100302A4 (en) System and method for electric power transmission line inspection
WO2023189534A1 (en) Unmanned mobile object, information processing method, and computer program
US20240248477A1 (en) Multi-drone beyond visual line of sight (bvlos) operation
Miccinesi et al. W-band Radar aboard of Unmanned Aerial System for Wire Strike Avoidance
JP2024046652A (en) Route determination system, route determination method, and system program
JP2020057055A (en) Air traffic control information processing system

Legal Events

Date Code Title Description
PCNP Patent ceased through non-payment of renewal fee

Effective date: 20151209