GB2588893A - Aerial vehicle detection - Google Patents

Aerial vehicle detection

Info

Publication number
GB2588893A
GB2588893A
Authority
GB
United Kingdom
Prior art keywords
image
aircraft
aerial vehicle
image sensor
captured
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
GB1915388.1A
Other versions
GB201915388D0 (en)
Inventor
Tulloch William
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Airbus Operations Ltd
Original Assignee
Airbus Operations Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Airbus Operations Ltd filed Critical Airbus Operations Ltd
Priority to GB1915388.1A priority Critical patent/GB2588893A/en
Publication of GB201915388D0 publication Critical patent/GB201915388D0/en
Priority to PCT/EP2020/079308 priority patent/WO2021078663A1/en
Publication of GB2588893A publication Critical patent/GB2588893A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/10Terrestrial scenes
    • G06V20/13Satellite images
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/10Terrestrial scenes
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64DEQUIPMENT FOR FITTING IN OR TO AIRCRAFT; FLIGHT SUITS; PARACHUTES; ARRANGEMENTS OR MOUNTING OF POWER PLANTS OR PROPULSION TRANSMISSIONS IN AIRCRAFT
    • B64D45/00Aircraft indicators or protectors not otherwise provided for
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64DEQUIPMENT FOR FITTING IN OR TO AIRCRAFT; FLIGHT SUITS; PARACHUTES; ARRANGEMENTS OR MOUNTING OF POWER PLANTS OR PROPULSION TRANSMISSIONS IN AIRCRAFT
    • B64D47/00Equipment not otherwise provided for
    • B64D47/08Arrangements of cameras
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C11/00Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • G01C11/04Interpretation of pictures
    • G01C11/06Interpretation of pictures by comparison of two or more pictures of the same area
    • G01C11/08Interpretation of pictures by comparison of two or more pictures of the same area the pictures not being supported in the same relative position as when they were taken
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/04Systems determining the presence of a target
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06Systems determining position data of a target
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/20Analysis of motion
    • G06T7/292Multi-camera tracking
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/52Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30248Vehicle exterior or interior
    • G06T2207/30252Vehicle exterior; Vicinity of vehicle
    • G06T2207/30261Obstacle

Abstract

Method of detecting unmanned aerial vehicles (UAVs) 130 proximate to an aircraft 102 in flight, comprising: receiving a first image captured by a first aircraft-mounted image sensor 114; determining if a UAV candidate is present; receiving a second image captured by a second aircraft-mounted image sensor 134 having a field of view which overlaps the first camera's field of view; determining if the UAV is present; and generating an alert if the UAV is present in both the first and second captured images. The UAV may be identified by: a trained classifier; or detecting the relative motion of the UAV candidate. Processing of the second image may only be carried out if a UAV candidate is present in the first image. A ground-based image sensor 162 may be used to indicate that a UAV is present. The location of the UAV may be triangulated using the first and second images. The image sensors may be 360° cameras.

Description

Intellectual Property Office Application No. GB1915388.1 RTM Date: 15 April 2020. The following terms are registered trademarks and should be read as such wherever they occur in this document: Bluetooth, WiFi, ZigBee, Cellular, Z-Wave, EnOcean. Intellectual Property Office is an operating name of the Patent Office. www.gov.uk/ipo
AERIAL VEHICLE DETECTION
TECHNICAL FIELD
[0001] The present invention relates to detection of an aerial vehicle. Particularly, although not exclusively, the present invention relates to the detection of an aerial vehicle by an aircraft.
BACKGROUND
[0002] Commercial unmanned aerial vehicles (UAVs) are now widely available and affordable. UAVs pose a potential threat to aircraft and aircraft operations.
SUMMARY
[0003] A first aspect of the present invention provides a method of detecting the presence of an external aerial vehicle in the vicinity of an aircraft in flight. The method comprises receiving image data representing a first image captured by a first aircraft-mounted image sensor having a first field of view and processing the image data to determine whether an external aerial vehicle candidate is present in a target space of the first captured image; receiving image data representing a second image captured by a second aircraft-mounted image sensor having a second field of view, which encompasses the target space, and processing the image data to determine whether the external aerial vehicle candidate is present in the second captured image; and generating a signal indicating that an external aerial vehicle is determined to be present in the vicinity of the aircraft if the external aerial vehicle candidate is determined to be present in both the first captured image and in the second captured image. With this method an improved set of data for the authorities is generated. This can lead to a more efficient management of the airspace in the vicinity of the detected UAV and reduce the risks that UAVs pose.
[0004] Optionally, the images are processed to identify the presence of an external aerial vehicle candidate by comparing the images to one or more stored representations of existing aerial vehicles. Furthermore, the one or more stored representations of aerial vehicles is determined by a classifier which is trained to recognise different types of aerial vehicles using supervised training procedures based on images from a library of aerial vehicle images. A number of libraries already exist and therefore this may reduce the time and complexity of training the image processor.
[0005] Optionally, the image data is captured using a 360° image sensor. This may provide an increased field of vision compared to regular image sensors and may result in a larger area of the surface of the aircraft captured.
[0006] Optionally, the method may comprise receiving an indication that the external aerial vehicle candidate is present based on an external image captured by a ground-based image sensor. A ground-based image sensor will provide additional verification that the UAV is present in the vicinity of the aircraft in flight.
[0007] Optionally, processing the image data to determine whether the external aerial vehicle candidate is present in the second captured image is triggered in response to a determination that an external aerial vehicle candidate is present in a target space of the first captured image. This may provide the advantage of minimising the image sensor processing power and associated resources used, unless an external aerial vehicle is suspected or detected by another image sensor.
[0008] Optionally, a location of the external aerial vehicle is triangulated using the first and second captured images. Beneficially, this provides location data about the external aerial vehicle which can be used to help minimise the risk the external aerial vehicle poses.
[0009] A second aspect of the present invention provides a machine-readable storage medium comprising instructions executable by a processor to implement the method according to the first aspect of the present invention.
[0010] A third aspect of the present invention provides a system for detecting the presence of an external aerial vehicle in the vicinity of an aircraft in flight. The system comprises a first image sensor device having a first field of view to capture a first image comprising an external aerial vehicle candidate in a target space of the first image that is in the vicinity of an aircraft; a second image sensor device having a second field of view, which encompasses the target space, to capture a second image comprising the external aerial vehicle candidate; and a processor to generate a signal indicating that an external aerial vehicle is determined to be present in the vicinity of the aircraft if the external aerial vehicle candidate is determined to be present in both the first captured image and in the second captured image. Beneficially, an improved set of data for the authorities is generated with this system. This can lead to a more efficient management of the airspace in the vicinity of the detected UAV and reduce the risks that UAVs pose.
[0011] Optionally, at least one image sensor is aircraft mounted. Advantageously, external aerial vehicles in the vicinity of the aircraft are detected with the use of on-board image sensors. Beneficially, the fields of view of the image sensors may be outwardly facing from the aircraft.
[0012] Optionally, at least one image sensor is ground mounted. Such an arrangement provides added protection against external aerial vehicles that are spotted near to an airfield or other ground-based locations.
[0013] A fourth aspect of the present invention is an aircraft comprising the system according to the third aspect of the present invention.
[0014] A fifth aspect of the present invention is an aircraft comprising a processor and stored program code, and at least a first image sensor and a second image sensor, to perform the method of the first aspect of the present invention.
BRIEF DESCRIPTION OF THE DRAWINGS
[0015] Embodiments of the invention will now be described, by way of example only, with reference to the accompanying drawings, in which:
[0016] Figure 1A is an illustrative plan view of an aircraft, according to an example;
[0017] Figure 1B is an illustrative side elevation of an aircraft, according to an example;
[0018] Figure 2 is an illustrative side elevation of an aircraft in flight, according to an example;
[0019] Figure 3 is another illustrative side elevation of an aircraft in flight, according to an example;
[0020] Figure 4 illustrates two overhead images of a scene superimposed on one another, according to an example;
[0021] Figure 5 is an illustrative view of a scenario, according to an example; and
[0022] Figure 6 is a process flow chart of a method, according to an example.
DETAILED DESCRIPTION
[0023] The present invention takes into account that UAVs are readily available for anyone to purchase and that there is little guidance or few rules relating to their ownership or use. Where rules exist, they may not be internationally recognised or applied. There have been reported incidents involving commercial aircraft and suspected UAVs, which have resulted in the shutdown of major airports. These are extremely disruptive incidents and come at a large cost to flight operators and flight passengers alike. There have even been incidents where the presence of a UAV has been reported, causing subsequent disruption, without the presence even being verified. Such is the seriousness of the threat UAVs pose that a mere alleged sighting can ground aircraft for long periods of time.
[0024] Figure 1A illustrates a plan view of an aircraft 102 in flight and Figure 1B illustrates a side elevation of the aircraft 102 in flight. The figures show image sensors mounted at various positions on the aircraft. An image sensor 104, 107 is mounted at a position on the leading edge of each horizontal stabiliser and another image sensor 106 is mounted at a position at the top of the vertical stabiliser. Another image sensor 108, 116 is mounted on the leading edge of each wing tip. An image sensor 110, 118 is mounted on the underside of each wing and an image sensor 112, 119 is mounted atop each engine. There is also an image sensor 114 mounted on the underside of the fuselage toward the cockpit. The image sensor 114 may be mounted on the centre line. An additional image sensor 128 is mounted on the top of the fuselage above the cabin area.
[0025] As used herein, an image sensor is any kind of device that is able to capture an image. The device may operate in colour or monochrome, and may operate in the visible or near-IR (or IR) regions of the electromagnetic spectrum. Such a device typically captures and stores images digitally, and is controllable to communicate captured image data to a local or remote processor for image processing. Known imaging sensors, for example in digital cameras that are adapted for use in adverse (i.e. in-flight) conditions, are suitable for use in examples herein.
[0026] The plurality of image sensors may be controlled by one or more processors (not pictured). The processor, or processors, may be mounted within the fuselage of the aircraft 102. In some examples, each image sensor may include a co-located processor that performs at least some control and/or image processing. In other examples, the image sensors may be controlled centrally. The image sensors may be powered by local power connections taken from the aircraft power network. Control signals and image data may be communicated to and from image sensors via wired or wireless connections. In some examples, the processor(s) are arranged to process images and/or video captured by the plurality of image sensors to identify external aerial vehicle candidates, such as UAVs.
[0027] At least some of the image sensors may have a wide or a panoramic field of view, for example greater than 160° horizontally and/or greater than 75° vertically. What each image sensor can see for any given field of view is of course dictated by where the image sensor is mounted on the aircraft and in which direction it is directed. At least one of the image sensors may have a 360° field of view horizontally and 90° or greater vertically.
Image sensors may be fixed, for example as applied in catadioptric cameras, and derive their wide fields of view from fixed elements, such as lenses or mirrors. Other image sensors may be movable, such as rotatable, to achieve their fields of view. In any case the image sensors may be interconnected and be in communication with one another either directly or via a central system. Connectivity may use a wireless protocol, such as an Internet of Things (IoT) protocol such as Bluetooth, WiFi, ZigBee, MQTT IoT, CoAP, DSS, NFC, Cellular, AMQP, RFID, Z-Wave, EnOcean and the like.
[0028] According to examples, the fields of view of the various image sensors mounted on the aircraft 102 overlap, giving multiple viewpoints of a vicinity or space around the aircraft 102. According to the present example, the fields of view are arranged to be outwardly-looking away from the aircraft 102, so that all regions around the aircraft are visible. Figures 1A and 1B illustrate a resultant field of view 126 that encompasses the entire area around the aircraft 102.
[0029] The aircraft 102 in Figure 1B is depicted being approached by a UAV 130. The UAV 130 poses a threat to the safety of the aircraft and is likely to cause disruption if it is not dealt with in an efficient manner. Dealing with the UAV 130 may include, for example, recording its detection, location and velocity, alerting the pilot, and notifying other aircraft and aircraft controllers at airports in the vicinity. This can give an improved set of data for the authorities, which can lead to a more efficient management of the airspace in the vicinity of the detected UAV.
[0030] Figure 2 illustrates a side elevation of the aircraft 102 in flight. As previously described, the UAV 130 is approaching the aircraft 102. In this example, the UAV 130 is within a field of view 134 of the image sensor 132. Each of the image sensors captures images which are stored and processed in near-real-time to determine whether a UAV candidate is present. A UAV that is spotted by one image sensor is referred to herein as a candidate, whereas the same candidate, if it is spotted by more than one image sensor, is determined to be a UAV sighting. As such, the image sensor 132 determines that there is the UAV candidate 130 in the vicinity, referred to herein as a target space, of the aircraft 102.
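The candidate/sighting distinction described above can be sketched as a simple confirmation rule. This is a minimal illustration only: the report structure and function names are assumptions, not part of the patented system.

```python
from dataclasses import dataclass

@dataclass
class SensorReport:
    """Hypothetical per-sensor detection report."""
    sensor_id: str
    candidate_seen: bool  # did this sensor's image contain the candidate?

def confirm_sighting(reports):
    """Promote a UAV 'candidate' to a confirmed 'sighting' only when at
    least two sensors with overlapping fields of view report it."""
    confirming = [r.sensor_id for r in reports if r.candidate_seen]
    return len(confirming) >= 2, confirming

# One sensor alone yields only a candidate; a second overlapping view confirms.
confirmed, seen_by = confirm_sighting([
    SensorReport("sensor_132", True),
    SensorReport("sensor_114", True),
])
```

A single positive report leaves the object as a mere candidate, mirroring the terminology of paragraph [0030].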
[0031] In response to determining whether the UAV candidate 130 is present in the target space of the first captured image, the processor controlling the image sensor 132 notifies other image sensors, having overlapping fields of view with the image sensor 132, to scan their respective target areas in their captured images to determine if the UAV candidate 130 is also present therein.
[0032] The illustration of Figure 3 shows the image sensor 114 having a field of view 136 which overlaps with the field of view 134. The field of view 136 provides an alternate view of the UAV 130, which is used to verify or confirm that a UAV is in the vicinity of the aircraft 102. Therefore, the image captured by image sensor 114, if it also comprises an image of the UAV candidate, is used to confirm the presence of the UAV 130. Based on the successful detection and confirmation of the UAV 130, an output indicating the presence of the UAV (as opposed to a 'UAV candidate') 130 is generated, along with any other pertinent details that have been determined, such as size, distance and velocity. Of course, any of the plurality of image sensors with overlapping fields of view may be used to confirm that the UAV 130 is in the external area of the aircraft 102. Moreover, any two of the plurality of image sensors may be used to triangulate the location of the UAV 130.
[0033] Figure 3 illustrates other objects beneath the aircraft 102. By way of example, the objects include a first tree 140, a second tree 142 and a car 144. The car 144 may be stationary or moving. The objects are in the field of view of at least one image sensor.
[0034] One way of determining that an object in a field of view is a UAV candidate will now be described with reference to Figure 4, which illustrates two overhead images of the ground 138 captured at two different times. The images are superimposed over one another. In this example, the two images were captured, one after the other, for instance 0.2s apart, by the same aircraft-mounted image sensor. Each image includes a first tree 141, a second tree 142, a road 143, a vehicle 144 on the road and a UAV 130. Each object in the image is designated with a first reference number (e.g. 141 for the first tree) that denotes the position of the object in the first image, and a second reference numeral (e.g. 141' for the first tree) that denotes the position of the object in the second image. The objects are at least initially assumed to be on the ground 138.
[0035] The images are superimposed over one another by object-matching and aligning the non-moving objects. Non-moving objects may be identified by reference to their respective locations in the consecutive images and with knowledge of the ground velocity and altitude of the aircraft. In addition, or alternatively, non-moving objects may be identified by reference to libraries of similar images (e.g. such as for roads and trees), by using a trained classifier, or by reference to satellite images and maps of a respective landscape.
[0036] According to Figure 4, it is clear that, as between the two images, the first and second trees, 141, 142, have not moved: they are non-moving objects. The vehicle 144, 144', meanwhile, has moved by a relatively small distance, dl, along the road 143 (which has of course not moved). The UAV 130, 130' in the same period of time has moved a relatively significant distance, d2, in a direction that is not aligned with the road 143.
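The align-then-measure step can be sketched as below. Object positions are assumed to be pre-matched ground coordinates in metres; the object identifiers and the helper function are illustrative assumptions, not taken from the patent.

```python
import math

def residual_motion(objects_t0, objects_t1, static_ids):
    """Align two consecutive frames on known static objects (trees, road),
    then measure each object's residual displacement after that alignment.
    objects_t0/objects_t1 map object id -> (x, y) position in metres."""
    # The average apparent offset of the static objects gives the
    # frame-to-frame alignment (any uncompensated camera/aircraft motion).
    offsets = [(objects_t1[i][0] - objects_t0[i][0],
                objects_t1[i][1] - objects_t0[i][1]) for i in static_ids]
    ox = sum(o[0] for o in offsets) / len(offsets)
    oy = sum(o[1] for o in offsets) / len(offsets)
    # Residual displacement of every object once the alignment is removed.
    return {k: math.hypot(objects_t1[k][0] - objects_t0[k][0] - ox,
                          objects_t1[k][1] - objects_t0[k][1] - oy)
            for k in objects_t0}

# Trees are static; the car moves a small distance; the UAV appears to
# move roughly six times as far between the same two frames.
moved = residual_motion(
    {"tree1": (0.0, 0.0), "tree2": (20.0, 5.0), "car": (10.0, 10.0), "uav": (15.0, 0.0)},
    {"tree1": (0.0, 0.0), "tree2": (20.0, 5.0), "car": (11.8, 10.0), "uav": (15.0, 10.8)},
    static_ids=("tree1", "tree2"),
)
```

The objects with near-zero residual motion play the role of the trees and road in Figure 4; the large residual singles out the UAV candidate.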
[0037] The processor is arranged to compare the two images and determine that certain matching objects have not moved (e.g. the trees and the road), whereas certain other objects (e.g. the car 144 and the UAV 130) have moved. The speed or velocity of the objects that are moving can be determined by reference to their different positions in the images relative to the static objects, and with knowledge of the ground velocity and altitude of the aircraft. For example, d1 is estimated to be about 1.8m, whereas d2 is estimated to be about 6x that distance, or 10.8m. A car travelling 1.8m in 0.2 seconds has a ground speed of 32.4kmh-1. Meanwhile, the UAV has a calculated apparent ground speed of 194.4kmh-1. According to one example, the processor is arranged to determine that an object moving at such a high apparent ground speed (for example, a threshold apparent ground speed may be higher than 120kmh-1 or higher than 150kmh-1) is in flight and, in fact, nearer to the aircraft.
[0038] In more general terms, the UAV 130 has moved the greatest distance across the field of view of the image sensor, and it is determined not to be moving as it should be were it a moving object on the ground 138 such as, for example, a car. The processor controlling the image sensor differentiates between objects on the ground that are moving as they should be, given knowledge of the altitude and ground velocity of the aircraft, and objects that are not moving as they should be. In the latter case, if it is clear that the objects are moving further or faster than a typical ground object (static or moving relatively slowly), the processor deduces that they could be UAVs.
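The speed arithmetic above reduces to a unit conversion plus a threshold check. The helper names and the default threshold (one of the example figures given in paragraph [0037]) are illustrative assumptions:

```python
def apparent_ground_speed_kmh(displacement_m: float, interval_s: float) -> float:
    """Convert a displacement between consecutive frames into km/h
    (m/s * 3.6 = km/h)."""
    return displacement_m / interval_s * 3.6

def is_airborne_candidate(displacement_m: float, interval_s: float,
                          threshold_kmh: float = 150.0) -> bool:
    """Flag objects whose apparent ground speed exceeds a plausible
    ground-vehicle speed: they are likely airborne and nearer the aircraft."""
    return apparent_ground_speed_kmh(displacement_m, interval_s) > threshold_kmh

# The car: 1.8 m in 0.2 s -> 32.4 km/h (plausible ground vehicle).
# The UAV: 10.8 m in 0.2 s -> 194.4 km/h (implausible for a ground object).
```

An object flagged here is only a candidate; confirmation by a second sensor is still required before an alert is raised.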
[0039] Once the UAV candidate has been identified by two image sensors, and is determined to be a UAV, triangulation can be performed (given a known spacing on the aircraft between the respective image sensors) to determine the altitude of the UAV, its distance from the aircraft and its velocity. The distance from the aircraft and velocity determine a threat level posed by the UAV.
[0040] An alternative or additional way to identify UAV candidates applies image processing. In one example, the image processing uses known object detection algorithms to identify moving objects and compare the objects to a library of known objects, including known UAVs. In another example, the image processing comprises a trained classifier to identify UAV candidates. The classifier may be trained to identify UAV candidates based on movement characteristics and/or based on pattern or image matching.
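The triangulation mentioned above can be sketched in a plane, assuming each sensor reports a bearing to the candidate and the inter-sensor baseline on the aircraft is known; a real system would solve the equivalent problem in three dimensions. The geometry below is a standard two-ray intersection, not the patent's specific method:

```python
import math

def triangulate(p1, bearing1, p2, bearing2):
    """Intersect two bearing rays from known sensor positions (metres).
    Bearings are radians measured from the +x axis. Returns the target
    position, or None if the rays are (near-)parallel."""
    d1 = (math.cos(bearing1), math.sin(bearing1))
    d2 = (math.cos(bearing2), math.sin(bearing2))
    denom = d1[0] * d2[1] - d1[1] * d2[0]
    if abs(denom) < 1e-9:
        return None  # parallel rays: no unique intersection
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    # Solve p1 + t*d1 = p2 + s*d2 for t by eliminating s.
    t = (dx * d2[1] - dy * d2[0]) / denom
    return (p1[0] + t * d1[0], p1[1] + t * d1[1])

# Two sensors 10 m apart, each seeing the target 45 degrees inward:
# the target sits 5 m ahead of the midpoint of the baseline.
pos = triangulate((0.0, 0.0), math.radians(45), (10.0, 0.0), math.radians(135))
```

With the range known from the intersection, successive fixes over time yield the velocity used to grade the threat level.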
[0041] Figure 5 is an illustrative view of a scenario according to an example. The processor, or processors, controlling the image sensors on aircraft 102 may connect to a wide area network of other systems. For example, there is an image sensor 162, mounted on a control tower 160, having a field of view 164 of the UAV. The image processor controlling the image sensor 162 determines that the UAV is present and generates a signal indicating it. Also, nearby the aircraft 102 there is a second aircraft 170 with a plurality of image sensors mounted in similar positions as on aircraft 102. An image sensor 172 is mounted on the aircraft 170 and has a field of view 174 that encompasses the target area of the UAV 130. The processor controlling the image sensor 172 determines the presence of the UAV 130 and generates a signal indicating its presence. In this example, the image sensor is described as being mounted on the control tower 160 but may be mounted at different ground-based locations such as aircraft hangars, other buildings, masts, or ground-based vehicles.
[0042] When aircraft 102 identifies an aerial vehicle, such as the UAV 130, the system controlling the plurality of image sensors sends out a signal to other systems that have image sensors with overlapping views of the external area of the aircraft 102. Where the fields of view of other image sensors are controllable, those image sensors may adjust their respective fields of view to view the UAV candidate. The other image sensors perform a similar process to that already described, to verify that the UAV 130 is present in the area external to the aircraft 102. More broadly, a first sighting of a UAV candidate by any of the sensors (i.e. ground- or aircraft-mounted) illustrated in Figure 5 may generate a signal that causes any other of the image sensors that has an overlapping field of view to confirm the UAV candidate as being a UAV. Such an arrangement provides added protection against UAVs that are spotted near to an airfield.
[0043] Figure 6 is a process flow chart of a method 600 of determining if there is an aerial vehicle in the external area of an aircraft, for example, using the plurality of image sensors mounted on the aircraft 102 as described in relation to Figures 1 to 5. At block 602, the method receives first image data having a first field of view of the external area of the aircraft 102. Next, at block 604, the received first image data is processed to determine the presence of a UAV candidate. At block 606, second image data is received from a second image sensor having a field of view that encompasses part of the field of view of the first image, or the target area containing the UAV candidate. At block 608, the second image data is processed to determine if the UAV candidate is present. Finally, at block 610, a signal is generated to indicate the presence of the UAV if it is determined that there is one in both the first and second image data.
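The blocks of method 600 can be sketched as a short pipeline. The `find_candidate` and `confirm` callables stand in for whatever detection routine (motion-based or classifier-based) is used; they, and the returned signal structure, are assumptions for illustration only:

```python
def detect_uav(first_image, second_image, find_candidate, confirm):
    """Blocks 602-610 of method 600: the second image is only processed
    if the first image yields a candidate, and an alert signal is only
    generated when both images contain the candidate."""
    candidate = find_candidate(first_image)       # blocks 602 + 604
    if candidate is None:
        return None                               # no candidate: no alert
    if not confirm(second_image, candidate):      # blocks 606 + 608
        return None                               # not confirmed: no alert
    return {"signal": "UAV present", "candidate": candidate}  # block 610

# Toy detector callables, for illustration only.
result = detect_uav(
    "img1", "img2",
    find_candidate=lambda img: "uav-130",
    confirm=lambda img, cand: cand == "uav-130",
)
```

Gating the second-image processing on the first detection reflects the optional resource-saving trigger of paragraph [0007].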
[0044] It is to be noted that the term "or" as used herein is to be interpreted to mean "and/or", unless expressly stated otherwise. The term aircraft has been used but it will be appreciated that any vehicle may be used, such as boats, cars, lorries, ships, or unmanned aerial vehicles.

Claims (14)

CLAIMS:

  1. A method of detecting the presence of an external aerial vehicle in the vicinity of an aircraft in flight comprising: receiving image data representing a first image captured by a first aircraft-mounted image sensor having a first field of view and processing the image data to determine whether an external aerial vehicle candidate is present in a target space of the first captured image; receiving image data representing a second image captured by a second aircraft-mounted image sensor having a second field of view, which encompasses the target space, and processing the image data to determine whether the external aerial vehicle candidate is present in the second captured image; and generating a signal indicating that an external aerial vehicle is determined to be present in the vicinity of the aircraft if the external aerial vehicle candidate is determined to be present in both the first captured image and in the second captured image.
  2. A method according to claim 1, wherein the images are processed to identify the presence of an external aerial vehicle candidate by comparing the images to one or more stored representations of existing aerial vehicles.
  3. A method according to claim 2, wherein the one or more stored representations of aerial vehicles is determined by a classifier which is trained to recognise different types of aerial vehicles using supervised training procedures based on images from a library of aerial vehicle images.
  4. A method according to claim 1, wherein the images are processed to identify the presence of an external aerial vehicle candidate by detecting the relative motion of the UAV candidate.
  5. A method according to any preceding claim, wherein the image data is captured using a 360° image sensor.
  6. A method according to any preceding claim, comprising receiving an indication that the external aerial vehicle candidate is present based on an external image captured by a ground-based image sensor.
  7. A method according to any one of the preceding claims, wherein the processing the image data to determine whether the external aerial vehicle candidate is present in the second captured image is triggered in response to a determination that an external aerial vehicle candidate is present in a target space of the first captured image.
  8. A method according to any one of the preceding claims, wherein a location of the external aerial vehicle is triangulated using the first and second captured images.
  9. A machine-readable storage medium comprising instructions executable by a processor to implement the method according to any one of claims 1 to 8.
  10. A system for detecting the presence of an external aerial vehicle in the vicinity of an aircraft in flight comprising: a first image sensor device having a first field of view to capture a first image comprising an external aerial vehicle candidate in a target space of the first image that is in the vicinity of an aircraft; a second image sensor device having a second field of view, which encompasses the target space, to capture a second image comprising the external aerial vehicle candidate; and a processor to generate a signal indicating that an external aerial vehicle is determined to be present in the vicinity of the aircraft if the external aerial vehicle candidate is determined to be present in both the first captured image and in the second captured image.
  11. A system according to claim 10, wherein at least one image sensor is aircraft mounted.
  12. A system according to claim 11, wherein at least one image sensor is ground mounted.
  13. An aircraft comprising a system according to any one of claims 10 to 12.
  14. An aircraft comprising a processor and stored program code, and at least a first image sensor and a second image sensor, to perform the method of any one of claims 1 to 8.
GB1915388.1A 2019-10-23 2019-10-23 Aerial vehicle detection Pending GB2588893A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
GB1915388.1A GB2588893A (en) 2019-10-23 2019-10-23 Aerial vehicle detection
PCT/EP2020/079308 WO2021078663A1 (en) 2019-10-23 2020-10-19 Aerial vehicle detection

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB1915388.1A GB2588893A (en) 2019-10-23 2019-10-23 Aerial vehicle detection

Publications (2)

Publication Number Publication Date
GB201915388D0 GB201915388D0 (en) 2019-12-04
GB2588893A true GB2588893A (en) 2021-05-19

Family

ID=68728380

Family Applications (1)

Application Number Title Priority Date Filing Date
GB1915388.1A Pending GB2588893A (en) 2019-10-23 2019-10-23 Aerial vehicle detection

Country Status (2)

Country Link
GB (1) GB2588893A (en)
WO (1) WO2021078663A1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3121763A1 (en) * 2015-07-24 2017-01-25 Honeywell International Inc. Helo bumper system using a camera for obstacle detection
CN108168706A (en) * 2017-12-12 2018-06-15 河南理工大学 A kind of multispectral infrared imaging detecting and tracking system for monitoring low-altitude unmanned vehicle
CN108447075A (en) * 2018-02-08 2018-08-24 烟台欣飞智能系统有限公司 A kind of unmanned plane monitoring system and its monitoring method
WO2019163454A1 (en) * 2018-02-20 2019-08-29 SoftBank Corp. Image processing device, flying object, and program

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5581250A (en) * 1995-02-24 1996-12-03 Khvilivitzky; Alexander Visual collision avoidance system for unmanned aerial vehicles
US6804607B1 (en) * 2001-04-17 2004-10-12 Derek Wood Collision avoidance system and method utilizing variable surveillance envelope
US9275645B2 (en) * 2014-04-22 2016-03-01 Droneshield, Llc Drone detection and classification methods and apparatus
US10514711B2 (en) * 2016-10-09 2019-12-24 Airspace Systems, Inc. Flight control using computer vision
US10495421B2 (en) * 2017-08-25 2019-12-03 Aurora Flight Sciences Corporation Aerial vehicle interception system
CN107831777B (en) * 2017-09-26 2020-04-10 中国科学院长春光学精密机械与物理研究所 Autonomous obstacle avoidance system and method for aircraft and aircraft

Also Published As

Publication number Publication date
GB201915388D0 (en) 2019-12-04
WO2021078663A1 (en) 2021-04-29

Similar Documents

Publication Publication Date Title
RU2692306C2 (en) Tracking system for unmanned aerial vehicles
CN110612234B (en) System and method for calibrating vehicle sensors
US20190019418A1 (en) Automated system of air traffic control (ATC) for at least one unmanned aerial vehicle (UAV)
US11827352B2 (en) Visual observer for unmanned aerial vehicles
US20180090018A1 (en) Midair collision threat detection and assessment using visual information
JP2020509363A (en) Method and system using a networked phased array antenna application for detecting and / or monitoring moving objects
US11132909B2 (en) Drone encroachment avoidance monitor
RU2746090C2 (en) System and method of protection against unmanned aerial vehicles in airspace settlement
RU2755603C2 (en) System and method for detecting and countering unmanned aerial vehicles
KR102290533B1 (en) RTK-GPS interlocking system and method for detecting and responding to illegal flight
KR20190004176A (en) Apparatus and method for the obstacle collision avoidance of unmanned aerial vehicle
US11361668B1 (en) Collision awareness system for ground operations
Zarandy et al. A novel algorithm for distant aircraft detection
US20210088652A1 (en) Vehicular monitoring systems and methods for sensing external objects
KR20190021875A (en) System and its Method for Detecting and Defeating Small Unmanned Aircrafts
KR101483058B1 (en) Ground control system for UAV anticollision
Minwalla et al. Experimental evaluation of PICAS: An electro-optical array for non-cooperative collision sensing on unmanned aircraft systems
Dolph et al. Detection and Tracking of Aircraft from Small Unmanned Aerial Systems
GB2588893A (en) Aerial vehicle detection
RU2746102C1 (en) System and method for protecting the controlled area from unmanned vehicles
WO2023286295A1 (en) Intrusion determination device, intrusion detection system, intrusion determination method, and program storage medium
KR101676485B1 (en) System and method for providing Drone rader using mobile cell towers
McCalmont et al. Sense and avoid technology for Global Hawk and Predator UAVs
US20230010630A1 (en) Anti-collision system for an aircraft and aircraft including the anti-collision system
WO2023286186A1 (en) Device for dealing with suspicious aircraft, system for dealing with suspicious aircraft, method for dealing with suspicious aircraft, and program storage medium