US20230245574A1 - Methods, computer programs, computing devices and controllers - Google Patents
- Publication number: US20230245574A1 (application US 18/132,468)
- Authority: US (United States)
- Prior art keywords: uav, data, identification data, computing device, received
- Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- G08G5/0078—Surveillance aids for monitoring traffic from the aircraft
- G08G5/0069—Navigation or guidance aids for a single aircraft specially adapted for an unmanned aircraft
- B64C39/024—Aircraft characterised by special use of the remote controlled vehicle type, i.e. RPV
- G06F21/44—Program or device authentication
- G06V20/13—Satellite images
- G08G5/0004—Transmission of traffic-related information to or from an aircraft
- G08G5/0013—Transmission of traffic-related information to or from an aircraft with a ground station
- G08G5/0021—Arrangements for implementing traffic-related aircraft activities, located in the aircraft
- G08G5/0026—Arrangements for implementing traffic-related aircraft activities, located on the ground
- G08G5/0043—Traffic management of multiple aircrafts from the ground
- G08G5/006—Navigation or guidance aids for a single aircraft in accordance with predefined flight zones, e.g. to avoid prohibited zones
- G08G5/0073—Surveillance aids
- G08G5/0082—Surveillance aids for monitoring traffic from a ground station
- H04W24/00—Supervisory, monitoring or testing arrangements
- H04W24/02—Arrangements for optimising operational condition
- H04W4/44—Services specially adapted for vehicles, for communication between vehicles and infrastructures, e.g. vehicle-to-cloud [V2C] or vehicle-to-home [V2H]
- H04W84/18—Self-organising networks, e.g. ad-hoc networks or sensor networks
- B64U2101/30—UAVs specially adapted for imaging, photography or videography
- B64U2201/10—UAVs characterised by autonomous flight controls, e.g. using inertial navigation systems [INS]
- G06F2221/2111—Location-sensitive, e.g. geographical location, GPS
- H04B7/185—Space-based or airborne stations; Stations for satellite systems
- H04B7/18504—Aircraft used as relay or high altitude atmospheric platform
- H04L65/40—Support for services or applications
- H04W12/03—Protecting confidentiality, e.g. by encryption
Definitions
- This disclosure relates to methods, computer programs, computing devices and controllers.
- UAV unmanned aerial vehicle
- UAS unmanned aircraft system
- a method of controlling a computing device comprising:
- a computer program comprising instructions which, when executed, cause a computing device to perform a method provided in accordance with the first, second and/or third embodiments.
- a computing device configured to perform a method provided in accordance with the first, second and/or third embodiments.
- a controller for a computing device the controller being configured to perform a method provided in accordance with the first, second and/or third embodiments.
- FIG. 1 shows a block diagram of an example computing device in accordance with embodiments.
- a computing device associates identification data received wirelessly from a UAV with image data captured by a camera, the image data representing a scene comprising the UAV.
- the computing device may be useful, for example, in providing photographic evidence that the UAV was, or was not, at a particular location at a particular point in time, or in allowing a user of the computing device to identify one or more attributes of, or associated with, the UAV, etc.
- the computing device comprises a mobile computing device, such as a smartphone or tablet computing device.
- the functionality described herein may, in some examples, be made readily available to members of the public using their existing computing devices. For example, software may be downloaded into an existing computing device to provide the computing device with the functionality described herein.
- the computing device 100 may take various forms.
- the computing device 100 may be arranged in one or multiple geographical locations.
- the computing device 100 may comprise a distributed computing system.
- the computing device 100 may be provided in one or more housings.
- the computing device 100 may process some or all data locally, within the computing device 100 .
- the computing device 100 may use one or more cloud-based services to process some or all data.
- the computing device 100 comprises a mobile computing device, although the computing device could be fixed-location in other examples.
- the computing device 100 may comprise one or more elements in addition to the mobile computing device.
- mobile computing devices include, but are not limited to, wearable devices, smartphones, laptop computing devices, dedicated portable UAV-monitoring equipment, UAV remote control devices (for example handheld UAV remote control devices), and tablet computing devices.
- the mobile computing device may be a handheld computing device.
- Such devices may be relatively inexpensive compared, for example, to more complicated UAV monitoring equipment but may be sufficiently powerful to perform the techniques described herein in at least some desired scenarios. Further, some users may already have such devices and may be able to use such devices to perform the techniques described herein with relatively low additional expenditure, without acquiring further hardware etc.
- Portability of the computing device 100 may be effective where a user wishes to perform the techniques described herein in different locations, using the same device.
- Existing, fixed-location UAV monitoring equipment may not be designed or suitable for this.
- the computing device 100 comprises a UAV (different from the UAV comprised in the scene represented in the received image data).
- the computing device 100 may comprise one or more elements in addition to the UAV.
- a UAV may therefore perform the techniques described herein in relation to a further UAV.
- Using a UAV may provide flexibility where, for example, the UAV can approach the UAV comprised in the scene represented in the received image data, for example to interrogate the UAV, take a close-up photograph or video of the UAV, follow the UAV etc.
- a UAV may enable the techniques described herein to be deployed in different locations using the same hardware.
- a UAV with a camera may, for example, patrol a given airspace, covering a relatively large area compared to a fixed-location camera or computing system. Further, a UAV may be dispatched on-demand from a given location to a different location where a further UAV of interest is in the different location. As such, the number of computing systems that cover a given area may be lower where the computing system comprises a UAV than where multiple fixed-location computing systems are used to cover the given area.
- the computing device 100 comprises a controller 110 .
- the controller 110 is communicatively coupled to one or more other components of the computing device 100 , for example via a bus.
- the controller 110 may, for example, comprise a microprocessor.
- the computing device 100 comprises one or more sensors 120 .
- the computing device 100 comprises a sensor in the form of a camera 120 .
- the camera 120 may, for example, capture still image data and/or video data.
- the camera 120 may, for example, capture visible light and/or infrared.
- Other types of sensor 120 include, but are not limited to, ultrasonic sensors, Light Detection And Ranging (LiDAR) sensors etc. References to image data will be understood accordingly to be data captured by the sensor, dependent on the type of sensor.
- LiDAR Light Detection And Ranging
- the computing device 100 comprises a transceiver 130 .
- the transceiver 130 may transmit and receive on the same, or different, frequencies.
- the transceiver 130 may transmit to and receive from the same, or different, entities.
- the transceiver 130 may transmit and receive using the same, or different, communication protocols.
- the transceiver 130 may be operable to transmit and receive simultaneously, or otherwise.
- the transceiver 130 may operate in the radio frequency (RF) part of the electromagnetic spectrum.
- RF radio frequency
- the computing device 100 comprises a display 140 .
- the display 140 may, for example, comprise a touch-sensitive display. However, other types of display may be used.
- the camera 120 and display 140 are on different surfaces of the computing device 100 .
- the camera 120 may be on a front surface of the computing device 100 and the display 140 may be on a rear surface of the computing device 100 .
- a user of the computing device 100 may be able to point the camera 120 at the UAV and see a representation of the UAV on the display 140 in real-time, as if they were seeing the UAV ‘through’ the computing device 100 .
- the user can alternatively or additionally point the camera 120 at the UAV, capture image data representing the UAV, and see the representation of the UAV on the display 140 at a later point in time.
- the computing device 100 comprises memory 150 .
- the memory may store one or more computer programs.
- the one or more computer programs may comprise computer-readable instructions.
- the computing device 100 for example the controller 110 , may be configured to execute the one or more computer programs and, as a result, perform at least some of the techniques described herein.
- the one or more computer programs may be downloaded onto the computing device 100 .
- the one or more computer programs may be downloaded from a computer program store.
- the controller 110 receives image data from the camera 120 .
- the received image data represents a scene comprising a UAV.
- the scene may comprise one or more further objects. Examples of such further objects include, but are not limited to, UAVs, other vehicles, people and buildings.
- the received image data may comprise a photograph of the scene.
- the received image data may have been subject to image processing prior to being received at the controller 110 .
- An example of such image processing is object recognition, for example to identify the UAV and/or further objects.
- Such object recognition may be performed using a trained Artificial Neural Network (ANN), for example.
- ANN Artificial Neural Network
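The object-recognition step above can be sketched as follows. This is an illustrative sketch only: the trained ANN itself is out of scope, so a `Detection` record and a `filter_uav_detections` helper are assumed as hypothetical names for the model's output and its post-processing.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    label: str          # object class, e.g. "uav", "person", "building"
    confidence: float   # detector score in [0, 1]
    bbox: tuple         # (x, y, width, height) in pixels

def filter_uav_detections(detections, min_confidence=0.5):
    """Keep only detections that a trained recognition model labelled
    as a UAV; the model producing `detections` is not shown here."""
    return [d for d in detections
            if d.label == "uav" and d.confidence >= min_confidence]

detections = [
    Detection("uav", 0.91, (120, 40, 64, 32)),
    Detection("bird", 0.77, (300, 80, 20, 18)),
    Detection("uav", 0.31, (10, 10, 8, 6)),   # below threshold
]
print(len(filter_uav_detections(detections)))  # 1
```

Identifying the UAV (and any further objects) in this way before the association step means the identification data can be tied to a specific region of the image rather than to the frame as a whole.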
- the controller 110 receives identification data wirelessly from the UAV.
- the identification data is useable to identify at least one attribute of or associated with the UAV.
- the controller 110 receives the identification data wirelessly from the UAV via the transceiver 130 .
- the controller 110 may use standardised wireless technology to receive the identification data.
- Standardised wireless technology may be considered to be wireless technology that is the subject of one or more standards. This can facilitate interoperability and/or adoption compared, for example, to proprietary wireless technology.
- proprietary wireless technology may be used in some examples. For example, proprietary wireless technology may allow enhanced customisation compared to standardised wireless technology.
- the computing device 100 may receive the identification data wirelessly from the UAV via a wireless local area network (WLAN), for example in accordance with Wi-Fi™ technology.
- WLAN wireless local area network
- the computing device 100 may receive the identification data on the 2.4 GHz and/or 5.8 GHz Wi-Fi™ bands. This may provide a relatively large operating range and relatively low power consumption compared to some short-range technologies such as Bluetooth™.
- the computing device 100 may use one or more designated Wi-Fi™ channels for the reception of the identification data.
- the identification data may be received via a short-range technology, of which Bluetooth™ is an example.
- Bluetooth™ may have a typical operating range of around 10 m-100 m.
- Bluetooth™ 5.0 may have a typical operating range of around 40 m-400 m.
- the computing device 100 may pair with the UAV in order to receive the identification data via Bluetooth™.
- the computing device 100 may, however, be able to receive the identification data from the UAV over Bluetooth™ without being paired with the UAV.
- Wi-Fi™ or Bluetooth™ may not be reliant upon the availability of a cellular network, for example a 4G system, where connectivity may be limited in remote locations, where the cost of use may be relatively high etc. Further, such technologies may be supported by a relatively large number of computing devices. For example, most smartphones currently support both Wi-Fi™ and Bluetooth™.
- the computing device 100 may be configured to parse the identification data from wireless transmissions from the UAV. For example, the computing device 100 may receive data in a given format from the UAV and may extract a given portion of the received data corresponding to the identification data, where the given portion is known to correspond to the identification data. The received data may be in a standardised or proprietary format having a predefined syntax, for example. The computing device 100 may be configured to discard at least some data other than the identification data in wireless transmissions received from the UAV.
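The parsing described above can be sketched as follows. The frame syntax here is entirely hypothetical (a made-up `b"UAV0"` magic header followed by a one-byte length field); a real UAV would use whatever standardised or proprietary syntax applies, and only the extract-and-discard pattern is illustrated.

```python
def parse_identification_data(frame: bytes):
    """Extract the identification portion of a received frame and
    discard the rest, assuming a hypothetical fixed syntax:
    4-byte magic header, 1-byte length, identification data, payload."""
    if len(frame) < 5 or frame[:4] != b"UAV0":
        return None                     # not an identification frame
    length = frame[4]
    ident = frame[5:5 + length]
    if len(ident) != length:
        return None                     # truncated frame
    return ident                        # trailing payload is discarded

frame = b"UAV0" + bytes([6]) + b"REG123" + b"<other payload...>"
print(parse_identification_data(frame))  # b'REG123'
```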
- the computing device 100 may receive the identification data from the UAV while the UAV is within the field of view of the camera 120 , or otherwise. Receiving the identification data from the UAV while the UAV is within the field of view of the camera 120 may provide temporal correlation between the received image data and the received identification data which can help in associating the received identification data with the received image data. For example, a user of the computing device 100 may be able to capture a photograph of the UAV along with the identification data.
- the controller 110 may transmit an identification request to the UAV.
- the controller 110 may detect the presence of the UAV and transmit an identification request to the UAV.
- the controller 110 may receive the identification data from the UAV in response to the identification request, or otherwise.
- the computing device 100 may interrogate the UAV to request the UAV to identify itself.
- the UAV may transmit the identification data without being prompted to do so.
- the UAV may be configured to transmit the identification data intermittently.
- the received identification data may comprise a registration identifier of the UAV.
- the registration identifier may facilitate identification of the UAV with a UAV-registration body.
- the received identification data may comprise an equipment identifier of the UAV.
- the equipment identifier may facilitate identification of the UAV based on being able to identify the UAV equipment itself.
- the received identification data for example the equipment identifier, may comprise a UAV make identifier, identifying a make of the UAV.
- the received identification data for example the equipment identifier, may comprise a UAV model identifier, identifying a model of the UAV.
- the received identification data, for example the equipment identifier may comprise a UAV type identifier, identifying a type of the UAV.
- the received identification data may comprise a contact identifier of an entity associated with the UAV.
- the contact identifier may facilitate contact with an owner, operator or other entity associated with the UAV.
- the received identification data may comprise a WLAN identifier associated with the wireless reception of the identification data from the UAV.
- the WLAN identifier may enable the UAV to be identified, where the WLAN identifier is known to be associated with the UAV.
- the WLAN identifier may, for example, comprise a Service Set ID (SSID) and/or a Basic Service Set ID (BSSID).
- SSID Service Set ID
- BSSID Basic Service Set ID
- a BSSID may be more reliable than an SSID in identifying a UAV: the SSID may be changed by a user and may not be sufficiently unique across multiple UAVs, whereas a BSSID is more likely to be useable to readily identify an individual UAV.
- an SSID may be used in some examples.
- the received identification data may comprise one or multiple different types of identifier.
- the received identification data may not comprise any personally identifiable information, namely information that can be used to identify a human operator, owner, or the like of the UAV. This can help to preserve privacy, particularly but not exclusively where the UAV broadcasts or otherwise advertises the identification data to interested parties, as opposed to transmitting the identification data in a more targeted manner.
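A container for the kinds of identifier listed above might look like the following sketch. The class and field names are illustrative assumptions, not part of any standard; the sketch also shows the stated preference for a BSSID over an SSID, and deliberately includes no field for personally identifiable information.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class UavIdentification:
    """Illustrative container; field names are assumptions. No field
    carries personally identifiable information, preserving privacy."""
    registration_id: Optional[str] = None   # registration-body identifier
    make: Optional[str] = None
    model: Optional[str] = None
    uav_type: Optional[str] = None
    contact_id: Optional[str] = None        # e.g. an opaque contact token
    ssid: Optional[str] = None              # user-changeable, less reliable
    bssid: Optional[str] = None             # per-device, preferred

    def best_wlan_identifier(self) -> Optional[str]:
        # Prefer the BSSID: an SSID may be changed by a user and may
        # not be unique across multiple UAVs.
        return self.bssid or self.ssid

ident = UavIdentification(registration_id="REG123",
                          ssid="MyDrone", bssid="aa:bb:cc:dd:ee:ff")
print(ident.best_wlan_identifier())  # aa:bb:cc:dd:ee:ff
```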
- the controller 110 may receive the identification data as a result of transmission of the identification data from the UAV to a control device of an operator of the UAV over a predetermined communications link.
- the predetermined communications link may be a control link.
- the controller 110 may, in effect, intercept the data communicated over the control link and use that data in accordance with the techniques described herein.
- the identification data may be transmitted over the predetermined communications link from the UAV to the control device in an encrypted form.
- the controller 110 may decrypt the identification data.
- the controller 110 may have access to a decryption key useable to decode data communicated over the predetermined communications link. Different UAVs may use different encryption protocols, keys etc.
- the controller 110 may have access to multiple different decryption keys and/or may be able to use multiple different decryption techniques to provide enhanced interoperability in relation to different users, manufacturers etc.
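Trying multiple decryption keys in turn can be sketched as below. The single-byte XOR "cipher" is a toy stand-in (a real control link would use proper cryptography), and the `b"UAV0"` plaintext check is a hypothetical way of recognising a successful decryption.

```python
def xor_decrypt(ciphertext: bytes, key: int) -> bytes:
    # Toy stand-in for a real decryption primitive; actual control
    # links would not use a single-byte XOR.
    return bytes(b ^ key for b in ciphertext)

def try_known_keys(ciphertext: bytes, keys):
    """Attempt each known key in turn, as a controller with access to
    multiple decryption keys might, keeping the first result that
    looks like a valid identification frame."""
    for key in keys:
        plaintext = xor_decrypt(ciphertext, key)
        if plaintext.startswith(b"UAV0"):   # hypothetical frame magic
            return plaintext
    return None

ciphertext = xor_decrypt(b"UAV0REG123", 0x5A)
print(try_known_keys(ciphertext, keys=[0x01, 0x5A, 0xFF]))  # b'UAV0REG123'
```

Different UAVs may use different protocols, so in practice each "key" might bundle a key with the decryption technique it belongs to.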
- the controller 110 may receive the identification data as a result of broadcasting of the identification data by the UAV.
- the UAV may broadcast the identification data to interested receiving devices in addition to, or as an alternative to, transmitting the identification data to the control device over the predetermined communications link.
- a control link to a control device may not be available.
- the identification data may be broadcast intermittently, for example periodically.
- the controller 110 associates the received image data (and/or data based thereon) with the received identification data (and/or data based thereon).
- data based on the received image data include, but are not limited to, compressed image data, encrypted image data, map data generated based on the received image data etc.
- data based on the received identification data include, but are not limited to, contact information associated with an owner of the UAV, registration information associated with the UAV, telemetry information associated with the UAV, a cryptographically derived version of the received identification data (for example a hash of the received identification data) etc.
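Deriving a cryptographic version of the received identification data, as mentioned above, can be sketched as a hash. SHA-256 is an assumed choice here; the description only requires some cryptographically derived version, such as a hash.

```python
import hashlib

def derive_identifier(identification_data: bytes) -> str:
    """A cryptographically derived version of the received
    identification data (SHA-256 hex digest; the hash choice is an
    assumption). Useful when the raw data should not be exposed."""
    return hashlib.sha256(identification_data).hexdigest()

digest = derive_identifier(b"REG123")
print(digest[:16])  # stable prefix of the 64-character hex digest
```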
- the ability to capture image data of a scene comprising the UAV at the same time as receiving the identification data wirelessly from the UAV may depend on transmission capabilities of the UAV, properties of the camera 120 and/or properties of the transceiver 130 .
- the transceiver 130 may be able to receive signals from a UAV several miles away (potentially even tens of miles away), but the camera 120 may be unable to capture image data of a UAV such a distance away.
- the controller 110 may store the received identification data (and/or data based thereon) in association with the received image data (and/or data based thereon).
- the controller 110 may store the received identification data (and/or data based thereon) in association with the received image data (and/or data based thereon) in the memory 150 of the computing device 100 , or elsewhere.
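Storing the association might look like the following sketch. The in-memory SQLite database stands in for the memory 150 (or remote storage); the table and column names are illustrative only.

```python
import sqlite3
import time

# In-memory database as a stand-in for the memory 150; schema and
# column names are illustrative assumptions.
db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE sightings (
                  captured_at REAL,
                  identification TEXT,
                  image BLOB)""")

def store_association(image_data: bytes, identification: str) -> None:
    """Store the received identification data in association with the
    received image data, so either can later be retrieved via the other."""
    db.execute("INSERT INTO sightings VALUES (?, ?, ?)",
               (time.time(), identification, image_data))
    db.commit()

store_association(b"<jpeg bytes>", "REG123")
row = db.execute("SELECT identification, image FROM sightings").fetchone()
print(row)  # ('REG123', b'<jpeg bytes>')
```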
- the controller 110 may transmit the received identification data (and/or data based thereon) in association with the received image data (and/or data based thereon).
- the controller 110 may transmit the received identification data (and/or data based thereon) in association with the received image data (and/or data based thereon) via the transceiver 130 , or otherwise.
- the controller 110 may transmit the received identification data (and/or data based thereon) together with or separately from the received image data (and/or data based thereon).
- the controller 110 may, however, not transmit data in this manner, for example where this could cause privacy concerns.
- a user of the computing device 100 may be restricted or inhibited from accessing the received identification data (and/or data based thereon).
- where the association is made between the received identification data (and/or data based thereon) and the received image data (and/or data based thereon), and such data is, for example, stored in the computing device 100, it may only be accessible by a designated entity (for example a law enforcement agency).
- the controller 110 may perform a look-up for additional data associated with the UAV using the received identification data (and/or data based thereon).
- additional data include, but are not limited to, authorisation information, flight path information etc.
- Performing the look-up may comprise querying a database using the received identification data (and/or data based thereon).
- the database may be local to or remote from the computing device 100 .
- the computing device 100 may be associated with a restricted flight zone (or 'restricted airspace').
- the flight zone may be restricted in that one or more restrictions may be in place in relation to the flying of UAVs within the restricted flight zone.
- the restricted flight zone may correspond to a sensitive location, a landmark, space that can only be entered upon payment etc. Examples of restricted flight zones may include, but are not limited to, airports, military facilities, prisons, infrastructure, and schools.
- the computing device 100 may be physically located within the restricted flight zone.
- the computing device 100 may not be physically located within the restricted flight zone, but may nevertheless be used to perform computation associated with the restricted flight zone.
- the controller 110 may determine whether or not the UAV is authorised to be in the restricted flight zone based on the received identification data (and/or data based thereon). For example, the controller 110 may query an authorisation database using the received identification data (and/or data based thereon) and receive an indication of authorisation accordingly. Alternatively, or additionally, the controller 110 may determine whether or not the UAV is authorised to be in the restricted flight zone based on authorisation data received from the UAV.
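The authorisation look-up can be sketched as a query against a database keyed on the identification data. The dictionary below stands in for a local or remote authorisation database; the registration identifiers and zone names are made up for illustration.

```python
# Illustrative authorisation database mapping identification data to
# the restricted flight zones a UAV is cleared for; a real deployment
# would query a registry, local or remote.
AUTHORISATION_DB = {
    "REG123": {"airport-east", "harbour"},
    "REG456": set(),   # registered, but cleared for no zones
}

def is_authorised(identification: str, zone: str) -> bool:
    """Determine whether the identified UAV is authorised to be in the
    given restricted flight zone; unknown UAVs are not authorised."""
    return zone in AUTHORISATION_DB.get(identification, set())

print(is_authorised("REG123", "airport-east"))  # True
print(is_authorised("REG456", "airport-east"))  # False
```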
- the controller 110 may perform a first action in response to determining that the UAV is authorised to be in the restricted flight zone and may perform a second, different action in response to determining that the UAV is not authorised to be in the restricted flight zone.
- Examples of the first action include, but are not limited to, allowing the UAV to enter the restricted flight zone (for example where the UAV is not already in the restricted flight zone), allowing the UAV to remain in the restricted flight zone (for example where the UAV is already in the restricted flight zone), allowing the UAV to leave the restricted flight zone (for example where the UAV is already in the restricted flight zone).
- Examples of the second action include, but are not limited to, preventing the UAV from entering the restricted flight zone (for example where the UAV is not already in the restricted flight zone), preventing the UAV from remaining in the restricted flight zone (for example where the UAV is already in the restricted flight zone), preventing the UAV from leaving the restricted flight zone (for example where the UAV is already in the restricted flight zone), generating an alarm, notifying an entity associated with the UAV, notifying an entity that is not associated with the UAV, and taking over control of the UAV (for example to fly it out of a restricted flight zone, cause it to crash etc).
- entities that are not associated with the UAV may include, but are not limited to, police, security agencies and aviation authorities.
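The first-action/second-action branching above can be sketched as a small dispatch function. The action names are illustrative labels covering only a subset of the examples given; a real controller would trigger the corresponding behaviour rather than return a string.

```python
def respond_to_uav(authorised: bool, already_inside: bool) -> str:
    """Pick a first action when the UAV is authorised and a second,
    different action when it is not; labels are illustrative only."""
    if authorised:
        return "allow-remaining" if already_inside else "allow-entry"
    if already_inside:
        # e.g. generate an alarm and notify police / aviation authority
        return "raise-alarm-and-notify"
    return "prevent-entry"

print(respond_to_uav(authorised=True, already_inside=False))  # allow-entry
print(respond_to_uav(authorised=False, already_inside=True))  # raise-alarm-and-notify
```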
- the controller 110 may receive telemetry data wirelessly from the UAV. Examples of telemetry data include, but are not limited to, location, altitude, speed, direction, and battery level.
- the controller 110 may associate the received telemetry data with the received image data (and/or data based thereon) and/or the received identification data (and/or data based thereon).
- the controller 110 may cause the telemetry data to be displayed at the same time as the received image data (and/or data based thereon) and/or the received identification data (and/or data based thereon).
- the controller 110 may receive authorisation data wirelessly from the UAV.
- the UAV may be able to indicate to the computing device 100 that it is authorised to be in a given location.
- the controller 110 may be able to act on the authorisation data received from the UAV in certain scenarios, for example where the controller 110 trusts that the UAV is indeed authorised.
- the controller 110 may associate the other data with the received image data (and/or data based thereon) and/or the received identification data (and/or data based thereon).
- Examples of other data include, but are not limited to, time data and location data. This can be used, along with the received image data (and/or data based thereon), as evidence that the UAV was, or was not, in a particular location at a particular point in time.
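One way to realise the associations described above is to bundle the received image data, identification data, telemetry data and other data into a single record. The sketch below uses a plain dataclass; all field names and values are illustrative assumptions, not details from the source.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Optional

@dataclass
class UAVSighting:
    """Associates received image data with identification, telemetry,
    time and location data. All field names are illustrative assumptions."""
    image_data: bytes                              # received from the camera
    identification_data: str                       # received wirelessly from the UAV
    telemetry: dict = field(default_factory=dict)  # e.g. altitude, speed, direction
    captured_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc))  # time data
    location: Optional[str] = None                 # location data

sighting = UAVSighting(
    image_data=b"placeholder-image-bytes",
    identification_data="UAV-REG-001",
    telemetry={"altitude_m": 120, "speed_mps": 8.5},
    location="51.5074N,0.1278W",
)
# The combined record can later serve as evidence that the UAV was,
# or was not, in a particular location at a particular point in time.
print(sighting.identification_data)  # → UAV-REG-001
```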
- the controller 110 may cause the received image data (and/or data based thereon) to be displayed at the same time as the received identification data (and/or data based thereon).
- the controller 110 may cause both the received image data (and/or data based thereon) and the received identification data (and/or data based thereon) to be displayed together, at the same time, such that a user of the computing device 100 may determine that the received identification data (and/or data based thereon) relates to the UAV.
- the received image data (and/or data based thereon) can be displayed at the same time as the received identification data (and/or data based thereon) on the display 140 of the computing device 100 in which the camera 120 is comprised.
- the controller 110 may cause the received image data (and/or data based thereon) to be displayed at the same time as the received identification data (and/or data based thereon) without specific user input.
- the controller 110 may cause the received image data (and/or data based thereon) to be displayed at the same time as the received identification data (and/or data based thereon) in response to specific predetermined user input.
- the controller 110 may cause the received image data (and/or data based thereon) to be displayed initially without the received identification data (and/or data based thereon) and, in response to specific user input, cause the identification data (and/or data based thereon) to be displayed at the same time as the received image data (and/or data based thereon).
- Such specific user input may, for example, comprise the user selecting a representation of the UAV on the display 140 of the computing device 100 , selecting a soft button on the display 140 of the computing device 100 , etc.
- the controller 110 may cause the received image data (and/or data based thereon) to be displayed on the display 140 and also cause the received identification data (and/or data based thereon) to be displayed on the display 140 , but at different times. For example, the controller 110 may cause the received image data (and/or data based thereon) to be displayed initially without the received identification data (and/or data based thereon) and, in response to specific user input, cause the identification data (and/or data based thereon) to be displayed instead of the received image data (and/or data based thereon).
- Such specific user input may, for example, comprise the user selecting a representation of the UAV on the display 140 of the computing device 100 , selecting a soft button on the display 140 of the computing device 100 , etc.
- the received image data (and/or data based thereon) and the received identification data (and/or data based thereon) may be displayed in real-time in relation to activity of the UAV.
- the display 140 may display the received image data (and/or data based thereon) and the received identification data (and/or data based thereon) while the UAV is still within the field of view of the camera 120 .
- the received image data (and/or data based thereon) and the received identification data (and/or data based thereon) may be displayed in non-real-time in relation to activity of the UAV.
- the display 140 may display the received image data (and/or data based thereon) and the received identification data (and/or data based thereon) after the UAV has left the field of view of the camera 120 .
- the scene represented in the received image data may comprise multiple UAVs.
- the controller 110 may determine that the received identification data relates to a particular one of the multiple UAVs and associate the received identification data (and/or data based thereon) with the particular one of the multiple UAVs accordingly. For example, the controller 110 may be able to determine that a wireless transmission comprising the received identification data originates from the particular one of the multiple UAVs based on determining a direction of arrival of the transmission. Alternatively, the controller 110 may be able to determine that a wireless transmission comprising the received identification data originates from the particular one of the multiple UAVs where the received identification data is associated with a particular type of UAV and where the controller 110 can recognise one of the multiple UAVs as being of that particular type.
- the controller 110 may be unable to determine which particular one of the multiple UAVs the received identification data relates to but may nevertheless associate the received identification data (and/or data based thereon) with the received image data. For example, it may be sufficient to record image data of an unauthorised UAV in a restricted flight zone even if an authorised UAV was present in the restricted flight zone at the same time.
- the computing device 100 may indicate the number of UAVs in the scene represented in the received image data.
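The direction-of-arrival matching described above can be sketched as follows: the bearing at which the wireless transmission arrived is compared against the bearing of each UAV detected in the scene, and the identification data is associated with the closest match within a tolerance. The controller is assumed to have already estimated a direction of arrival (for example from an antenna array) and a bearing for each detected UAV; both estimation steps, and all angles and thresholds below, are illustrative assumptions.

```python
def associate_by_direction(doa_deg, uav_bearings_deg, tolerance_deg=10.0):
    """Return the index of the UAV whose bearing best matches the direction
    of arrival of the transmission, or None if no UAV is within tolerance."""
    def angular_diff(a, b):
        # Smallest absolute difference between two bearings, in degrees.
        return abs((a - b + 180.0) % 360.0 - 180.0)

    best, best_diff = None, tolerance_deg
    for i, bearing in enumerate(uav_bearings_deg):
        diff = angular_diff(doa_deg, bearing)
        if diff <= best_diff:
            best, best_diff = i, diff
    return best

# Three UAVs detected in the scene at bearings 30°, 95° and 200°; the
# transmission carrying the identification data arrived from roughly 97°.
print(associate_by_direction(97.0, [30.0, 95.0, 200.0]))  # → 1
```

If no UAV falls within the tolerance, the identification data can still be associated with the image data as a whole, as described above.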
- the controller 110 may associate the further identification data (and/or data based thereon) with the received image data (and/or data based thereon).
- the controller 110 may be able to determine that the further identification data relates to a particular further one of the multiple UAVs or otherwise.
- Image data is received from a camera.
- the camera may or may not be comprised in the computing device.
- the received image data represents a scene comprising an unmanned aerial vehicle, UAV.
- Identification data is received wirelessly from the UAV.
- the received image data is associated with the received identification data.
- image data representing a scene comprising a UAV is associated with identification data received wirelessly from the UAV. This can be used, for example, in providing photographic evidence that the UAV was at a particular location at a particular point in time, in identifying one or more attributes of, or associated with, the UAV, etc.
- the computing device and/or method may be associated with a restricted flight zone. It may be especially effective to be able to track, record and/or monitor such information in relation to a restricted flight zone which may, for example, correspond to a sensitive location in which only authorised UAVs should be allowed to fly.
- examples also relate to non-restricted flight zones. For example, an individual may take a photograph and/or video of a UAV on their smartphone and have identification data associated with the UAV recorded with the photograph and/or video, for example where the individual suspects contravention of a regulation, an invasion of privacy, etc.
- a first action may be performed in response to determining that the UAV is authorised to be in the restricted flight zone and a second, different action may be performed in response to determining that the UAV is not authorised to be in the restricted flight zone. This allows different actions to be taken depending on the nature of the UAV and its reasons for being in the restricted flight zone.
- Performing the first action may comprise allowing the UAV to enter the restricted flight zone.
- Performing the first action may comprise allowing the UAV to remain in the restricted flight zone.
- Performing the first action may comprise allowing the UAV to leave the restricted flight zone.
- access control to, in and/or from the restricted flight zone may be provided by the computing device.
- Performing the second action may comprise preventing the UAV from entering the restricted flight zone.
- Performing the second action may comprise preventing the UAV from remaining in the restricted flight zone.
- Performing the second action may comprise preventing the UAV from leaving the restricted flight zone.
- access control to, in and/or from the restricted flight zone may be provided by the computing device.
- Performing the second action may comprise generating an alarm.
- an entity can be alerted to the presence of the unauthorised UAV in the restricted flight zone. Examples of such an entity include, but are not limited to, people in the restricted flight zone.
- Performing the second action may comprise notifying an entity associated with the UAV.
- An entity associated with the UAV may, for example, be its operator.
- Performing the second action may comprise notifying an entity that is not associated with the UAV.
- An entity not associated with the UAV may, for example, be a law enforcement agency.
- Performing the second action may comprise taking over control of the UAV.
- it may be possible to remove the UAV from the restricted flight zone, for example by flying the UAV to a non-restricted flight zone.
- the received identification data may be stored in association with the received image data.
- a record can be kept of the association, for example for training, law enforcement, audit etc purposes.
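Storing the association can be as simple as writing the image alongside a sidecar metadata record. The sketch below keeps a JSON record next to the image for audit purposes; the file layout and names are assumptions, not details from the source.

```python
import json
import tempfile
from pathlib import Path

def store_sighting(image_bytes: bytes, identification_data: str, directory):
    """Store the received image data in association with the received
    identification data, using a JSON sidecar file as the association record."""
    directory = Path(directory)
    directory.mkdir(parents=True, exist_ok=True)
    image_path = directory / "sighting.jpg"
    image_path.write_bytes(image_bytes)
    # Sidecar record associating the image with the identification data.
    meta_path = directory / "sighting.json"
    meta_path.write_text(json.dumps({
        "image_file": image_path.name,
        "identification_data": identification_data,
    }, indent=2))
    return image_path, meta_path

record_dir = tempfile.mkdtemp()  # stand-in for persistent storage
img, meta = store_sighting(b"\xff\xd8placeholder", "UAV-REG-001", record_dir)
print(json.loads(meta.read_text())["identification_data"])  # → UAV-REG-001
```

The same record could equally be transmitted to another entity, as described below, rather than stored locally.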
- the received identification data may be transmitted in association with the received image data.
- an entity other than the computing device can be informed of the association, for example for training, law enforcement, audit etc purposes.
- a look-up may be performed for additional data associated with the UAV using the received identification data. This can supplement the information available to the computing device in relation to the UAV.
- Telemetry data may be received wirelessly from the UAV.
- the received telemetry data may be associated with the received image data and/or the received identification data.
- the computing device has additional data on which to base actions, to provide to a user etc.
- the received image data may be caused to be displayed at the same time as the received identification data. This may facilitate remedial action being taken promptly.
- the received image data may be caused to be displayed at the same time as the received identification data on a display of a computing device in which the camera is comprised. This may facilitate remedial action being taken promptly.
- the received identification data may comprise a registration identifier of the UAV. This can facilitate identification of the UAV and/or an associated entity.
- the received identification data may comprise an equipment identifier of the UAV. This can facilitate identification of the UAV and/or an associated entity.
- the received identification data may comprise a contact identifier of an entity associated with the UAV. This can facilitate identification of the UAV and/or an associated entity.
- the received identification data may comprise a wireless local area network, WLAN, identifier associated with the wireless reception of the identification data from the UAV. This can facilitate identification of the UAV and/or an associated entity when the relationship between the WLAN identifier and the UAV and/or an associated entity is known.
- the WLAN identifier may comprise a Basic Service Set ID, BSSID. This may provide a relatively accurate technique for identifying the UAV based on the WLAN identifier, compared to the use of a Service Set ID, SSID.
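As a sketch of how a BSSID might be used in practice, the computing device could compare the BSSID observed during reception against a registry mapping BSSIDs to UAV equipment identifiers. The registry and its contents are hypothetical, and how the BSSID is obtained from the WLAN interface is platform-specific and not shown.

```python
# Hypothetical registry mapping known BSSIDs (MAC-format identifiers of a
# UAV's WLAN access point) to UAV equipment identifiers.
BSSID_REGISTRY = {
    "60:60:1f:aa:bb:cc": "UAV-EQUIP-7781",
}

def normalise_bssid(bssid: str) -> str:
    """Normalise a BSSID to lowercase, colon-separated form."""
    return bssid.replace("-", ":").lower()

def identify_by_bssid(bssid: str):
    """Look up a UAV equipment identifier from an observed BSSID, if known."""
    return BSSID_REGISTRY.get(normalise_bssid(bssid))

print(identify_by_bssid("60-60-1F-AA-BB-CC"))  # → UAV-EQUIP-7781
```

A BSSID is normally unique per access point, which is why it may identify a specific UAV more accurately than an SSID, which is free-form and may be shared.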
- the identification data may be received as a result of transmission of the identification data from the UAV to a control device of an operator of the UAV over a predetermined communications link.
- the techniques described herein may use an existing transmission to identify the UAV. This may result in existing hardware and/or software being used.
- the identification data may be transmitted over the predetermined communications link in an encrypted form and wherein the method comprises decrypting the identification data. This provides both privacy in relation to control of the UAV by an operator and the ability for the computing device (or another entity) to accurately identify the UAV in relation to the image data.
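The decrypt-then-identify flow described above can be sketched as follows. The actual cipher used on the communications link is not specified in the source; the trivial XOR keystream below is a stand-in purely to illustrate the flow, and a real deployment would use an authenticated cipher such as AES-GCM with properly managed keys.

```python
from itertools import cycle

def xor_cipher(data: bytes, key: bytes) -> bytes:
    """Toy symmetric cipher (XOR keystream) standing in for a real link
    cipher. XOR is its own inverse, so the same function encrypts and
    decrypts. Do not use this outside of illustration."""
    return bytes(b ^ k for b, k in zip(data, cycle(key)))

shared_key = b"example-shared-key"   # assumed pre-provisioned key material
plaintext_id = b"UAV-REG-001"

# The UAV transmits the identification data over the predetermined
# communications link in an encrypted form ...
encrypted = xor_cipher(plaintext_id, shared_key)
# ... and the computing device decrypts it to identify the UAV.
decrypted = xor_cipher(encrypted, shared_key)
print(decrypted.decode())  # → UAV-REG-001
```

This preserves the property described above: the link remains private to third parties without the key, while a computing device holding the key can still accurately identify the UAV in relation to the image data.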
- the identification data may be received as a result of broadcasting of the identification data by the UAV.
- existing hardware and/or software used to broadcast identification data may be used.
- the received image data may comprise a photograph of the scene. This can provide photographic evidence to facilitate proving that the UAV was in a particular location at a particular point in time.
- the computing device may comprise a mobile computing device. This can allow a given device to be used in multiple locations across a given area, potentially without multiple fixed-location cameras and/or computing devices to cover the given area.
- the computing device may comprise a UAV. This can allow a UAV to patrol a given area, potentially without multiple fixed-location cameras and/or computing devices to cover the given area.
- Image data representing a scene comprising an unmanned aerial vehicle, UAV is caused to be captured using a camera of the computing device.
- Identification data is received wirelessly from the UAV.
- the captured image data and/or data based on the captured image data is caused to be displayed on a display of the computing device at the same time as the received identification data and/or data based on the received identification data.
- the computing device can conveniently capture a representation of a UAV and display identification data received from the UAV and/or data based thereon, on a display of the computing device, for example so that a user who has controlled the computing device to capture the representation can also see the identification data and/or the data based thereon on a display of the same computing device.
- a photograph of a scene comprising an unmanned aerial vehicle, UAV is caused to be captured using a camera of the computing device.
- Identification data is received wirelessly from the UAV.
- the received identification data and/or data based on the received identification data is caused to be displayed on a display of the computing device at the same time as the photograph of the scene comprising the UAV.
- the computing device can conveniently capture a photograph of a UAV and display identification data received from the UAV and/or data based thereon, on a display of the computing device, for example so that a user who has controlled the computing device to capture the photograph can also see the identification data and/or the data based thereon on a display of the same computing device.
- the computing device 100 may comprise a different combination of components (for example fewer, more, different).
- image data received from a camera represents a scene comprising an unmanned aerial vehicle, UAV and in which identification data received wirelessly from the UAV is associated with the received image data.
- the received image data (and/or data derived therefrom) and the received identification data (and/or data derived therefrom) can be displayed at the same time on a display.
- Another approach would be to display received identification data (and/or data derived therefrom) by itself or along with data that is not image data received from a camera (and/or data derived therefrom).
- the received identification data could be displayed by itself, in tabular form along with the time and/or location at which the identification data was received, could be displayed on a map of a flight zone (the map not being based on data captured by the camera) etc.
- a method of controlling a computing device comprising:
- a method according to clause 1 wherein the method is performed in relation to a restricted flight zone.
- performing the first action comprises allowing the UAV to enter the restricted flight zone.
- performing the first action comprises allowing the UAV to remain in the restricted flight zone.
- performing the first action comprises allowing the UAV to leave the restricted flight zone.
- performing the second action comprises preventing the UAV from entering the restricted flight zone.
- performing the second action comprises preventing the UAV from remaining in the restricted flight zone.
- performing the second action comprises preventing the UAV from leaving the restricted flight zone.
- a method according to any of clauses 4 to 10, wherein performing the second action comprises generating an alarm.
- a method according to any of clauses 4 to 11, wherein performing the second action comprises notifying an entity associated with the UAV.
- a method according to any of clauses 4 to 12, wherein performing the second action comprises notifying an entity that is not associated with the UAV.
- a method according to any of clauses 4 to 13, wherein performing the second action comprises taking over control of the UAV.
- a method according to any of clauses 1 to 14, comprising storing the received identification data in association with the received image data.
- a method according to any of clauses 1 to 15, comprising transmitting the received identification data in association with the received image data.
- a method according to any of clauses 1 to 16, comprising performing a look-up for additional data associated with the UAV using the received identification data.
- a method according to any of clauses 1 to 17, comprising:
- a method according to any of clauses 1 to 20, wherein the received identification data comprises a registration identifier of the UAV.
- the received identification data comprises an equipment identifier of the UAV.
- the received identification data comprises a contact identifier of an entity associated with the UAV.
- a method of controlling a computing device comprising:
- a computer program comprising instructions which, when executed, cause a computing device to perform a method according to any of clauses 1 to 33.
- a computing device arranged to perform a method according to any of clauses 1 to 33.
- a controller device arranged to perform a method according to any of clauses 1 to 33.
Abstract
A handheld computing device comprises: a display; a camera operable to capture image data representing a scene, the scene comprising a UAV; an RF receiver configured to receive identification data wirelessly from the UAV as a result of the UAV having broadcast the identification data; and a controller configured to cause the received identification data and/or data based on the received identification data to be displayed on the display at the same time as a representation of the UAV.
Description
- This application claims priority under 35 U.S.C. § 119(a) to UK Patent Application Nos. GB1717001.0, filed on Oct. 16, 2017, and GB1802095.8, filed on Feb. 8, 2018. The entire content of each of these patent applications is hereby incorporated by reference.
- This application is a continuation of U.S. patent application Ser. No. 16/833,906, filed Mar. 30, 2020, which is incorporated by reference in its entirety. U.S. patent application Ser. No. 16/833,906 is a continuation of U.S. patent application Ser. No. 16/159,751, filed Oct. 15, 2018, which is incorporated by reference in its entirety.
- This disclosure relates to methods, computer programs, computing devices and controllers.
- An unmanned aerial vehicle, UAV, which may also be known as a ‘drone’ or an ‘unmanned aircraft system (UAS)’, is an aircraft that does not have a human pilot aboard. The UAV may be controlled in real-time by a human operator and/or may operate with a degree of autonomy. Although UAVs provide new opportunities, for example in terms of exploration, there are concerns around UAVs, for example in terms of security and privacy.
- According to first embodiments, there is provided a method of controlling a computing device, the method comprising:
- receiving image data from a camera, the received image data representing a scene comprising an unmanned aerial vehicle, UAV;
- receiving identification data wirelessly from the UAV; and
- associating the received image data with the received identification data.
- According to second embodiments, there is provided a method of controlling a computing device, the method comprising:
- causing image data representing a scene comprising an unmanned aerial vehicle, UAV, to be captured using a camera of the computing device;
- receiving identification data wirelessly from the UAV; and
- causing the captured image data and/or data based on the captured image data to be displayed on a display of the computing device at the same time as the received identification data and/or data based on the received identification data.
- According to third embodiments, there is provided a method of controlling a computing device, the method comprising:
- causing a photograph of a scene comprising an unmanned aerial vehicle, UAV, to be captured using a camera of the computing device;
- receiving identification data wirelessly from the UAV; and
- causing the received identification data and/or data based on the received identification data to be displayed on a display of the computing device at the same time as the photograph of the scene comprising the UAV.
- According to fourth embodiments, there is provided a computer program comprising instructions which, when executed, cause a computing device to perform a method provided in accordance with the first, second and/or third embodiments.
- According to fifth embodiments, there is provided a computing device configured to perform a method provided in accordance with the first, second and/or third embodiments.
- According to sixth embodiments, there is provided a controller for a computing device, the controller being configured to perform a method provided in accordance with the first, second and/or third embodiments.
- Various features will now be described, by way of example only, with reference to the accompanying drawing in which:
-
FIG. 1 shows a block diagram of an example computing device in accordance with embodiments. - In examples described herein, a computing device associates identification data received wirelessly from a UAV with image data captured by a camera, the image data representing a scene comprising the UAV. The computing device may be useful, for example, in providing photographic evidence that the UAV was, or was not, at a particular location at a particular point in time, in allowing a user of the computing device to identify one or more attributes of, or associated with, the UAV, etc. In various examples described herein, the computing device comprises a mobile computing device, such as a smartphone or tablet computing device. As such, the functionality described herein may, in some examples, be made readily available to members of the public using their existing computing devices. For example, software may be downloaded into an existing computing device to provide the computing device with the functionality described herein.
- Referring to
FIG. 1, there is shown schematically an example of a computing device 100. The computing device 100 may take various forms. The computing device 100 may be arranged in one or multiple geographical locations. For example, the computing device 100 may comprise a distributed computing system. The computing device 100 may be provided in one or more housings. The computing device 100 may process some or all data locally, within the computing device 100. The computing device 100 may use one or more cloud-based services to process some or all data. - In some examples, the
computing device 100 comprises a mobile computing device, although the computing device could be fixed-location in other examples. Where the computing device 100 comprises a mobile computing device, the computing device 100 may comprise one or more elements in addition to the mobile computing device. Examples of mobile computing devices include, but are not limited to, wearable devices, smartphones, laptop computing devices, dedicated portable UAV-monitoring equipment, UAV remote control devices (for example handheld UAV remote control devices), and tablet computing devices. The mobile computing device may be a handheld computing device. Such devices may be relatively inexpensive compared, for example, to more complicated UAV monitoring equipment but may be sufficiently powerful to perform the techniques described herein in at least some desired scenarios. Further, some users may already have such devices and may be able to use such devices to perform the techniques described herein with relatively low additional expenditure, without acquiring further hardware etc. - Portability of the
computing device 100 may be effective where a user wishes to perform the techniques described herein in different locations, using the same device. Existing, fixed-location UAV monitoring equipment may not be designed or suitable for this. - In some examples, the
computing device 100 comprises a UAV (different from the UAV comprised in the scene represented in the received image data). Where the computing device 100 comprises a UAV, the computing device 100 may comprise one or more elements in addition to the UAV. A UAV may therefore perform the techniques described herein in relation to a further UAV. Using a UAV may provide flexibility where, for example, the UAV can approach the UAV comprised in the scene represented in the received image data, for example to interrogate the UAV, take a close-up photograph or video of the UAV, follow the UAV etc. In a similar manner to a mobile computing device, a UAV may enable the techniques described herein to be deployed in different locations using the same hardware. A UAV with a camera may, for example, patrol a given airspace, covering a relatively large area compared to a fixed-location camera or computing system. Further, a UAV may be dispatched on-demand from a given location to a different location where a further UAV of interest is in the different location. As such, the number of computing systems that cover a given area may be lower where the computing system comprises a UAV than where multiple fixed-location computing systems are used to cover the given area. - In this example, the
computing device 100 comprises a controller 110. The controller 110 is communicatively coupled to one or more other components of the computing device 100, for example via a bus. The controller 110 may, for example, comprise a microprocessor. - In this example, the
computing device 100 comprises one or more sensors 120. In this example, the computing device 100 comprises a sensor in the form of a camera 120. The camera 120 may, for example, capture still image data and/or video data. The camera 120 may, for example, capture visible light and/or infrared. Other types of sensor 120 include, but are not limited to, ultrasonic sensors, Light Detection And Ranging (LiDAR) sensors etc. References to image data will be understood accordingly to be data captured by the sensor, dependent on the type of sensor. - In this example, the
computing device 100 comprises a transceiver 130. Although the transceiver 130 is depicted as a single component in FIG. 1, comprising both transmission and reception functionality, the transceiver 130 could in other examples comprise separate transmitter and receiver components. The transceiver 130 may transmit and receive on the same, or different, frequencies. The transceiver 130 may transmit to and receive from the same, or different, entities. The transceiver 130 may transmit and receive using the same, or different, communication protocols. The transceiver 130 may be operable to transmit and receive simultaneously, or otherwise. The transceiver 130 may operate in the radio frequency (RF) part of the electromagnetic spectrum. - In this example, the
computing device 100 comprises a display 140. The display 140 may, for example, comprise a touch-sensitive display. However, other types of display may be used. - In some examples, the
camera 120 and display 140 are on different surfaces of the computing device 100. For example, the camera 120 may be on a front surface of the computing device 100 and the display 140 may be on a rear surface of the computing device 100. As such, a user of the computing device 100 may be able to point the camera 120 at the UAV and see a representation of the UAV on the display 140 in real-time, as if they were seeing the UAV ‘through’ the computing device 100. However, in some examples, in which the camera 120 and display 140 are on different surfaces of the computing device 100, the user can alternatively or additionally point the camera 120 at the UAV, capture image data representing the UAV, and see the representation of the UAV on the display 140 at a later point in time. - In this example, the
computing device 100 comprises memory 150. The memory may store one or more computer programs. The one or more computer programs may comprise computer-readable instructions. The computing device 100, for example the controller 110, may be configured to execute the one or more computer programs and, as a result, perform at least some of the techniques described herein. The one or more computer programs may be downloaded onto the computing device 100. For example, the one or more computer programs may be downloaded from a computer program store. - In this example, the
controller 110 receives image data from the camera 120. The received image data represents a scene comprising a UAV. The scene may comprise one or more further objects. Examples of such further objects include, but are not limited to, UAVs, other vehicles, people and buildings. The received image data may comprise a photograph of the scene. The received image data may have been subject to image processing prior to being received at the controller 110. An example of such image processing is object recognition, for example to identify the UAV and/or further objects. Such object recognition may be performed using a trained Artificial Neural Network (ANN), for example. - In this example, the
controller 110 receives identification data wirelessly from the UAV. The identification data is usable to identify at least one attribute of, or associated with, the UAV. In this example, the controller 110 receives the identification data wirelessly from the UAV via the transceiver 130. - The
controller 110 may use standardised wireless technology to receive the identification data. Standardised wireless technology may be considered to be wireless technology that is the subject of one or more standards. This can facilitate interoperability and/or adoption compared, for example, to proprietary wireless technology. However, proprietary wireless technology may be used in some examples. For example, proprietary wireless technology may allow enhanced customisation compared to standardised wireless technology. - The
computing device 100 may receive the identification data wirelessly from the UAV via a wireless local area network (WLAN), for example in accordance with Wi-Fi™ technology. For example, the computing device 100 may receive the identification data on the 2.4 GHz and/or 5.8 GHz Wi-Fi™ bands. This may provide a relatively large operating range and relatively low power consumption compared to some short-range technologies such as Bluetooth™. The computing device 100 may use one or more designated Wi-Fi™ channels for the reception of the identification data. - However, the identification data may be received via a short-range technology, of which Bluetooth™ is an example. Bluetooth™ may have a typical operating range of around 10 m-100 m. Bluetooth™ 5.0 may have a typical operating range of around 40 m-400 m. The
computing device 100 may pair with the UAV in order to receive the identification data via Bluetooth™. The computing device 100 may, however, be able to receive the identification data from the UAV over Bluetooth™ without being paired with the UAV. - Further, using a technology such as Wi-Fi™ or Bluetooth™ may not be reliant upon the availability of a cellular network, for example a 4G system, where connectivity may be limited in remote locations, where the cost of use may be relatively high etc. Further, such technologies may be supported by a relatively large number of computing devices. For example, most smartphones currently support both Wi-Fi™ and Bluetooth™.
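As an illustration of the reception path described above, the loopback sketch below stands in for a receiver listening for broadcast identification data on a designated port. The port number and payload format are invented for the example, and a real receiver would listen on the wireless interface rather than loopback.

```python
import socket

# Loopback sketch of receiving broadcast identification data.
# Port number and "ID:" payload format are assumptions for illustration.
PORT = 47123

rx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
rx.bind(("127.0.0.1", PORT))
rx.settimeout(2.0)  # avoid blocking forever if nothing arrives

# A sender on the same host stands in for the broadcasting UAV.
tx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
tx.sendto(b"ID:GBR-UAV-0042", ("127.0.0.1", PORT))

payload, addr = rx.recvfrom(1024)
print(payload.decode())  # -> ID:GBR-UAV-0042
rx.close()
tx.close()
```

In a deployment the receive loop would run continuously and hand each datagram to a parser; the single send/receive here only demonstrates the datagram path.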
- The
computing device 100 may be configured to parse the identification data from wireless transmissions from the UAV. For example, the computing device 100 may receive data in a given format from the UAV and may extract a given portion of the received data corresponding to the identification data, where the given portion is known to correspond to the identification data. The received data may be in a standardised or proprietary format having a predefined syntax, for example. The computing device 100 may be configured to discard at least some data other than the identification data in wireless transmissions received from the UAV. - The
computing device 100 may receive the identification data from the UAV while the UAV is within the field of view of the camera 120, or otherwise. Receiving the identification data from the UAV while the UAV is within the field of view of the camera 120 may provide temporal correlation between the received image data and the received identification data which can help in associating the received identification data with the received image data. For example, a user of the computing device 100 may be able to capture a photograph of the UAV along with the identification data. - The
controller 110 may transmit an identification request to the UAV. For example, the controller 110 may detect the presence of the UAV and transmit an identification request to the UAV. The controller 110 may receive the identification data from the UAV in response to the identification request, or otherwise. As such, the computing device 100 may interrogate the UAV to request the UAV to identify itself. In some examples, however, the UAV may transmit the identification data without being prompted to do so. For example, the UAV may be configured to transmit the identification data intermittently. - The received identification data may comprise a registration identifier of the UAV. The registration identifier may facilitate identification of the UAV with a UAV-registration body.
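The parsing of a fixed-format transmission into an identification portion, with other data discarded, might be sketched as follows. The frame layout, message-type code and identifier width are assumptions for illustration only; real broadcast formats differ.

```python
import struct

# Hypothetical fixed-layout frame: 2-byte message type, then a 20-byte
# zero-padded registration identifier, then an arbitrary payload.
ID_MESSAGE_TYPE = 0x0010  # assumed type code for identification frames

def parse_identification(frame: bytes):
    """Extract the identification portion of a frame; discard the rest."""
    if len(frame) < 22:
        return None
    msg_type, reg_id = struct.unpack(">H20s", frame[:22])
    if msg_type != ID_MESSAGE_TYPE:
        return None  # not an identification frame: discard
    # Strip zero padding from the fixed-width identifier field.
    return reg_id.rstrip(b"\x00").decode("ascii", errors="replace")

frame = struct.pack(">H20s", 0x0010, b"GBR-UAV-0042") + b"extra telemetry bytes"
print(parse_identification(frame))  # -> GBR-UAV-0042
```

Any bytes after the identifier field (telemetry here) are simply ignored, matching the idea of discarding data other than the identification data.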
- The received identification data may comprise an equipment identifier of the UAV. The equipment identifier may facilitate identification of the UAV based on being able to identify the UAV equipment itself. The received identification data, for example the equipment identifier, may comprise a UAV make identifier, identifying a make of the UAV. The received identification data, for example the equipment identifier, may comprise a UAV model identifier, identifying a model of the UAV. The received identification data, for example the equipment identifier, may comprise a UAV type identifier, identifying a type of the UAV.
- The received identification data may comprise a contact identifier of an entity associated with the UAV. For example, the contact identifier may facilitate contact with an owner, operator or other entity associated with the UAV.
- The received identification data may comprise a WLAN identifier associated with the wireless reception of the identification data from the UAV. For example, the WLAN identifier may enable the UAV to be identified, where the WLAN identifier is known to be associated with the UAV. The WLAN identifier may, for example, comprise a Service Set ID (SSID) and/or a Basic Service Set ID (BSSID). A BSSID may be more reliable than an SSID in identifying a UAV, since an SSID may be changed by a user and may not be sufficiently unique across multiple UAVs, whereas a BSSID is more likely to be useable to readily identify an individual UAV. However, an SSID may be used in some examples.
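A minimal sketch of the BSSID-first look-up described above, with the SSID as the less reliable fallback. The registry contents and identifier values are invented for illustration.

```python
# Hypothetical registry mapping WLAN identifiers to UAV registrations.
KNOWN_UAVS = {
    "bssid": {"aa:bb:cc:dd:ee:01": "UAV-registration-7781"},
    "ssid": {"DroneNet": "UAV-registration-7781"},
}

def identify_by_wlan(bssid, ssid):
    """Resolve a UAV registration from WLAN identifiers, preferring BSSID."""
    if bssid and bssid.lower() in KNOWN_UAVS["bssid"]:
        return KNOWN_UAVS["bssid"][bssid.lower()]
    if ssid and ssid in KNOWN_UAVS["ssid"]:
        # Less reliable: SSIDs are user-editable and may not be unique.
        return KNOWN_UAVS["ssid"][ssid]
    return None

print(identify_by_wlan("AA:BB:CC:DD:EE:01", None))  # -> UAV-registration-7781
```

Normalising the BSSID to lower case before the look-up avoids misses caused purely by letter-case differences in MAC-style identifiers.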
- The received identification data may comprise one or multiple different types of identifier. The received identification data may not comprise any personally identifiable information, namely information that can be used to identify a human operator, owner, or the like of the UAV. This can help to preserve privacy, particularly but not exclusively where the UAV broadcasts or otherwise advertises the identification data to interested parties, as opposed to transmitting the identification data in a more targeted manner.
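The exclusion of personally identifiable information could be enforced by stripping PII fields from an identification record before it is stored or displayed, as in the sketch below. The field names are assumptions; a real schema would define which fields count as PII.

```python
# Assumed PII field names for illustration only.
PII_FIELDS = {"operator_name", "operator_phone", "operator_email"}

def redact_pii(identification: dict) -> dict:
    """Return a copy of the identification record with PII fields removed."""
    return {k: v for k, v in identification.items() if k not in PII_FIELDS}

record = {
    "registration_id": "GBR-UAV-0042",
    "model": "QuadX-2",
    "operator_name": "J. Smith",  # hypothetical PII field to be dropped
}
print(redact_pii(record))  # -> {'registration_id': 'GBR-UAV-0042', 'model': 'QuadX-2'}
```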
- The
controller 110 may receive the identification data as a result of transmission of the identification data from the UAV to a control device of an operator of the UAV over a predetermined communications link. The predetermined communications link may be a control link. As such, the controller 110 may, in effect, intercept the control data and use the control data in accordance with the techniques described herein. The identification data may be transmitted over the predetermined communications link from the UAV to the control device in an encrypted form. The controller 110 may decrypt the identification data. For example, the controller 110 may have access to a decryption key useable to decode data communicated over the predetermined communications link. Different UAVs may use different encryption protocols, keys etc. For example, different users may use different keys and/or protocols to encrypt such data, different manufacturers may use different keys and/or protocols etc. The controller 110 may have access to multiple different decryption keys and/or may be able to use multiple different decryption techniques to provide enhanced interoperability in relation to different users, manufacturers etc. - The
controller 110 may receive the identification data as a result of broadcasting of the identification data by the UAV. For example, the UAV may broadcast the identification data to interested receiving devices in addition to, or as an alternative to, transmitting the identification data to the control device over the predetermined communications link. For example, where the UAV is operating autonomously, not under the control of a human operator, a control link to a control device may not be available. The identification data may be broadcast intermittently, for example periodically. - In this example, the
controller 110 associates the received image data (and/or data based thereon) with the received identification data (and/or data based thereon). Examples of data based on the received image data include, but are not limited to, compressed image data, encrypted image data, map data generated based on the received image data etc. Examples of data based on the received identification data include, but are not limited to, contact information associated with an owner of the UAV, registration information associated with the UAV, telemetry information associated with the UAV, a cryptographically derived version of the received identification data (for example a hash of the received identification data) etc. - The ability to capture image data of a scene comprising the UAV at the same time as receiving the identification data wirelessly from the UAV may depend on transmission capabilities of the UAV, properties of the
camera 120 and/or properties of the transceiver 130. For example, the transceiver 130 may be able to receive signals from a UAV several miles away (potentially several or even tens of miles away), but the camera 120 may be unable to capture image data of a UAV such a distance away. - The
controller 110 may store the received identification data (and/or data based thereon) in association with the received image data (and/or data based thereon). The controller 110 may store the received identification data (and/or data based thereon) in association with the received image data (and/or data based thereon) in the memory 150 of the computing device 100, or elsewhere. - The
controller 110 may transmit the received identification data (and/or data based thereon) in association with the received image data (and/or data based thereon). The controller 110 may transmit the received identification data (and/or data based thereon) in association with the received image data (and/or data based thereon) via the transceiver 130, or otherwise. The controller 110 may transmit the received identification data (and/or data based thereon) together with or separately from the received image data (and/or data based thereon). - The
controller 110 may, however, not transmit data in this manner, for example where this could cause privacy concerns. - In some examples, a user of the
computing device 100 may be restricted or inhibited from accessing the received identification data (and/or data based thereon). As such, although the association is made between the received identification data (and/or data based thereon) and the received image data (and/or data based thereon), and such data may, for example, be stored in the computing device 100, it may only be accessible by a designated entity (for example a law enforcement agency). - The
controller 110 may perform a look-up for additional data associated with the UAV using the received identification data (and/or data based thereon). Examples of additional data include, but are not limited to, authorisation information, flight path information etc. Performing the look-up may comprise querying a database using the received identification data (and/or data based thereon). The database may be local to or remote from the computing device 100. - The
computing device 100 may be associated with a restricted flight zone (or ‘restricted airspace’). The flight zone may be restricted in that one or more restrictions may be in place in relation to the flying of UAVs within the restricted flight zone. The restricted flight zone may correspond to a sensitive location, a landmark, space that can only be entered upon payment etc. Examples of restricted flight zones may include, but are not limited to, airports, military facilities, prisons, infrastructure, and schools. - The computing device 100 (or at least part of it) may be physically located within the restricted flight zone. The computing device 100 (or at least part of it) may not be physically located within the restricted flight zone, but may nevertheless be used to perform computation associated with the restricted flight zone.
- The
controller 110 may determine whether or not the UAV is authorised to be in the restricted flight zone based on the received identification data (and/or data based thereon). For example, the controller 110 may query an authorisation database using the received identification data (and/or data based thereon) and receive an indication of authorisation accordingly. Alternatively, or additionally, the controller 110 may determine whether or not the UAV is authorised to be in the restricted flight zone based on authorisation data received from the UAV. - The
controller 110 may perform a first action in response to determining that the UAV is authorised to be in the restricted flight zone and may perform a second, different action in response to determining that the UAV is not authorised to be in the restricted flight zone. - Examples of the first action include, but are not limited to, allowing the UAV to enter the restricted flight zone (for example where the UAV is not already in the restricted flight zone), allowing the UAV to remain in the restricted flight zone (for example where the UAV is already in the restricted flight zone), allowing the UAV to leave the restricted flight zone (for example where the UAV is already in the restricted flight zone).
- Examples of the second action include, but are not limited to, preventing the UAV from entering the restricted flight zone (for example where the UAV is not already in the restricted flight zone), preventing the UAV from remaining in the restricted flight zone (for example where the UAV is already in the restricted flight zone), preventing the UAV from leaving the restricted flight zone (for example where the UAV is already in the restricted flight zone), generating an alarm, notifying an entity associated with the UAV, notifying an entity that is not associated with the UAV, and taking over control of the UAV (for example to fly it out of a restricted flight zone, cause it to crash etc). Examples of entities that are not associated with the UAV may include, but are not limited to, police, security agencies and aviation authorities.
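The two-branch handling described above might be dispatched as in the sketch below. The concrete first and second actions are placeholders for whichever measures (access control, alarm, notification, takeover) a deployment actually implements.

```python
def handle_uav(registration_id: str, authorised_ids: set) -> str:
    """Choose the first or second action based on an authorisation check."""
    if registration_id in authorised_ids:
        return "allow"        # first action: permit entry/presence/exit
    return "alarm+notify"     # second action: raise alarm, notify authorities

# Hypothetical set of registrations authorised for this flight zone.
authorised = {"GBR-UAV-0042"}
print(handle_uav("GBR-UAV-0042", authorised))  # -> allow
print(handle_uav("GBR-UAV-9999", authorised))  # -> alarm+notify
```

In practice the authorisation check would be the database query or UAV-supplied authorisation data described above rather than an in-memory set.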
- The
controller 110 may receive telemetry data wirelessly from the UAV. Examples of telemetry data include, but are not limited to, location, altitude, speed, direction and battery level. The controller 110 may associate the received telemetry data with the received image data (and/or data based thereon) and/or the received identification data (and/or data based thereon). The controller 110 may cause the telemetry data to be displayed at the same time as the received image data (and/or data based thereon) and/or the received identification data (and/or data based thereon). - The
controller 110 may receive authorisation data wirelessly from the UAV. For example, the UAV may be able to indicate to the computing device 100 that it is authorised to be in a given location. The controller 110 may be able to act on the authorisation data received from the UAV in certain scenarios, for example where the controller 110 trusts that the UAV is indeed authorised. - The
controller 110 may associate other data with the received image data (and/or data based thereon) and/or the received identification data (and/or data based thereon). Examples of such other data include, but are not limited to, time data and location data. This can be used, along with the received image data (and/or data based thereon), as evidence that the UAV was, or was not, in a particular location at a particular point in time. - The
controller 110 may cause the received image data (and/or data based thereon) to be displayed at the same time as the received identification data (and/or data based thereon). In this example, where the computing device 100 comprises the display 140, the controller 110 may cause both the received image data (and/or data based thereon) and the received identification data (and/or data based thereon) to be displayed together, at the same time, such that a user of the computing device 100 may determine that the received identification data (and/or data based thereon) relates to the UAV. As such, the received image data (and/or data based thereon) can be displayed at the same time as the received identification data (and/or data based thereon) on the display 140 of the computing device 100 in which the camera 120 is comprised. - The
controller 110 may cause the received image data (and/or data based thereon) to be displayed at the same time as the received identification data (and/or data based thereon) without specific user input. Alternatively, the controller 110 may cause the received image data (and/or data based thereon) to be displayed at the same time as the received identification data (and/or data based thereon) in response to specific predetermined user input. For example, the controller 110 may cause the received image data (and/or data based thereon) to be displayed initially without the received identification data (and/or data based thereon) and, in response to specific user input, cause the identification data (and/or data based thereon) to be displayed at the same time as the received image data (and/or data based thereon). Such specific user input may, for example, comprise the user selecting a representation of the UAV on the display 140 of the computing device 100, selecting a soft button on the display 140 of the computing device 100, etc. - The
controller 110 may cause the received image data (and/or data based thereon) to be displayed on the display 140 and also cause the received identification data (and/or data based thereon) to be displayed on the display 140, but at different times. For example, the controller 110 may cause the received image data (and/or data based thereon) to be displayed initially without the received identification data (and/or data based thereon) and, in response to specific user input, cause the identification data (and/or data based thereon) to be displayed instead of the received image data (and/or data based thereon). Such specific user input may, for example, comprise the user selecting a representation of the UAV on the display 140 of the computing device 100, selecting a soft button on the display 140 of the computing device 100, etc. - The received image data (and/or data based thereon) and the received identification data (and/or data based thereon) may be displayed in real-time in relation to activity of the UAV. In other words, the
display 140 may display the received image data (and/or data based thereon) and the received identification data (and/or data based thereon) while the UAV is still within the field of view of the camera 120. - The received image data (and/or data based thereon) and the received identification data (and/or data based thereon) may be displayed in non-real-time in relation to activity of the UAV. In other words, the
display 140 may display the received image data (and/or data based thereon) and the received identification data (and/or data based thereon) after the UAV has left the field of view of the camera 120. - The scene represented in the received image data may comprise multiple UAVs. The
controller 110 may determine that the received identification data relates to a particular one of the multiple UAVs and associate the received identification data (and/or data based thereon) with the particular one of the multiple UAVs accordingly. For example, the controller 110 may be able to determine that a wireless transmission comprising the received identification data originates from the particular one of the multiple UAVs based on determining a direction of arrival of the transmission. The controller 110 may be able to determine that a wireless transmission comprising the received identification data originates from the particular one of the multiple UAVs where the received identification data is associated with a particular type of UAV and where the controller 110 can recognise one of the multiple UAVs as being of that particular type. In some cases, the controller 110 may be unable to determine which particular one of the multiple UAVs the received identification data relates to but may nevertheless associate the received identification data (and/or data based thereon) with the received image data. For example, it may be sufficient to record image data of an unauthorised UAV in a restricted flight zone even if an authorised UAV was present in the restricted flight zone at the same time. The computing device 100 may indicate the number of UAVs in the scene represented in the received image data. - Where the
controller 110 receives further identification data wirelessly from a further UAV of the multiple UAVs, the controller 110 may associate the further identification data (and/or data based thereon) with the received image data (and/or data based thereon). The controller 110 may be able to determine that the further identification data relates to a particular further one of the multiple UAVs or otherwise. - Various measures (for example methods, computer programs, computing devices and controllers) are provided in relation to computing devices. Image data is received from a camera. The camera may or may not be comprised in the computing device. The received image data represents a scene comprising an unmanned aerial vehicle, UAV. Identification data is received wirelessly from the UAV. The received image data is associated with the received identification data. As such, image data representing a scene comprising a UAV is associated with identification data received wirelessly from the UAV. This can be used, for example, in providing photographic evidence that the UAV was at a particular location at a particular point in time, in identifying one or more attributes of or associated with the UAV, etc.
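The type-based disambiguation among multiple UAVs described above could be sketched as follows, associating the identification data with a recognised UAV only when the match is unambiguous. Both data structures are illustrative, not from any standard.

```python
def match_uav(identification: dict, recognised_uavs: list):
    """Match identification data to one of several recognised UAVs by type.

    Returns the matching UAV only when exactly one candidate shares the
    type named in the identification data; otherwise returns None, in
    which case the identification data may still be associated with the
    image data as a whole.
    """
    candidates = [u for u in recognised_uavs if u["type"] == identification.get("type")]
    return candidates[0] if len(candidates) == 1 else None

# Hypothetical object-recognition output for a two-UAV scene.
scene = [{"id": 0, "type": "quadcopter"}, {"id": 1, "type": "fixed-wing"}]
print(match_uav({"registration_id": "GBR-UAV-0042", "type": "fixed-wing"}, scene))
# -> {'id': 1, 'type': 'fixed-wing'}
```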
- The computing device and/or method may be associated with a restricted flight zone. It may be especially effective to be able to track, record and/or monitor such information in relation to a restricted flight zone which may, for example, correspond to a sensitive location in which only authorised UAVs should be allowed to fly. However, examples also relate to non-restricted flight zones. For example, an individual may take a photograph and/or video of a UAV on their smartphone and have identification data associated with the UAV recorded with the photograph and/or video, for example where the individual suspects contravention of a regulation, privacy etc.
- It may be determined whether or not the UAV is authorised to be in the restricted flight zone based on the received identification data. In particular, a first action may be performed in response to determining that the UAV is authorised to be in the restricted flight zone and a second, different action may be performed in response to determining that the UAV is not authorised to be in the restricted flight zone. This allows different actions to be taken depending on the nature of the UAV and its reasons for being in the restricted flight zone.
- Performing the first action may comprise allowing the UAV to enter the restricted flight zone. Performing the first action may comprise allowing the UAV to remain in the restricted flight zone. Performing the first action may comprise allowing the UAV to leave the restricted flight zone. As such, access control to, in and/or from the restricted flight zone may be provided by the computing device.
- Performing the second action may comprise preventing the UAV from entering the restricted flight zone. Performing the second action may comprise preventing the UAV from remaining in the restricted flight zone. Performing the second action may comprise preventing the UAV from leaving the restricted flight zone. As such, access control to, in and/or from the restricted flight zone may be provided by the computing device.
- Performing the second action may comprise generating an alarm. As such, an entity can be alerted to the presence of the unauthorised UAV in the restricted flight zone. Examples of such an entity include, but are not limited to, people in the restricted flight zone.
- Performing the second action may comprise notifying an entity associated with the UAV. As such, an entity associated with the UAV (for example its operator) may be able to take remedial action following the unauthorised entry to the restricted flight zone.
- Performing the second action may comprise notifying an entity that is not associated with the UAV. As such, an entity not associated with the UAV (for example a law enforcement agency) may be able to take remedial action following the unauthorised entry to the restricted flight zone.
- Performing the second action may comprise taking over control of the UAV. As such, it may be possible to remove the UAV from the restricted flight zone, for example by flying the UAV to a non-restricted flight zone.
- The received identification data may be stored in association with the received image data. As such, a record can be kept of the association, for example for training, law enforcement, audit etc purposes.
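One way to sketch storing the received identification data in association with the received image data is a single record linking a hash of each, as below. The field names are illustrative rather than taken from any standard, and the hash of the identification data corresponds to the "cryptographically derived version" mentioned earlier.

```python
import hashlib
import json
import time

def associate(image_data: bytes, identification: str) -> dict:
    """Build one record linking image data and identification data."""
    return {
        "image_sha256": hashlib.sha256(image_data).hexdigest(),
        "identification": identification,
        "identification_sha256": hashlib.sha256(identification.encode()).hexdigest(),
        "received_at": time.time(),  # time data, usable as evidence
    }

record = associate(b"\x89PNG...fake image bytes", "GBR-UAV-0042")
print(json.dumps(record, indent=2))
```

The record could be kept in local memory, a local database, or transmitted to another entity, matching the storage and transmission options described above.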
- The received identification data may be transmitted in association with the received image data. As such, an entity other than the computing device can be informed of the association, for example for training, law enforcement, audit etc purposes.
- A look-up may be performed for additional data associated with the UAV using the received identification data. This can supplement the information available to the computing device in relation to the UAV.
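A local-database variant of the look-up might resemble the following sketch, querying authorisation and flight-path fields (both invented for the example) keyed on the received registration identifier.

```python
import sqlite3

# In-memory database standing in for a local or remote UAV registry.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE uav (registration_id TEXT PRIMARY KEY, authorised INTEGER, flight_path TEXT)"
)
conn.execute("INSERT INTO uav VALUES ('GBR-UAV-0042', 1, 'corridor-7')")

def lookup(registration_id: str):
    """Return additional data for a UAV, or None if it is unknown."""
    row = conn.execute(
        "SELECT authorised, flight_path FROM uav WHERE registration_id = ?",
        (registration_id,),
    ).fetchone()
    return None if row is None else {"authorised": bool(row[0]), "flight_path": row[1]}

print(lookup("GBR-UAV-0042"))  # -> {'authorised': True, 'flight_path': 'corridor-7'}
```

The parameterised query keeps the received identifier as data rather than SQL, which matters when the identifier arrives from an untrusted transmitter.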
- Telemetry data may be received wirelessly from the UAV. The received telemetry data may be associated with the received image data and/or the received identification data. As such, the computing device has additional data on which to base actions, to provide to a user etc.
- The received image data may be caused to be displayed at the same time as the received identification data. This may facilitate remedial action being taken promptly.
- The received image data may be caused to be displayed at the same time as the received identification data on a display of a computing device in which the camera is comprised. This may facilitate remedial action being taken promptly.
- The received identification data may comprise a registration identifier of the UAV. This can facilitate identification of the UAV and/or an associated entity.
- The received identification data may comprise an equipment identifier of the UAV. This can facilitate identification of the UAV and/or an associated entity.
- The received identification data may comprise a contact identifier of an entity associated with the UAV. This can facilitate identification of the UAV and/or an associated entity.
- The received identification data may comprise a wireless local area network, WLAN, identifier associated with the wireless reception of the identification data from the UAV. This can facilitate identification of the UAV and/or an associated entity when the relationship between the WLAN identifier and the UAV and/or an associated entity is known.
- The WLAN identifier may comprise a Basic Service Set ID, BSSID. This may provide a relatively accurate technique to identify the UAV based on WLAN identifier compared to the use of a Service Set ID, SSID.
- The identification data may be received as a result of transmission of the identification data from the UAV to a control device of an operator of the UAV over a predetermined communications link. As such, the techniques described herein may use an existing transmission to identify the UAV. This may result in existing hardware and/or software being used.
- The identification data may be transmitted over the predetermined communications link in an encrypted form and wherein the method comprises decrypting the identification data. This provides both privacy in relation to control of the UAV by an operator and the ability for the computing device (or another entity) to accurately identify the UAV in relation to the image data.
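Trying multiple candidate decryption keys until one yields a payload that parses could be sketched as below. The XOR keystream cipher here is a toy stand-in, not a real UAV control-link cipher, and the "ID:" validity marker is an assumption for the example.

```python
import hashlib

def keystream_xor(data: bytes, key: bytes) -> bytes:
    """Toy symmetric cipher: XOR with a SHA-256-derived keystream."""
    stream = hashlib.sha256(key).digest()
    while len(stream) < len(data):
        stream += hashlib.sha256(stream).digest()
    return bytes(a ^ b for a, b in zip(data, stream))

def try_decrypt(ciphertext: bytes, candidate_keys: list):
    """Try each candidate key; accept the first plausible plaintext."""
    for key in candidate_keys:
        plaintext = keystream_xor(ciphertext, key)
        if plaintext.startswith(b"ID:"):  # assumed marker of a valid payload
            return plaintext[3:].decode("ascii")
    return None

# Hypothetical keys for two manufacturers; the message was encrypted with B's.
ct = keystream_xor(b"ID:GBR-UAV-0042", b"manufacturer-B-key")
print(try_decrypt(ct, [b"manufacturer-A-key", b"manufacturer-B-key"]))
```

Holding several keys and accepting whichever one produces a well-formed payload is what gives the controller interoperability across users and manufacturers, as described above.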
- The identification data may be received as a result of broadcasting of the identification data by the UAV. As such, existing hardware and/or software used to broadcast identification data may be used.
- The received image data may comprise a photograph of the scene. This can provide photographic evidence to facilitate proving that the UAV was in a particular location at a particular point in time.
- The computing device may comprise a mobile computing device. This can allow a given device to be used in multiple locations across a given area, potentially without multiple fixed-location cameras and/or computing devices to cover the given area.
- The computing device may comprise a UAV. This can allow a UAV to patrol a given area, potentially without multiple fixed-location cameras and/or computing devices to cover the given area.
- Various measures (for example methods, computer programs, computing devices and controllers) are provided in relation to computing devices. Image data representing a scene comprising an unmanned aerial vehicle, UAV, is caused to be captured using a camera of the computing device. Identification data is received wirelessly from the UAV. The captured image data and/or data based on the captured image data is caused to be displayed on a display of the computing device at the same time as the received identification data and/or data based on the received identification data. As such, the computing device can conveniently capture a representation of a UAV and display identification data received from the UAV and/or data based thereon, on a display of the computing device, for example so that a user who has controlled the computing device to capture the representation can also see the identification data and/or the data based thereon on a display of the same computing device.
- Various measures (for example methods, computer programs, computing devices and controllers) are provided in relation to computing devices. A photograph of a scene comprising an unmanned aerial vehicle, UAV, is caused to be captured using a camera of the computing device. Identification data is received wirelessly from the UAV. The received identification data and/or data based on the received identification data is caused to be displayed on a display of the computing device at the same time as the photograph of the scene comprising the UAV. As such, the computing device can conveniently capture a photograph of a UAV and display identification data received from the UAV and/or data based thereon, on a display of the computing device, for example so that a user who has controlled the computing device to capture the photograph can also see the identification data and/or the data based thereon on a display of the same computing device.
- Various modifications and alternatives will be apparent to one skilled in the art. In particular, although several components are depicted in the
example computing device 100 shown in FIG. 1, the computing device 100 may comprise a different combination of components (for example fewer, more, different). - Examples have been described above in which techniques are performed in relation to restricted airspace. Other triggers or conditions for the techniques described herein are envisaged including, but not limited to, detecting dangerous flying of the UAV (for example by the camera), detection of transmissions from the UAV (for example the UAV coming into range of the computing device 100), an explicit request to monitor the UAV (for example from a controller of the UAV or otherwise), etc.
- Examples have been described above in which image data received from a camera represents a scene comprising an unmanned aerial vehicle, UAV and in which identification data received wirelessly from the UAV is associated with the received image data. For example, the received image data (and/or data derived therefrom) and the received identification data (and/or data derived therefrom) can be displayed at the same time on a display. Another approach would be to display received identification data (and/or data derived therefrom) by itself or along with data that is not image data received from a camera (and/or data derived therefrom). For example, the received identification data (and/or data derived therefrom) could be displayed by itself, in tabular form along with the time and/or location at which the identification data was received, could be displayed on a map of a flight zone (the map not being based on data captured by the camera) etc.
- The following numbered clauses on pages 18 to 21 of the present description correspond to the claims of UK patent application nos. GB1717001.0 and GB1802095.8, from which the present application claims priority, as filed. The claims of the present application as filed can be found on the subsequent pages 22 to 24 of the specification which begin with the heading “CLAIMS”.
- 1. A method of controlling a computing device, the method comprising:
- receiving image data from a camera, the received image data representing a scene comprising an unmanned aerial vehicle, UAV;
- receiving identification data wirelessly from the UAV; and
- associating the received image data with the received identification data.
- 2. A method according to clause 1, wherein the method is performed in relation to a restricted flight zone.
3. A method according to clause 2, comprising determining whether or not the UAV is authorised to be in the restricted flight zone based on the received identification data.
4. A method according to clause 3, comprising:
- performing a first action in response to determining that the UAV is authorised to be in the restricted flight zone; and
- performing a second, different action in response to determining that the UAV is not authorised to be in the restricted flight zone.
- 5. A method according to clause 4, wherein performing the first action comprises allowing the UAV to enter the restricted flight zone.
6. A method according to clause 4 or 5, wherein performing the first action comprises allowing the UAV to remain in the restricted flight zone.
7. A method according to any of clauses 4 to 6, wherein performing the first action comprises allowing the UAV to leave the restricted flight zone.
8. A method according to any of clauses 4 to 7, wherein performing the second action comprises preventing the UAV from entering the restricted flight zone.
9. A method according to any of clauses 4 to 8, wherein performing the second action comprises preventing the UAV from remaining in the restricted flight zone.
10. A method according to any of clauses 4 to 9, wherein performing the second action comprises preventing the UAV from leaving the restricted flight zone.
11. A method according to any of clauses 4 to 10, wherein performing the second action comprises generating an alarm.
12. A method according to any of clauses 4 to 11, wherein performing the second action comprises notifying an entity associated with the UAV.
13. A method according to any of clauses 4 to 12, wherein performing the second action comprises notifying an entity that is not associated with the UAV.
14. A method according to any of clauses 4 to 13, wherein performing the second action comprises taking over control of the UAV.
15. A method according to any of clauses 1 to 14, comprising storing the received identification data in association with the received image data.
16. A method according to any of clauses 1 to 15, comprising transmitting the received identification data in association with the received image data.
17. A method according to any of clauses 1 to 16, comprising performing a look-up for additional data associated with the UAV using the received identification data.
18. A method according to any of clauses 1 to 17, comprising:
- receiving telemetry data wirelessly from the UAV; and
- associating the received telemetry data with the received image data and/or the received identification data.
- 19. A method according to any of clauses 1 to 18, comprising causing the received image data to be displayed at the same time as the received identification data.
20. A method according to clause 19, comprising causing the received image data to be displayed at the same time as the received identification data on a display of a computing device in which the camera is comprised.
21. A method according to any of clauses 1 to 20, wherein the received identification data comprises a registration identifier of the UAV.
22. A method according to any of clauses 1 to 21, wherein the received identification data comprises an equipment identifier of the UAV.
23. A method according to any of clauses 1 to 22, wherein the received identification data comprises a contact identifier of an entity associated with the UAV.
24. A method according to any of clauses 1 to 23, wherein the received identification data comprises a wireless local area network, WLAN, identifier associated with the wireless reception of the identification data from the UAV.
25. A method according to clause 24, wherein the WLAN identifier comprises a Basic Service Set ID, BSSID.
26. A method according to any of clauses 1 to 25, wherein the identification data is received as a result of transmission of the identification data from the UAV to a control device of an operator of the UAV over a predetermined communications link.
27. A method according to clause 26, wherein the identification data is transmitted over the predetermined communications link in an encrypted form and wherein the method comprises decrypting the identification data.
28. A method according to any of clauses 1 to 27, wherein the identification data is received as a result of broadcasting of the identification data by the UAV.
29. A method according to any of clauses 1 to 28, wherein the received image data comprises a photograph of the scene.
30. A method according to any of clauses 1 to 29, wherein the computing device comprises a mobile computing device.
31. A method according to any of clauses 1 to 29, wherein the computing device comprises a UAV.
32. A method of controlling a computing device, the method comprising:
- causing image data representing a scene comprising an unmanned aerial vehicle, UAV, to be captured using a camera of the computing device;
- receiving identification data wirelessly from the UAV; and
- causing the captured image data and/or data based on the captured image data to be displayed on a display of the computing device at the same time as the received identification data and/or data based on the received identification data.
- 33. A method of controlling a computing device, the method comprising:
- causing a photograph of a scene comprising an unmanned aerial vehicle, UAV, to be captured using a camera of the computing device;
- receiving identification data wirelessly from the UAV; and
- causing the received identification data and/or data based on the received identification data to be displayed on a display of the computing device at the same time as the photograph of the scene comprising the UAV.
- 34. A computer program comprising instructions which, when executed, cause a computing device to perform a method according to any of clauses 1 to 33.
35. A computing device arranged to perform a method according to any of clauses 1 to 33.
36. A controller device arranged to perform a method according to any of clauses 1 to 33.
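The authorisation branch of clauses 2 to 4 — determining from the received identification data whether the UAV is authorised to be in a restricted flight zone, then performing a first or a second action — can be sketched as a simple allow-list check. The allow-list, function name, and action labels are all hypothetical; a real system would perform the look-up of clause 17 against an aviation authority's records.

```python
# Illustrative allow-list of registration identifiers authorised for the zone.
AUTHORISED_REGISTRATIONS = {"G-UAV123", "G-UAV456"}


def handle_uav_in_restricted_zone(registration_id: str) -> str:
    """Branch on whether the received identification data matches an
    authorisation record (clauses 3 and 4, as a sketch)."""
    if registration_id in AUTHORISED_REGISTRATIONS:
        # First action: for example, allow the UAV to remain in the zone.
        return "allow"
    # Second action: for example, generate an alarm and notify an entity.
    return "raise_alarm"
```

Either branch could be combined with the storing (clause 15) and transmitting (clause 16) of the identification data in association with the image data, for example for audit purposes.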
Claims (20)
1. An unmanned aerial vehicle, UAV, comprising:
a transmitter configured to transmit:
identification data of the UAV; and
telemetry data of the UAV,
wherein the telemetry data of the UAV comprises:
a location of the UAV; and
an altitude of the UAV.
2. The UAV of claim 1 , wherein the telemetry data of the UAV comprises a speed of the UAV.
3. The UAV of claim 1 , wherein the telemetry data of the UAV comprises a direction of the UAV.
4. The UAV of claim 1 , wherein the telemetry data of the UAV comprises a velocity of the UAV, and wherein the velocity of the UAV comprises a speed of the UAV and a direction of the UAV.
5. The UAV of claim 1 , wherein the telemetry data of the UAV comprises a battery level of the UAV.
6. The UAV of claim 1 , wherein the transmitter is configured to broadcast the identification data of the UAV and/or the telemetry data of the UAV.
7. The UAV of claim 1 , wherein the transmitter is configured to transmit the identification data of the UAV and/or the telemetry data of the UAV using a short-range wireless technology.
8. The UAV of claim 7 , wherein the short-range wireless technology comprises Bluetooth.
9. The UAV of claim 1 , wherein the transmitter is configured to transmit the identification data of the UAV and/or the telemetry data of the UAV using a wireless local area network, WLAN.
10. The UAV of claim 9 , wherein the WLAN is in accordance with Wi-Fi technology.
11. The UAV of claim 10 , wherein the transmitter is configured to transmit in a 2.4 GHz Wi-Fi band and/or a 5.8 GHz Wi-Fi band.
12. The UAV of claim 1 , wherein the transmitter is configured to transmit the identification data of the UAV and/or the telemetry data of the UAV in a radio frequency, RF, part of an electromagnetic spectrum.
13. The UAV of claim 1 , wherein the transmitter is operable to transmit the identification data of the UAV and the telemetry data of the UAV to a computing device of a law enforcement agency.
14. The UAV of claim 1 , wherein the transmitter is operable to transmit the identification data of the UAV and the telemetry data of the UAV to a computing device of an aviation authority.
15. The UAV of claim 1 , wherein the identification data of the UAV comprises an equipment identifier of the UAV.
16. The UAV of claim 1 , wherein the identification data of the UAV comprises a registration identifier of the UAV.
17. The UAV of claim 1 , wherein the transmitter is configured to transmit a transmitter identifier, the transmitter identifier being configured to identify the transmitter of the UAV and being different from the identification data of the UAV.
18. The UAV of claim 17 , wherein the transmitter identifier comprises a Basic Service Set ID, BSSID.
19. An unmanned aerial vehicle, UAV, comprising:
a transmitter configured to transmit, in a radio frequency, RF, part of an electromagnetic spectrum:
a transmitter identifier, the transmitter identifier being configured to identify the transmitter of the UAV;
a location of the UAV;
an altitude of the UAV;
a velocity of the UAV, the velocity of the UAV comprising:
a speed of the UAV; and
a direction of the UAV.
20. An unmanned aerial vehicle, UAV, comprising:
a transmitter configured to broadcast telemetry data to a computing device of an aviation authority and/or to a computing device of a law enforcement agency, the telemetry data being configured to identify a location, altitude and velocity of the UAV,
wherein the transmitter is configured to broadcast the telemetry data using a short-range wireless technology and/or a wireless local area network, WLAN.
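The broadcast payload named in claims 1 and 19 — identification data plus telemetry comprising location, altitude, and velocity (speed and direction) — can be sketched as a packed message. The field layout below is illustrative only and is not a standardised remote-ID frame format; the function name and field order are assumptions.

```python
import struct


def build_broadcast_payload(registration_id: str, lat: float, lon: float,
                            altitude_m: float, speed_mps: float,
                            heading_deg: float) -> bytes:
    """Pack identification data and telemetry (location, altitude, speed,
    direction) into one broadcast message; layout is illustrative."""
    # Telemetry: latitude/longitude as doubles, then altitude, speed,
    # heading as 32-bit floats, little-endian, no padding.
    telemetry = struct.pack("<ddfff", lat, lon, altitude_m, speed_mps, heading_deg)
    ident = registration_id.encode("ascii")
    # Length-prefixed identifier followed by the fixed-size telemetry block.
    return bytes([len(ident)]) + ident + telemetry


payload = build_broadcast_payload("G-UAV123", 51.5074, -0.1278, 120.0, 12.5, 270.0)
```

A transmitter could broadcast such a payload over a short-range wireless technology or a WLAN, as claims 7 to 11 describe, for receipt by any computing device in range.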
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US18/132,468 US20230245574A1 (en) | 2017-10-16 | 2023-04-10 | Methods, computer programs, computing devices and controllers |
Applications Claiming Priority (7)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
GBGB1717001.0 | 2017-10-16 | ||
GB1717001.0A GB2562813B (en) | 2017-10-16 | 2017-10-16 | Detecting and identifying unmanned aerial vehicles |
GBGB1802095.8 | 2018-02-08 | ||
GBGB1802095.8A GB201802095D0 (en) | 2017-10-16 | 2018-02-08 | Methods, computer programs, computing devices and controllers |
US16/159,751 US10657831B2 (en) | 2017-10-16 | 2018-10-15 | Methods, computer programs, computing devices and controllers |
US16/833,906 US11651698B2 (en) | 2017-10-16 | 2020-03-30 | Method and system for receiving and displaying UAV data |
US18/132,468 US20230245574A1 (en) | 2017-10-16 | 2023-04-10 | Methods, computer programs, computing devices and controllers |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/833,906 Continuation US11651698B2 (en) | 2017-10-16 | 2020-03-30 | Method and system for receiving and displaying UAV data |
Publications (1)
Publication Number | Publication Date |
---|---|
US20230245574A1 true US20230245574A1 (en) | 2023-08-03 |
Family
ID=60419172
Family Applications (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/159,751 Expired - Fee Related US10657831B2 (en) | 2017-10-16 | 2018-10-15 | Methods, computer programs, computing devices and controllers |
US16/833,906 Active 2040-04-06 US11651698B2 (en) | 2017-10-16 | 2020-03-30 | Method and system for receiving and displaying UAV data |
US18/132,468 Pending US20230245574A1 (en) | 2017-10-16 | 2023-04-10 | Methods, computer programs, computing devices and controllers |
Family Applications Before (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/159,751 Expired - Fee Related US10657831B2 (en) | 2017-10-16 | 2018-10-15 | Methods, computer programs, computing devices and controllers |
US16/833,906 Active 2040-04-06 US11651698B2 (en) | 2017-10-16 | 2020-03-30 | Method and system for receiving and displaying UAV data |
Country Status (3)
Country | Link |
---|---|
US (3) | US10657831B2 (en) |
CA (1) | CA3020776A1 (en) |
GB (3) | GB2562813B (en) |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11079757B1 (en) * | 2017-11-20 | 2021-08-03 | Amazon Technologies, Inc. | Unmanned aerial vehicles to survey locations and collect data about different signal sources |
US11782965B1 (en) * | 2018-04-05 | 2023-10-10 | Veritas Technologies Llc | Systems and methods for normalizing data store classification information |
CN110543872B (en) * | 2019-09-12 | 2023-04-18 | 云南省水利水电勘测设计研究院 | Unmanned aerial vehicle image building roof extraction method based on full convolution neural network |
WO2021126018A1 (en) * | 2019-12-16 | 2021-06-24 | Telefonaktiebolaget Lm Ericsson (Publ) | Mobile device, network node and methods for identifying equipment |
Family Cites Families (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8351979B2 (en) * | 2008-08-21 | 2013-01-08 | Apple Inc. | Camera as input interface |
US8983682B1 (en) * | 2012-12-28 | 2015-03-17 | Google Inc. | Unlocking mobile-device and/or unmanned aerial vehicle capability in an emergency situation |
US9754496B2 (en) * | 2014-09-30 | 2017-09-05 | Elwha Llc | System and method for management of airspace for unmanned aircraft |
WO2016108220A1 (en) * | 2014-12-29 | 2016-07-07 | Klein Hagay | Improved visual monitoring on smartphone screen |
US9601022B2 (en) * | 2015-01-29 | 2017-03-21 | Qualcomm Incorporated | Systems and methods for restricting drone airspace access |
US9646502B1 (en) * | 2015-02-27 | 2017-05-09 | Amazon Technologies, Inc. | Universal unmanned aerial vehicle identification system |
CN107615358A (en) * | 2015-03-31 | 2018-01-19 | 深圳市大疆创新科技有限公司 | For identifying the Verification System and method of authorized participant |
US11017680B2 (en) * | 2015-09-30 | 2021-05-25 | Alarm.Com Incorporated | Drone detection systems |
US9800321B2 (en) * | 2015-12-31 | 2017-10-24 | Wellen Sham | Facilitating communication with a vehicle via a UAV |
US9875660B2 (en) * | 2016-03-28 | 2018-01-23 | Cisco Technology, Inc. | Multi-modal UAV certification |
US10710710B2 (en) * | 2016-10-27 | 2020-07-14 | International Business Machines Corporation | Unmanned aerial vehicle (UAV) compliance using standard protocol requirements and components to enable identifying and controlling rogue UAVS |
US10127822B2 (en) * | 2017-02-13 | 2018-11-13 | Qualcomm Incorporated | Drone user equipment indication |
US10750313B2 (en) * | 2017-06-05 | 2020-08-18 | Wing Aviation Llc | Map display of unmanned aircraft systems |
US10755586B2 (en) * | 2017-07-17 | 2020-08-25 | Verizon Patent And Licensing Inc. | Providing automatic dependent surveillance-broadcast data for unmanned aerial vehicles |
2017
- 2017-10-16 GB GB1717001.0A patent/GB2562813B/en not_active Expired - Fee Related

2018
- 2018-02-08 GB GBGB1802095.8A patent/GB201802095D0/en not_active Ceased
- 2018-10-15 US US16/159,751 patent/US10657831B2/en not_active Expired - Fee Related
- 2018-10-15 GB GB1816732.0A patent/GB2569425B/en not_active Expired - Fee Related
- 2018-10-15 CA CA3020776A patent/CA3020776A1/en active Pending

2020
- 2020-03-30 US US16/833,906 patent/US11651698B2/en active Active

2023
- 2023-04-10 US US18/132,468 patent/US20230245574A1/en active Pending
Also Published As
Publication number | Publication date |
---|---|
GB2562813A (en) | 2018-11-28 |
US20200226939A1 (en) | 2020-07-16 |
US20190114930A1 (en) | 2019-04-18 |
GB201717001D0 (en) | 2017-11-29 |
CA3020776A1 (en) | 2019-04-16 |
US11651698B2 (en) | 2023-05-16 |
GB201802095D0 (en) | 2018-03-28 |
US10657831B2 (en) | 2020-05-19 |
GB2569425B (en) | 2022-12-14 |
GB201816732D0 (en) | 2018-11-28 |
GB2569425A (en) | 2019-06-19 |
GB2562813B (en) | 2019-04-24 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20230245574A1 (en) | Methods, computer programs, computing devices and controllers | |
US11908184B2 (en) | Image capture with privacy protection | |
US12022289B2 (en) | Integrated secure device manager systems and methods for cyber-physical vehicles | |
US20190103030A1 (en) | Aerial vehicle identification beacon and reader system | |
US11022407B2 (en) | UAV defense system | |
US10229329B2 (en) | Systems, methods, apparatuses, and devices for identifying, tracking, and managing unmanned aerial vehicles | |
US11247774B2 (en) | Moving body identification system and identification method | |
US10713959B2 (en) | Low altitude aircraft identification system | |
US10424176B2 (en) | AMBER alert monitoring and support | |
US20180129884A1 (en) | Systems, Methods, Apparatuses, and Devices for Identifying and Tracking Unmanned Aerial Vehicles via a Plurality of Sensors | |
US10291314B2 (en) | Method and system to dynamically identify and control a UAV with emitting instruments | |
EP3456040B1 (en) | Surveillance system and method for camera-based surveillance | |
US20180197420A1 (en) | System and method for aerial system discrimination and action | |
US20180025044A1 (en) | Unmanned vehicle data correlation, routing, and reporting | |
US20140140575A1 (en) | Image capture with privacy protection | |
WO2019032162A2 (en) | Secure beacon and reader system for remote drone and pilot identification | |
Belwafi et al. | Unmanned aerial vehicles’ remote identification: A tutorial and survey | |
US11917504B2 (en) | Remote identification system for the remote identification of an object | |
WO2018170737A1 (en) | Unmanned aerial vehicle control method and control device, and unmanned aerial vehicle supervision method and supervision device | |
KR20180039496A (en) | UAS Monitoring and Control system having Black Box | |
US20240153392A1 (en) | A Method, System and an Apparatus for Connecting to Unconnected Drones with Positioning | |
Wang et al. | On the security of the FLARM collision warning system | |
RU2593439C1 (en) | System and method of detecting wing unmanned aerial vehicles | |
US20240177613A1 (en) | Remote id conflict system | |
US11708161B1 (en) | Method and system for validating access keys for unmanned vehicle interdiction device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
AS | Assignment |
Owner name: RPX CORPORATION, CALIFORNIA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:RUSSELL INNOVATIONS LIMITED;REEL/FRAME:064969/0116 Effective date: 20230905 Owner name: RUSSELL INNOVATIONS LIMITED, UNITED KINGDOM Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:RUSSELL, IAIN MATTHEW;REEL/FRAME:064968/0979 Effective date: 20230829 |