CN112908038A - Method for determining position of unmanned aerial vehicle and air traffic control system

Info

Publication number
CN112908038A
Authority
CN
China
Prior art keywords: unmanned aerial vehicle, user, UAV, flight
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011088133.0A
Other languages
Chinese (zh)
Inventor
龚明
戴劲
崔浩
王晓东
黄晗
吴军
范伟
马宁
荣新华
林星森
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
SZ DJI Technology Co Ltd
Original Assignee
SZ DJI Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Application filed by SZ DJI Technology Co Ltd filed Critical SZ DJI Technology Co Ltd
Priority to CN202011088133.0A
Publication of CN112908038A
Legal status: Pending

Classifications

    • G08G5/0008 Transmission of traffic-related information to or from an aircraft with other aircraft
    • G01S5/0269 Inferred or constrained positioning, e.g. employing knowledge of the physical or electromagnetic environment, state of motion or other contextual information to infer or constrain a position
    • G01S5/04 Position of source determined by a plurality of spaced direction-finders
    • G01S5/06 Position of source determined by co-ordinating a plurality of position lines defined by path-difference measurements
    • G01S5/14 Determining absolute distances from a plurality of spaced points of known location
    • G05B19/00 Programme-control systems
    • G06F21/33 User authentication using certificates
    • G08G5/0013 Transmission of traffic-related information to or from an aircraft with a ground station
    • G08G5/0026 Arrangements for implementing traffic-related aircraft activities, e.g. generating, displaying, acquiring or managing traffic information, located on the ground
    • G08G5/0034 Assembly of a flight plan
    • G08G5/0039 Modification of a flight plan
    • G08G5/0043 Traffic management of multiple aircrafts from the ground
    • G08G5/0052 Navigation or guidance aids for a single aircraft for cruising
    • G08G5/006 Navigation or guidance aids for a single aircraft in accordance with predefined flight zones, e.g. to avoid prohibited zones
    • G08G5/0069 Navigation or guidance aids for a single aircraft specially adapted for an unmanned aircraft
    • G08G5/0078 Surveillance aids for monitoring traffic from the aircraft
    • G08G5/0082 Surveillance aids for monitoring traffic from a ground station
    • G08G5/0086 Surveillance aids for monitoring terrain
    • G08G5/0091 Surveillance aids for monitoring atmospheric conditions
    • B64U2201/104 UAVs with autonomous flight controls using satellite radio beacon positioning systems, e.g. GPS
    • B64U2201/20 UAV flight controls: remote controls
    • G01S2205/03 Position-fixing specially adapted for airborne applications
    • G06F2221/2111 Location-sensitive, e.g. geographical location, GPS
    • G06F2221/2117 User registration
    • G06F2221/2141 Access rights, e.g. capability lists, access control lists, access tables, access matrices
    • G08G5/0056 Navigation or guidance aids for a single aircraft in an emergency situation, e.g. hijacking

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Computer Security & Cryptography (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Electromagnetism (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Software Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Atmospheric Sciences (AREA)
  • Automation & Control Theory (AREA)
  • Traffic Control Systems (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

A method of determining the position of an unmanned aerial vehicle (UAV), and an air traffic control system, are disclosed. The method of determining the position of the unmanned aerial vehicle comprises: calculating a position of the UAV based on data from a recorder, wherein the recorder is configured to receive one or more messages from the UAV; comparing the position of the UAV with the position of a geofence boundary; and taking one or more flight response actions based on the comparison to regulate the activity of the UAV inside or outside the geofence boundary.
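As a purely illustrative, non-limiting sketch of the flow summarized above, the following Python example estimates a UAV position from recorder reports, compares it to a geofence boundary, and selects a flight response action. The circular boundary model, the simple position estimator, and the action names are assumptions introduced for the example and are not taken from this disclosure.

    import math

    def estimate_position(reports):
        """Placeholder estimator: average the positions attached to recorder reports."""
        xs = [r["x"] for r in reports]
        ys = [r["y"] for r in reports]
        return sum(xs) / len(xs), sum(ys) / len(ys)

    def flight_response(uav_xy, fence_center, fence_radius_m, keep_inside=True):
        """Compare the UAV position with a circular geofence boundary and pick an action."""
        distance = math.dist(uav_xy, fence_center)
        inside = distance <= fence_radius_m
        if keep_inside and not inside:
            return "return_to_boundary"       # UAV escaped a containment fence
        if not keep_inside and inside:
            return "leave_restricted_zone"    # UAV entered an exclusion fence
        if abs(distance - fence_radius_m) < 0.1 * fence_radius_m:
            return "warn_user"                # UAV is close to the boundary
        return "no_action"

    reports = [{"x": 10.0, "y": 2.0}, {"x": 11.0, "y": 3.0}]   # positions derived from recorder data
    print(flight_response(estimate_position(reports), (0.0, 0.0), 8.0))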

Description

Method for determining position of unmanned aerial vehicle and air traffic control system
Background
Unmanned vehicles, such as Unmanned Aerial Vehicles (UAVs), have been developed for various fields, including consumer applications and industrial applications. For example, unmanned aerial vehicles may be maneuvered for entertainment, photography/videography, surveillance, delivery, or other applications.
Unmanned aerial vehicles have expanded many aspects of personal life. However, as their use becomes more widespread, safety issues and challenges arise. For example, when the flight of an unmanned aerial vehicle is not restricted, it may fly over an area where flight is, or should be, prohibited. This may be intentional or unintentional. In some cases, a novice user may lose control of the unmanned aerial vehicle or be unfamiliar with aviation flight regulations. There is also a potential risk that the controls of the unmanned aerial vehicle may be hijacked or hacked.
Disclosure of Invention
The safety systems and methods described herein improve the flight safety of Unmanned Aerial Vehicles (UAVs). Flight regulation and authentication systems and methods may be provided that facilitate tracking the usage of an unmanned aerial vehicle. The system can uniquely identify the parties that are interacting (e.g., the user, the remote control, the unmanned aerial vehicle, and any geo-fencing device). In some cases, an authentication process may occur, and only authorized parties may be allowed to operate the UAV. Flight regulations may be imposed on the operation of the unmanned aerial vehicle and may override the user's manual controls. In some cases, a geofencing device may be used to provide information about the flight regulations or to assist in the flight regulation process.
One aspect of the invention relates to a system for controlling an Unmanned Aerial Vehicle (UAV), the system comprising: a first communication module; and one or more processors operatively coupled to the first communication module and individually or collectively configured to: receive, using the first communication module or a second communication module, a user identifier indicating a user type; generate a set of flight regulations for the UAV based on the user identifier; and transmit the set of flight regulations to the UAV using the first communication module or the second communication module.
Additionally, aspects of the invention may provide a method for controlling an Unmanned Aerial Vehicle (UAV), the method comprising: receiving a user identifier indicating a user type; generating, by way of one or more processors, a set of flight regulations for the UAV based on the user identifier; and transmitting, by means of a communication module, the set of flight regulations to the UAV.
According to an aspect of the invention, there may be provided a non-transitory computer readable medium containing program instructions for controlling an Unmanned Aerial Vehicle (UAV), the computer readable medium comprising: program instructions for receiving a user identifier indicating a user type; program instructions for generating a set of flight regulations for the UAV based on the user identifier; and program instructions for generating a signal to transmit the set of flight regulations to the UAV via a communication module.
Further, aspects of the invention may relate to an Unmanned Aerial Vehicle (UAV). The unmanned aerial vehicle may include: one or more propulsion units that enable flight of the UAV; a communication module configured to receive one or more flight commands from a remote user; and a flight control unit configured to generate flight control signals for delivery to the one or more propulsion units, wherein the flight control signals are generated according to a set of flight regulations for the UAV, wherein the flight regulations are generated based on a user identifier indicating a user type of the remote user.
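By way of illustration only, the following Python sketch shows one possible way a set of flight regulations could be generated from a user identifier indicating a user type. The user types, field names, and numeric limits are assumptions made for the example and do not appear in this disclosure.

    # Hypothetical mapping from user type to a set of flight regulations.
    USER_TYPE_RULES = {
        "novice":       {"max_altitude_m": 50,  "max_range_m": 500,  "night_flight": False},
        "certified":    {"max_altitude_m": 120, "max_range_m": 2000, "night_flight": True},
        "professional": {"max_altitude_m": 500, "max_range_m": 8000, "night_flight": True},
    }

    def generate_flight_regulations(user_identifier: str, user_type: str) -> dict:
        """Return a set of flight regulations for the identified user; default to the strictest rules."""
        rules = USER_TYPE_RULES.get(user_type, USER_TYPE_RULES["novice"])
        return {"user_id": user_identifier, **rules}

    print(generate_flight_regulations("user-001", "novice"))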
Aspects of the invention may also include a system for controlling an Unmanned Aerial Vehicle (UAV), the system comprising: a first communication module; and one or more processors operatively coupled to the first communication module and individually or collectively configured to: receive, using the first or second communication module, an unmanned aerial vehicle identifier indicating a type of unmanned aerial vehicle; generate a set of flight regulations for the UAV based on the UAV identifier; and transmit the set of flight regulations to the UAV using the first or second communication module.
According to a further aspect of the invention, there may be provided a method for controlling an Unmanned Aerial Vehicle (UAV), the method comprising: receiving an unmanned aerial vehicle identifier indicating a type of unmanned aerial vehicle; generating, by way of one or more processors, a set of flight regulations for the UAV based on the UAV identifier; and transmitting, by means of a communication module, the set of flight regulations to the UAV.
Moreover, aspects of the invention may relate to a non-transitory computer-readable medium containing program instructions for controlling an Unmanned Aerial Vehicle (UAV), the computer-readable medium comprising: program instructions for receiving an unmanned aerial vehicle identifier indicating a type of unmanned aerial vehicle; program instructions for generating a set of flight regulations for the UAV based on the UAV identifier; and program instructions for generating a signal to transmit the set of flight regulations to the UAV via a communication module.
One aspect of the invention may relate to an Unmanned Aerial Vehicle (UAV), comprising: one or more propulsion units that enable flight of the UAV; a communication module configured to receive one or more flight commands from a remote user; and a flight control unit configured to generate flight control signals for delivery to the one or more propulsion units, wherein the flight control signals are generated according to a set of flight regulations for the UAV, wherein the flight regulations are generated based on a UAV identifier indicating a type of the UAV.
Further aspects of the invention may relate to an Unmanned Aerial Vehicle (UAV), comprising: a flight control unit configured to control operation of the UAV; and an identification module integrated into the flight control unit, wherein the identification module uniquely identifies the UAV from other UAVs.
Additionally, aspects of the invention may provide a method of identifying an Unmanned Aerial Vehicle (UAV), the method comprising: controlling operation of the UAV using a flight control unit; and using an identification module integrated into the flight control unit to uniquely identify the UAV from the other UAVs.
According to some aspects of the invention, an Unmanned Aerial Vehicle (UAV) may comprise: a flight control unit configured to control operation of the UAV, wherein the flight control unit includes an identification module and a chip, wherein the identification module is configured to (1) uniquely identify the UAV from other UAVs, (2) store an initial record of the chip, and (3) aggregate information about the chip after the initial record of the chip has been stored, wherein the identification module is configured to undergo a self-test procedure that compares the aggregated information about the chip with the initial record of the chip, and wherein the identification module is configured to provide a warning when the aggregated information about the chip is inconsistent with the initial record of the chip.
Aspects of the invention may also relate to a method of identifying an Unmanned Aerial Vehicle (UAV), the method comprising: controlling operation of the UAV using a flight control unit, wherein the flight control unit includes an identification module and a chip; using the identification module to uniquely identify the UAV from other UAVs, wherein the identification module stores an initial record of the chip; aggregating information about the chip after the initial record of the chip has been stored; comparing, using the identification module, the aggregated information about the chip with the initial record of the chip, thereby performing a self-test procedure; and providing a warning when the aggregated information about the chip is inconsistent with the initial record of the chip.
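The following minimal Python sketch illustrates the self-test idea described above: an initial record of the flight-control chip is stored, the same fields are later aggregated again, and a warning is issued when the two no longer match. The use of a hash over the chip fields, and the field names themselves, are assumptions for the example only.

    import hashlib
    import json

    def chip_record(chip_info: dict) -> str:
        """Deterministic fingerprint of the chip information."""
        return hashlib.sha256(json.dumps(chip_info, sort_keys=True).encode()).hexdigest()

    class IdentificationModule:
        def __init__(self, uav_id: str, chip_info: dict):
            self.uav_id = uav_id                          # uniquely identifies this UAV
            self.initial_record = chip_record(chip_info)  # initial record of the chip

        def self_test(self, aggregated_chip_info: dict) -> bool:
            """Compare aggregated chip information against the initial record; warn on mismatch."""
            if chip_record(aggregated_chip_info) != self.initial_record:
                print(f"WARNING: chip record mismatch on UAV {self.uav_id}")
                return False
            return True

    module = IdentificationModule("UAV-42", {"serial": "SN123", "firmware": "1.0.0"})
    module.self_test({"serial": "SN123", "firmware": "1.0.1"})   # fails: firmware changed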
According to a further aspect of the invention, an Unmanned Aerial Vehicle (UAV) payload control system may be provided. The system may include: a first communication module; and one or more processors operatively coupled to the first communication module and individually or collectively configured to: receive, using the first or second communication module, a signal indicative of a location-dependent payload usage parameter; and generate one or more UAV operation signals that effect payload operation in compliance with the payload usage parameters.
Further, aspects of the invention may relate to a method for constraining payload usage of an Unmanned Aerial Vehicle (UAV), the method comprising: receiving a signal indicative of a location dependent payload usage parameter; and generating, by means of one or more processors, one or more UAV operation signals that effect payload operation in compliance with the payload usage parameters.
Additional aspects of the invention may provide a non-transitory computer readable medium containing program instructions for constraining payload usage of an Unmanned Aerial Vehicle (UAV), the computer readable medium comprising: program instructions for receiving a signal indicative of a location dependent payload usage parameter; and program instructions for generating one or more UAV operation signals that effect payload operation in compliance with the payload usage parameters.
According to an aspect of the invention, there may be provided an Unmanned Aerial Vehicle (UAV), the UAV comprising: a payload; a communication module configured to receive one or more payload commands from a remote user; and a flight control unit configured to generate payload control signals delivered to the payload or a carrier supporting the payload, wherein the payload control signals are generated in accordance with one or more UAV operation signals, wherein the UAV operation signals are generated based on a location-dependent payload usage parameter.
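As a simplified, non-limiting sketch of a location-dependent payload usage parameter, the following Python example rejects a recording command when the UAV is inside a zone where payload recording is restricted. The zone geometry, coordinates, and command names are assumptions introduced for the example.

    import math

    # Hypothetical no-recording zones: (center_x, center_y, radius_m).
    NO_RECORDING_ZONES = [(100.0, 200.0, 300.0)]

    def payload_usage_allowed(uav_xy) -> bool:
        """True when the UAV is outside every restricted zone."""
        return all(math.dist(uav_xy, (cx, cy)) > r for cx, cy, r in NO_RECORDING_ZONES)

    def handle_payload_command(uav_xy, command: str) -> str:
        """Apply the location-dependent payload usage parameter before acting on a payload command."""
        if command == "start_recording" and not payload_usage_allowed(uav_xy):
            return "rejected: recording restricted at this location"
        return f"executed: {command}"

    print(handle_payload_command((120.0, 250.0), "start_recording"))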
Aspects of the invention may relate to an Unmanned Aerial Vehicle (UAV) communications control system comprising: a first communication module; and one or more processors operatively coupled to the first communication module and individually or collectively configured to: receive, using the first or second communication module, a signal indicative of a location-dependent communication usage parameter; and generate one or more UAV operation signals that enable operation of the UAV communication unit in compliance with the communication usage parameters.
Also, aspects of the invention may include a method for constraining wireless communication of an Unmanned Aerial Vehicle (UAV), the method comprising: receiving a signal indicative of a location-dependent communication usage parameter; and generating, by means of one or more processors, one or more UAV operation signals that enable operation of the communication unit in compliance with the communication usage parameters.
According to an additional aspect of the invention, there may be provided a non-transitory computer readable medium containing program instructions for constraining wireless communication of an Unmanned Aerial Vehicle (UAV), the computer readable medium comprising: program instructions for receiving a signal indicative of a location dependent communication usage parameter; and program instructions for generating one or more UAV operation signals that enable operation of the communication unit in compliance with the communication usage parameters.
Aspects of the invention may also relate to an Unmanned Aerial Vehicle (UAV), comprising: a communication unit configured to receive or transmit wireless communications; and a flight control unit configured to generate communication control signals delivered to the communication unit to enable operation of the communication unit, wherein the communication control signals are generated from one or more unmanned aerial vehicle operation signals, wherein the unmanned aerial vehicle operation signals are generated based on location-dependent communication usage parameters.
A further aspect of the invention may relate to a method of operating an Unmanned Aerial Vehicle (UAV), the method comprising: receiving an unmanned aerial vehicle identifier that uniquely identifies the unmanned aerial vehicle from among other unmanned aerial vehicles; receiving a user identifier that uniquely identifies the user from among other users; evaluating, with the aid of one or more processors, whether a user identified by the user identifier is authorized to operate an unmanned aerial vehicle identified by the unmanned aerial vehicle identifier; and allowing the user to operate the UAV when the user is authorized to operate the UAV.
According to an aspect of the invention, a non-transitory computer readable medium containing program instructions for operating an Unmanned Aerial Vehicle (UAV) may be provided. The computer-readable medium may include: program instructions for receiving an unmanned aerial vehicle identifier that uniquely identifies the unmanned aerial vehicle from among other unmanned aerial vehicles; program instructions for receiving a user identifier that uniquely identifies the user from among other users; program instructions for evaluating whether a user identified by the user identifier is authorized to operate an unmanned aerial vehicle identified by the unmanned aerial vehicle identifier; and program instructions for allowing the user to operate the UAV when the user is authorized to operate the UAV.
One aspect of the invention may provide an Unmanned Aerial Vehicle (UAV) authorization system comprising: one or more processors individually or collectively configured to: receive an unmanned aerial vehicle identifier that uniquely identifies the unmanned aerial vehicle from among other unmanned aerial vehicles; receive a user identifier that uniquely identifies the user from among other users; evaluate whether a user identified by the user identifier is authorized to operate an unmanned aerial vehicle identified by the unmanned aerial vehicle identifier; and transmit a signal to allow the user to operate the UAV when the user is authorized to operate the UAV.
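A minimal Python sketch of the authorization check described above is given below; it assumes a registry that maps (UAV identifier, user identifier) pairs to an authorization flag. The registry structure and the example identifiers are illustrative assumptions.

    # Hypothetical registry of which users may operate which UAVs.
    AUTHORIZATION_REGISTRY = {
        ("UAV-42", "user-001"): True,
        ("UAV-42", "user-002"): False,
    }

    def is_authorized(uav_identifier: str, user_identifier: str) -> bool:
        """Unknown pairings default to 'not authorized'."""
        return AUTHORIZATION_REGISTRY.get((uav_identifier, user_identifier), False)

    def allow_operation(uav_identifier: str, user_identifier: str) -> str:
        """Permit operation only when the identified user is authorized for the identified UAV."""
        return "operation enabled" if is_authorized(uav_identifier, user_identifier) else "operation blocked"

    print(allow_operation("UAV-42", "user-002"))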
Further, aspects of the invention may relate to a method of operating an Unmanned Aerial Vehicle (UAV), the method comprising: authenticating an identity of an unmanned aerial vehicle, wherein the identity of the unmanned aerial vehicle is uniquely distinguishable from other unmanned aerial vehicles; authenticating an identity of a user, wherein the identity of the user is uniquely distinguishable from other users; evaluating, with the aid of one or more processors, whether the user is authorized to operate the UAV; and allowing the user to operate the UAV when the user is authorized to operate the UAV and both the UAV and the user are authenticated.
Further, according to an aspect of the invention, there may be provided a non-transitory computer readable medium containing program instructions for operating an Unmanned Aerial Vehicle (UAV), the computer readable medium comprising: program instructions for authenticating an identity of an unmanned aerial vehicle, wherein the identity of the unmanned aerial vehicle is uniquely distinguishable from other unmanned aerial vehicles; program instructions for authenticating an identity of a user, wherein the identity of the user is uniquely distinguishable from other users; program instructions for evaluating, with the aid of one or more processors, whether the user is authorized to operate the UAV; and program instructions for allowing the user to operate the UAV when the user is authorized to operate the UAV and both the UAV and the user are authenticated.
One aspect of the present invention may also relate to an Unmanned Aerial Vehicle (UAV) authentication system comprising: one or more processors individually or collectively configured to: authenticate an identity of an unmanned aerial vehicle, wherein the identity of the unmanned aerial vehicle is uniquely distinguishable from other unmanned aerial vehicles; authenticate an identity of a user, wherein the identity of the user is uniquely distinguishable from other users; evaluate whether the user is authorized to operate the UAV; and transmit a signal to allow the user to operate the UAV when the user is authorized to operate the UAV and both the UAV and the user are authenticated.
Moreover, aspects of the invention may relate to a method of determining a degree of authentication for operation of an Unmanned Aerial Vehicle (UAV), the method comprising: receiving contextual information regarding the UAV; evaluating, using one or more processors, a degree of authentication for the UAV or a user of the UAV based on the contextual information; enabling authentication of the UAV or the user according to the degree of authentication; and allowing the user to operate the UAV when the degree of authentication is complete.
According to an aspect of the invention, there may be provided a non-transitory computer readable medium containing program instructions for determining a degree of authentication for operating an Unmanned Aerial Vehicle (UAV), the computer readable medium comprising: program instructions for receiving contextual information about the UAV; program instructions for evaluating a degree of authentication of the UAV or a user of the UAV based on the contextual information; program instructions for enabling authentication of the UAV or the user based on the degree of authentication; and program instructions for providing a signal that allows the user to operate the UAV when the degree of authentication is complete.
Additionally, aspects of the invention may relate to an Unmanned Aerial Vehicle (UAV) authentication system comprising: one or more processors individually or collectively configured to: receive contextual information regarding the UAV; evaluate a degree of authentication of the unmanned aerial vehicle or a user of the unmanned aerial vehicle based on the contextual information; and enable authentication of the unmanned aerial vehicle or the user according to the degree of authentication.
According to an aspect of the invention, there may be provided a method of grading flight regulations for operation of an Unmanned Aerial Vehicle (UAV), the method comprising: evaluating, using one or more processors, a degree of authentication of the UAV or a user of the UAV; enabling authentication of the UAV or the user according to the degree of authentication; generating a set of flight regulations based on the degree of authentication; and enabling operation of the UAV in accordance with the set of flight regulations.
Further aspects of the invention may relate to a non-transitory computer readable medium containing program instructions for determining a flight regulation level for an Unmanned Aerial Vehicle (UAV), the computer readable medium comprising: program instructions for evaluating a degree of authentication of the unmanned aerial vehicle or a user of the unmanned aerial vehicle; program instructions for enabling authentication of the UAV or the user based on the degree of authentication; program instructions for generating a set of flight regulations based on the degree of authentication; and program instructions for providing signals that allow the unmanned aerial vehicle to operate in accordance with the set of flight regulations.
Additionally, aspects of the invention may provide an Unmanned Aerial Vehicle (UAV) authentication system comprising: one or more processors individually or collectively configured to: evaluate a degree of authentication of the UAV or a user of the UAV; enable authentication of the UAV or the user according to the degree of authentication; and generate a set of flight regulations based on the degree of authentication.
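By way of a rough, non-limiting illustration of grading flight regulations by the degree of authentication, the following Python sketch derives a required degree from contextual information and maps a completed degree to a set of flight regulations. The degrees, context fields, and limit values are assumptions for the example only.

    def required_authentication_degree(context: dict) -> int:
        """Contextual information such as proximity to sensitive areas raises the required degree."""
        if context.get("near_airport"):
            return 3
        if context.get("urban_area"):
            return 2
        return 1

    def flight_regulations_for(completed_degree: int) -> dict:
        """A higher completed degree of authentication unlocks more permissive regulations."""
        if completed_degree >= 3:
            return {"max_altitude_m": 120, "restricted_zones": "standard"}
        if completed_degree == 2:
            return {"max_altitude_m": 60, "restricted_zones": "expanded"}
        return {"max_altitude_m": 30, "restricted_zones": "strict"}

    context = {"urban_area": True}
    print(required_authentication_degree(context), flight_regulations_for(2))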
According to one aspect of the invention, a method of warning a user when operation of an Unmanned Aerial Vehicle (UAV) is compromised may be provided. The method may include: authenticating a user to enable operation of the UAV; receiving one or more commands from a remote control that receives user input to effect operation of the UAV; detecting unauthorized communication interfering with one or more commands from the user; and alerting, via the remote control, the user regarding the unauthorized communication.
Aspects of the invention also include a non-transitory computer readable medium containing program instructions for alerting a user when operation of an Unmanned Aerial Vehicle (UAV) is compromised, the computer readable medium comprising: program instructions for authenticating a user to enable operation of the UAV; program instructions for receiving one or more commands from a remote control that receives user input to effect operation of the UAV; and program instructions for generating an alert to be provided to the user via the remote control, the alert regarding the detected unauthorized communication interfering with one or more commands from the user.
Further, one aspect of the invention may relate to an Unmanned Aerial Vehicle (UAV) warning system comprising: one or more processors individually or collectively configured to: authenticate a user to enable operation of the UAV; receive one or more commands from a remote control that receives user input to effect operation of the UAV; detect unauthorized communication interfering with one or more commands from the user; and generate a signal to alert the user via the remote control regarding the unauthorized communication.
According to an additional aspect of the invention, there may be provided a method of detecting flight deviations of an Unmanned Aerial Vehicle (UAV), the method comprising: receiving one or more flight commands provided by a user from a remote control; calculating, with the aid of one or more processors, a predicted position of the UAV based on the one or more flight commands; detecting an actual position of the UAV by means of one or more sensors; comparing the predicted location to the actual location to determine a deviation in unmanned aerial vehicle behavior; and providing an indication of risk that the unmanned aerial vehicle is not operating in accordance with the one or more flight commands based on the deviation in unmanned aerial vehicle behavior.
According to some aspects of the invention, there may be provided a non-transitory computer readable medium containing program instructions for detecting flight deviations of an Unmanned Aerial Vehicle (UAV), the computer readable medium comprising: program instructions for calculating a predicted position of the UAV based on one or more flight commands provided by a user from a remote control; program instructions for detecting an actual position of the UAV by means of one or more sensors; program instructions for comparing the predicted position to the actual position to determine a deviation in unmanned aerial vehicle behavior; and program instructions for providing, based on the deviation in unmanned aerial vehicle behavior, an indication of risk that the unmanned aerial vehicle is not operating in accordance with the one or more flight commands.
One aspect of the invention may relate to an Unmanned Aerial Vehicle (UAV) flight deviation detection system, comprising: one or more processors individually or collectively configured to: receive one or more flight commands provided by a user from a remote control; calculate a predicted position of the UAV based on the one or more flight commands; detect an actual position of the UAV by means of one or more sensors; compare the predicted position to the actual position to determine a deviation in unmanned aerial vehicle behavior; and generate a signal to provide, based on the deviation in unmanned aerial vehicle behavior, an indication of risk that the UAV is not operating in accordance with the one or more flight commands.
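The following Python sketch illustrates the deviation check described above: a predicted position is obtained by integrating commanded velocities, it is compared with the sensed position, and a risk indication is produced when the gap exceeds a threshold. The simple kinematic model and the threshold value are assumptions made to keep the example short.

    import math

    def predict_position(start_xy, commands):
        """commands: list of (vx_m_s, vy_m_s, duration_s) derived from the user's flight commands."""
        x, y = start_xy
        for vx, vy, dt in commands:
            x += vx * dt
            y += vy * dt
        return x, y

    def deviation_risk(start_xy, commands, sensed_xy, threshold_m=15.0):
        """Compare predicted and sensed positions and flag a risk when they diverge too far."""
        predicted = predict_position(start_xy, commands)
        deviation = math.dist(predicted, sensed_xy)
        return {"deviation_m": deviation, "at_risk": deviation > threshold_m}

    print(deviation_risk((0.0, 0.0), [(2.0, 0.0, 10.0)], (20.0, 18.0)))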
Moreover, one aspect of the invention may relate to a method of recording Unmanned Aerial Vehicle (UAV) behavior, the method comprising: receiving an unmanned aerial vehicle identifier that uniquely identifies the unmanned aerial vehicle from among other unmanned aerial vehicles; receiving a user identifier that uniquely identifies the user from among other users, wherein the user provides one or more commands via a remote control to effect operation of the UAV; and recording the one or more commands, the user identifier associated with the one or more commands, and the UAV identifier associated with the one or more commands in one or more memory storage units.
According to an aspect of the invention, there may be provided a non-transitory computer readable medium containing program instructions for recording Unmanned Aerial Vehicle (UAV) behavior, the computer readable medium comprising: program instructions for associating a user identifier with one or more commands from a user, wherein the user identifier uniquely identifies the user from among other users, and wherein the user provides one or more commands via a remote control to effect operation of the UAV; program instructions for associating an unmanned aerial vehicle identifier with the one or more commands, wherein the unmanned aerial vehicle identifier uniquely identifies the unmanned aerial vehicle from among other unmanned aerial vehicles; and program instructions for recording the one or more commands, the user identifier associated with the one or more commands, and the UAV identifier associated with the one or more commands in one or more memory storage units.
Aspects of the invention may include an Unmanned Aerial Vehicle (UAV) behavior recording system comprising: one or more memory storage units; and one or more processors operatively coupled to the one or more memory storage units and individually or collectively configured to: receive an unmanned aerial vehicle identifier that uniquely identifies the unmanned aerial vehicle from among other unmanned aerial vehicles; receive a user identifier that uniquely identifies the user from among other users, wherein the user provides one or more commands via a remote control to effect operation of the UAV; and record the one or more commands, the user identifier associated with the one or more commands, and the UAV identifier associated with the one or more commands in the one or more memory storage units.
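A minimal Python sketch of such behavior recording is shown below; each stored record associates a command with the user identifier and the UAV identifier. The record layout and the use of an in-memory list in place of the memory storage units are assumptions for the example.

    import time

    flight_log = []   # stands in for the one or more memory storage units

    def record_command(uav_id: str, user_id: str, command: str):
        """Store the command together with the identifiers associated with it."""
        flight_log.append({
            "timestamp": time.time(),
            "uav_id": uav_id,
            "user_id": user_id,
            "command": command,
        })

    record_command("UAV-42", "user-001", "takeoff")
    record_command("UAV-42", "user-001", "ascend_20m")
    print(flight_log)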
According to an aspect of the invention, there may be provided a system for operating an Unmanned Aerial Vehicle (UAV), the system comprising: an identification registry database configured to store one or more unmanned aerial vehicle identifiers that uniquely identify unmanned aerial vehicles with respect to each other and one or more user identifiers that uniquely identify users with respect to each other; an authentication center configured to authenticate an identity of the unmanned aerial vehicle and an identity of the user; and an air management system configured to receive an unmanned aerial vehicle identifier of the authenticated unmanned aerial vehicle and a user identifier of the authenticated user, and to provide a set of flight regulations based on at least one of: the authenticated unmanned aerial vehicle identifier and the authenticated user identifier.
A further aspect of the invention may provide a method of determining a position of an Unmanned Aerial Vehicle (UAV), the method comprising: receiving one or more messages from the UAV at a plurality of recorders; time stamping the one or more messages from the UAV at the plurality of recorders; and calculating, by means of one or more processors, a location of the UAV based on the timestamps of the one or more messages.
In some aspects of the invention, a non-transitory computer readable medium containing program instructions for determining a position of an Unmanned Aerial Vehicle (UAV) may be provided, the computer readable medium comprising: program instructions for receiving one or more messages from the UAV at a plurality of recorders; program instructions for time stamping the one or more messages from the UAV at the plurality of recorders; and program instructions for calculating a position of the UAV based on the timestamps of the one or more messages.
According to a further aspect of the invention, there may be provided an Unmanned Aerial Vehicle (UAV) communication location system, the system comprising: a communication module; and one or more processors operatively coupled to the communication module and individually or collectively configured to calculate a position of the UAV based on timestamps of one or more messages sent from the UAV and received at a plurality of recorders remote from the UAV.
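As a simplified numerical sketch of locating a UAV from the timestamps of one or more messages received at a plurality of recorders (a time-difference-of-arrival approach), the following Python example brute-forces the position whose predicted arrival-time differences best match the measured ones. The two-dimensional geometry, the grid search, and the recorder coordinates are assumptions; a practical system would typically use a closed-form or least-squares multilateration solver.

    import math

    C = 299_792_458.0   # assumed propagation speed of the radio message, m/s

    def predicted_deltas(xy, recorders):
        """Arrival-time offsets at each recorder relative to the first recorder."""
        times = [math.dist(xy, r) / C for r in recorders]
        return [t - times[0] for t in times]

    def locate(recorders, timestamps, search_m=1000, step_m=10):
        """Grid-search the position whose predicted time differences best match the timestamps."""
        measured = [t - timestamps[0] for t in timestamps]
        best, best_err = None, float("inf")
        for gx in range(-search_m, search_m + 1, step_m):
            for gy in range(-search_m, search_m + 1, step_m):
                pred = predicted_deltas((gx, gy), recorders)
                err = sum((p - m) ** 2 for p, m in zip(pred, measured))
                if err < best_err:
                    best, best_err = (gx, gy), err
        return best

    recorders = [(0.0, 0.0), (1000.0, 0.0), (0.0, 1000.0)]
    true_xy = (400.0, 250.0)
    stamps = [math.dist(true_xy, r) / C for r in recorders]   # ideal timestamps, no noise
    print(locate(recorders, stamps))   # recovers (400, 250) up to the grid resolution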
According to an aspect of the invention, there may be provided a method of authenticating an Unmanned Aerial Vehicle (UAV), the method comprising: receiving an authentication request from an unmanned aerial vehicle, wherein the authentication request includes an unmanned aerial vehicle identifier; retrieving information corresponding to the UAV identifier; generating an authentication vector based on the retrieved information, wherein the authentication vector comprises at least an authentication token; transmitting the authentication token and a key evaluation reference to the UAV, wherein the UAV authenticates the authentication vector based on a message authentication code generated based on the authentication token, the key evaluation reference, and a key encoded on the UAV; receiving a response from the UAV, wherein the response is based on the key evaluation reference and the key encoded on the UAV; and verifying the authentication request based on the response received from the UAV.
Additional aspects of the invention may provide a system for authenticating an Unmanned Aerial Vehicle (UAV), the system comprising: an authentication module; a communication module; and one or more processors operatively coupled to the authentication module and the communication module and individually or collectively configured to: receive an authentication request from an unmanned aerial vehicle, wherein the authentication request includes an unmanned aerial vehicle identifier; retrieve information corresponding to the UAV identifier; generate an authentication vector based on the retrieved information, wherein the authentication vector comprises at least an authentication token; transmit the authentication token and a key evaluation reference to the UAV, wherein the UAV authenticates the authentication vector based on a message authentication code generated based on the authentication token, the key evaluation reference, and a key encoded on the UAV; receive a response from the UAV, wherein the response is based on the key evaluation reference and the key encoded on the UAV; and verify the authentication request based on the response received from the UAV.
Further, aspects of the invention may relate to a non-transitory computer-readable medium containing program instructions for authenticating an Unmanned Aerial Vehicle (UAV), the computer-readable medium comprising: program instructions for receiving an authentication request from an unmanned aerial vehicle, wherein the authentication request includes an unmanned aerial vehicle identifier; program instructions for retrieving information corresponding to the UAV identifier; program instructions for generating an authentication vector based on the retrieved information, wherein the authentication vector includes at least an authentication token; program instructions for transmitting the authentication token and a key evaluation reference to the UAV, wherein the UAV authenticates the authentication vector based on a message authentication code generated based on the authentication token, the key evaluation reference, and a key encoded on the UAV; program instructions for receiving a response from the UAV, wherein the response is based on the key evaluation reference and the key encoded on the UAV; and program instructions for validating the authentication request based on the response received from the UAV.
According to some aspects of the invention, there may be provided a method of authenticating an authentication center, the method comprising: providing an authentication request from an unmanned aerial vehicle to an authentication center, wherein the authentication request includes an unmanned aerial vehicle identifier; receiving an authentication vector from the authentication center, wherein the authentication vector includes an authentication token and a key evaluation reference, and wherein the authentication token is generated based on the retrieved information corresponding to the UAV identifier; calculating an authentication sequence number based on the authentication token; generating an authentication key based on the key evaluation reference and a key encoded on the UAV; determining a message authentication code based on the authentication token, the authentication sequence number, and the authentication key; and authenticating the authentication center based on at least one of the authentication sequence number and the message authentication code, the message authentication code being determined from the authentication vector received from the authentication center.
According to an aspect of the present invention, there may be provided a system for authenticating an authentication center, the system comprising: an authentication module; a communication module; and one or more processors operatively coupled to the authentication module and the communication module and individually or collectively configured to: provide an authentication request from the UAV to an authentication center, wherein the authentication request includes a UAV identifier; receive an authentication vector from the authentication center, wherein the authentication vector includes an authentication token and a key evaluation reference, and wherein the authentication token is generated based on the retrieved information corresponding to the UAV identifier; calculate an authentication sequence number based on the authentication token; generate an authentication key based on the key evaluation reference and a key encoded on the UAV; determine a message authentication code based on the authentication token, the authentication sequence number, and the authentication key; and authenticate the authentication center based on at least one of the authentication sequence number and the message authentication code, the message authentication code being determined from the authentication vector received from the authentication center.
Aspects of the invention may also relate to a non-transitory computer-readable medium containing program instructions for authenticating an authentication center, the computer-readable medium comprising: program instructions for providing an authentication request from the UAV to an authentication center, wherein the authentication request includes a UAV identifier; program instructions for receiving an authentication vector from the authentication center, wherein the authentication vector includes an authentication token and a key evaluation reference, and wherein the authentication token is generated based on the retrieved information corresponding to the UAV identifier; program instructions for calculating an authentication sequence number based on the authentication token; program instructions for generating an authentication key based on the key evaluation reference and a key encoded on the UAV; program instructions for determining a message authentication code based on the authentication token, the authentication sequence number, and the authentication key; and program instructions for authenticating the authentication center based on at least one of the authentication sequence number and the message authentication code, the message authentication code being determined from the authentication vector received from the authentication center.
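The following Python sketch gives a generic, hedged illustration of the challenge-response exchange outlined above, using an HMAC construction. The message layout, the choice of HMAC-SHA-256, and the rule for deriving the expected response are assumptions made for the example and are not the exact cryptographic scheme of this disclosure.

    import hashlib
    import hmac
    import os

    def mac(key: bytes, *parts: bytes) -> bytes:
        """Message authentication code over the concatenated parts."""
        return hmac.new(key, b"|".join(parts), hashlib.sha256).digest()

    # Authentication-center side: build an authentication vector for the requesting UAV.
    def build_vector(shared_key: bytes):
        token = os.urandom(16)    # authentication token (random challenge)
        key_ref = os.urandom(8)   # key evaluation reference
        expected_response = mac(shared_key, token, key_ref, b"uav-response")
        center_mac = mac(shared_key, token, key_ref, b"center")
        return token, key_ref, center_mac, expected_response

    # UAV side: check the center's MAC with the key encoded on the UAV, then answer the challenge.
    def uav_respond(uav_key: bytes, token: bytes, key_ref: bytes, center_mac: bytes) -> bytes:
        if not hmac.compare_digest(mac(uav_key, token, key_ref, b"center"), center_mac):
            raise ValueError("authentication center rejected")   # mutual authentication fails
        return mac(uav_key, token, key_ref, b"uav-response")

    shared_key = os.urandom(32)   # key encoded on the UAV and known to the authentication center
    token, key_ref, center_mac, expected = build_vector(shared_key)
    response = uav_respond(shared_key, token, key_ref, center_mac)
    print("UAV authenticated:", hmac.compare_digest(response, expected))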
Further, aspects of the invention may relate to a method of warning a user when operation of an Unmanned Aerial Vehicle (UAV) is compromised, the method comprising: authenticating a user to enable operation of the UAV; receiving one or more commands from a remote control that receives user input to effect operation of the UAV; detecting unauthorized communication interfering with one or more commands from the user; and responsive to detection of the unauthorized communication, causing the unmanned aerial vehicle to fly to a predetermined homing point while ignoring the unauthorized communication.
According to an aspect of the invention, there may be provided a non-transitory computer readable medium containing program instructions for alerting a user when operation of an Unmanned Aerial Vehicle (UAV) is compromised, the computer readable medium comprising: program instructions for authenticating a user to enable operation of the UAV; program instructions for receiving one or more commands from a remote control that receives user input to effect operation of the UAV; and program instructions for causing the unmanned aerial vehicle to fly to a predetermined homing point while ignoring the unauthorized communication in response to detecting the unauthorized communication.
According to a further aspect of the invention, there may be provided an Unmanned Aerial Vehicle (UAV) warning system comprising: one or more processors individually or collectively configured to: authenticate a user to enable operation of the UAV; receive one or more commands from a remote control that receives user input to effect operation of the UAV; detect unauthorized communication interfering with one or more commands from the user; and, responsive to detection of the unauthorized communication, cause the unmanned aerial vehicle to fly to a predetermined homing point while ignoring the unauthorized communication.
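A minimal Python sketch of the hijack response described above is given below: once an unauthorized command stream has been detected, subsequent commands are ignored and the UAV is routed to a predetermined homing point. The detection step itself (for example, signature or origin checks) is assumed and stubbed out.

    # Hypothetical predetermined homing point (x, y).
    HOMING_POINT = (0.0, 0.0)

    def handle_command(command: dict, hijack_detected: bool) -> dict:
        """Ignore incoming commands and return home once unauthorized communication is detected."""
        if hijack_detected:
            return {"action": "fly_to", "target": HOMING_POINT, "ignored": command["cmd"]}
        return {"action": command["cmd"]}

    print(handle_command({"cmd": "descend"}, hijack_detected=True))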
It is to be understood that different aspects of the present invention may be understood separately, together or in combination with each other. The various aspects of the invention described herein may be applicable to any of the specific applications set forth below or to any other type of movable object. Any description herein of an aircraft, such as an unmanned aerial vehicle, may be applicable and used with any movable object, such as any vehicle. Additionally, the systems, devices, and methods disclosed herein in the context of airborne motion (e.g., flying) may also be applicable in the context of other types of motion, such as movement on the ground or on water, underwater motion, or motion in space.
Other objects and features of the present invention will become apparent by consideration of the specification, claims and drawings.
Incorporation by Reference
All publications, patents, and patent applications mentioned in this specification are herein incorporated by reference to the same extent as if each individual publication, patent, or patent application was specifically and individually indicated to be incorporated by reference.
Drawings
The novel features of the invention are set forth with particularity in the appended claims. A better understanding of the features and advantages of the present invention will be obtained by reference to the following detailed description that sets forth illustrative embodiments, in which the principles of the invention are utilized, and the accompanying drawings; in the drawings:
FIG. 1 illustrates an example of interaction between one or more users and one or more unmanned aerial vehicles, according to an embodiment of the present invention.
Fig. 2 shows an example of an authentication system according to an embodiment of the present invention.
FIG. 3 illustrates an example of one or more factors that may participate in generating a set of flight controls according to an embodiment of the present invention.
Fig. 4 shows an example of a flight control unit according to an embodiment of the invention.
FIG. 5 illustrates an additional example of a flight control unit according to an embodiment of the present invention.
Fig. 6 shows an example of a flight control unit tracking the identity of a chip on the flight control unit, according to an embodiment of the invention.
FIG. 7 illustrates a diagram of a scenario incorporating multiple types of flight controls, according to an embodiment of the present invention.
Fig. 8 illustrates a process of considering whether a user is authorized to operate the unmanned aerial vehicle according to an embodiment of the present invention.
Fig. 9 illustrates a process of determining whether to allow a user to operate the unmanned aerial vehicle according to an embodiment of the present invention.
FIG. 10 illustrates a graph of flight control levels that may be affected by the degree of certification, according to an embodiment of the present invention.
Fig. 11 illustrates an example of device information that may be stored in a memory according to an embodiment of the present invention.
Fig. 12 shows a diagram of a scenario in which a hijacker attempts to take over control of an unmanned aerial vehicle, according to an embodiment of the invention.
FIG. 13 illustrates an example of unmanned aerial vehicle flight deviations, according to an embodiment of the present invention.
FIG. 14 shows an example of a monitoring system using one or more recorders, according to an embodiment of the present invention.
Fig. 15 shows a diagram of mutual authentication between an unmanned aerial vehicle and an authentication center according to an embodiment of the present invention.
Fig. 16 illustrates a process for sending a message with a cryptographic signature according to an embodiment of the invention.
FIG. 17 illustrates another process for verifying a message by decrypting a signature according to an embodiment of the present invention.
Fig. 18 shows an example of an unmanned aerial vehicle and a geofencing device, in accordance with an embodiment of the present invention.
Fig. 19 illustrates a side view of a geofencing device, a geofence boundary, and an unmanned aerial vehicle, in accordance with an embodiment of the present invention.
Fig. 20 illustrates a system in which a geo-fencing device transmits information directly to an unmanned aerial vehicle, according to an embodiment of the present invention.
Fig. 21 illustrates a system in which an air traffic system can communicate with a geofencing device and/or an unmanned aerial vehicle.
Fig. 22 illustrates a system in which an unmanned aerial vehicle detects a geofencing device, according to an embodiment of the present invention.
Fig. 23 illustrates an example of an unmanned aerial vehicle system in which the unmanned aerial vehicle and the geofencing device need not communicate directly with each other, in accordance with an embodiment of the present invention.
FIG. 24 illustrates an example of a geo-fencing device that may have multiple flight restriction zones.
FIG. 25 illustrates a process for generating a set of flight controls according to an embodiment of the present invention.
Fig. 26 illustrates a process for authenticating a geo-fence device according to an embodiment of the present invention.
Fig. 27 illustrates another example of device information that may be stored in a memory according to an embodiment of the present invention.
FIG. 28 illustrates a geo-fencing device that can provide different sets of flight restrictions in different scenarios, according to embodiments of the present invention.
FIG. 29 illustrates an example of a geo-fence device with multiple sets of flight controls that can change over time according to an embodiment of the present invention.
Fig. 30 illustrates a scenario in which an unmanned aerial vehicle may be provided within an overlapping region of multiple geofencing devices, according to an embodiment of the present invention.
Fig. 31 illustrates an example of different regulations for different geo-fencing devices in accordance with an aspect of the present invention.
Fig. 32 illustrates an example of a mobile geo-fencing device in accordance with an embodiment of the present invention.
Fig. 33 illustrates an example of mobile geofencing devices in proximity to each other, in accordance with an embodiment of the present invention.
Fig. 34 illustrates another example of a mobile geo-fencing device in accordance with an embodiment of the present invention.
Fig. 35 illustrates an example of a user interface showing information about one or more geo-fencing devices, according to an embodiment of the present invention.
Fig. 36 illustrates an unmanned aerial vehicle according to an embodiment of the invention.
Fig. 37 illustrates a movable object including a carrier and a payload according to an embodiment of the invention.
FIG. 38 illustrates a system for controlling a movable object according to an embodiment of the present invention.
Fig. 39 illustrates different types of communications between an unmanned aerial vehicle and a geofencing device, in accordance with an embodiment of the present invention.
Fig. 40 illustrates an example with multiple geo-fence devices (each with a corresponding geo-fence identifier) according to an embodiment of the present invention.
Fig. 41 illustrates an example of an unmanned aerial vehicle system in which an air traffic system interacts with a plurality of unmanned aerial vehicles and a plurality of geofencing devices, in accordance with an embodiment of the present invention.
Fig. 42 illustrates an example of an environment with an unmanned aerial vehicle that can be traversing a flight path and one or more geo-fencing devices within the environment.
Fig. 43 provides an example of an apparatus that can accept user input to control one or more geo-fencing devices in accordance with an embodiment of the present invention.
Fig. 44 provides an illustration of how a geo-fencing device according to an embodiment of the present invention may be used with a private home to limit the use of unmanned aerial vehicles.
Fig. 45 provides an illustration of how a geo-fence device according to an embodiment of the present invention may be used to block an unmanned aerial vehicle.
Detailed Description
An unmanned vehicle, such as an Unmanned Aerial Vehicle (UAV), may operate according to a safety system for improving flight safety of the unmanned vehicle. Any description herein of an unmanned aerial vehicle may apply to any type of unmanned vehicle (e.g., sky-based vehicle, land-based vehicle, water-based vehicle, or space-based vehicle). Flight control and certification systems and methods may be provided that facilitate monitoring and controlling the use of unmanned aerial vehicles. The system can uniquely identify the parties (e.g., user, remote control, unmanned aerial vehicle, geo-fencing device) that are interacting. In some cases, the certification process may occur and only authorized parties may be allowed to operate the unmanned aerial vehicle. Flight controls may be imposed on the operation of the UAV and may override manual controls by the user. The geofencing device may be used to provide information about flight controls or to assist in the flight control process. The geofence device can provide a physical reference for one or more geofence boundaries, which can be associated with a corresponding set of flight restrictions.
Flight safety challenges during use of unmanned aerial vehicles can arise in many different forms. For example, traditionally the flight of unmanned aerial vehicles has not been restricted (e.g., an unmanned aerial vehicle may fly over a location where flight should be prohibited). For example, an unmanned aerial vehicle may be flown into a sensitive area (e.g., an airport, a military base) without authorization. Furthermore, the unmanned aerial vehicle may fly into the flight paths of other aircraft without authorization. Unmanned aerial vehicles may fly over commercial or private property without authorization, causing noise pollution, personal injury, and property damage. In some cases, unmanned aerial vehicles may fly over public areas without authorization and may cause personal injury and property damage. The systems and methods provided herein may provide a set of flight controls that may impose necessary restrictions on the UAV, which may be geographically based, time-based, and/or activity-based. The unmanned aerial vehicle can automatically comply with flight regulations without input from a user. In some cases, control of the UAV may be generated based on flight controls that may override manual inputs from a user.
The flight of the unmanned aerial vehicle may be controlled by a user by means of one or more remote controls. In some cases, there is a potential risk that the flight is hijacked. A hijacker may interfere with an authorized user's instructions for the unmanned aerial vehicle. If the unmanned aerial vehicle receives and accepts counterfeit instructions, it may carry out unintended tasks with undesirable consequences. The systems and methods provided herein can identify when a hijacking occurs. The systems and methods may alert the user when a hijacking occurs. The systems and methods may also cause the unmanned aerial vehicle to take an action in response to the detected hijacking, and may override the control of the hijacker.
The unmanned aerial vehicle may carry various sensors on board the vehicle that may be used to obtain data. A hacker may attempt to steal the obtained data. For example, data from an unmanned aerial vehicle may be intercepted, or data transmitted to the ground over a remote wireless link may be intercepted. Systems and methods provided herein may provide encryption and authentication so that only authorized users may receive data.
In another example of an unmanned aerial vehicle flight safety challenge, unmanned aerial vehicles may be abused. Traditionally, measures for warning, identification, or deterrence of violations have been lacking, particularly when unmanned aerial vehicle operators intentionally abuse unmanned aerial vehicles. For example, unmanned aerial vehicles may be used for illegal advertising, unauthorized attacks, or privacy violations (e.g., unauthorized candid photography). The systems and methods provided herein may monitor usage, which may help identify when abuse of the UAV has occurred. The collected data may also be used to forensically track the parties involved in the abuse, or any other relevant data. Systems and methods may also be provided that alert a user or other entity when abuse occurs and/or override any controls supporting the abuse.
When operating, the UAV may wirelessly transmit or receive data. In some cases, unmanned aerial vehicles may abuse wireless resources and/or aviation resources, which may result in a waste of common resources. For example, an unmanned aerial vehicle may interfere with authorized communications or steal bandwidth from other communications. The systems and methods provided herein may identify when such activity occurs and may provide a warning or prevent such interference from occurring.
In general, there are challenges in supervising the operation of unmanned aerial vehicles. As different types of unmanned aerial vehicles become increasingly common in different types of uses, authorization systems for unmanned aerial vehicle flight have traditionally been lacking. It has traditionally been difficult to distinguish abnormal flight from normal flight; to detect small unmanned aerial vehicles; to visually detect unmanned aerial vehicles flying at night; to track and penalize anonymous flights; and/or to associate the flight of an unmanned aerial vehicle with its user or owner in a non-repudiable manner. The systems and methods described herein may address one or more of these objectives. Identification data may be collected and one or more identifiers may be authenticated. It has also traditionally been difficult to provide security control due to a lack of one or more of the following: a secure channel between a supervisor and an owner or user of the unmanned aerial vehicle, a direct alert or warning mechanism, a legal mechanism for the supervisor to take over control, a mechanism for the unmanned aerial vehicle to distinguish the supervisor from a hijacker, and a measure to force an offending unmanned aerial vehicle to halt; the systems and methods provided herein may provide one or more of these functions.
Similarly, there is a need for an assessment or ranking mechanism for the performance, capability, and authority of unmanned aerial vehicles. There is also a need for an evaluation or review mechanism for the operational skills and records of the UAV user. The systems and methods provided herein may advantageously provide this type of assessment. Optionally, flight controls may be generated and implemented based on the evaluation.
As previously mentioned, conventional unmanned aerial vehicle systems lack safeguards regarding unmanned aerial vehicle flight safety. For example, there may be a lack of warning mechanisms for flight safety, a lack of information-sharing mechanisms for the flight environment, or a lack of an emergency rescue mechanism. The flight safety systems and methods described herein may provide one or more of the safeguards described above.
Overview of the System
FIG. 1 illustrates an example of interactions between one or more users 110a, 110b, 110c and one or more unmanned aerial vehicles 120a, 120b, 120c. The users may interact with the unmanned aerial vehicles by means of remote controls 115a, 115b, 115c. The authentication system may include a memory store 130 that may store information about the users, remote controls, and/or unmanned aerial vehicles.
The users 110a, 110b, 110c may be individuals associated with the UAV. The user may be an operator of the unmanned aerial vehicle. The user may be an individual authorized to operate the unmanned aerial vehicle. The user may provide input to control the unmanned aerial vehicle. The user may provide input to control the UAV using remote controls 115a, 115b, 115c. The user may provide user input that controls the flight of the UAV, the operation of a payload of the UAV, the state of the payload relative to the UAV, the operation of one or more sensors of the UAV, the operation of UAV communications, or other functions of the UAV. The user may receive data from the UAV. Data obtained using one or more sensors of the UAV may be provided to a user, optionally via a remote control. The user may be the owner of the unmanned aerial vehicle. The user may be a registered owner of the unmanned aerial vehicle. The user may register as being authorized to operate the unmanned aerial vehicle. The user may be a human operator. The user may be an adult or a child. The user may or may not have a line of sight to the unmanned aerial vehicle while operating the unmanned aerial vehicle. The user may communicate directly with the UAV using a remote control. Alternatively, the user may communicate with the unmanned aerial vehicle indirectly via a network (optionally using a remote control).
A user may have a user identifier (e.g., user ID1, user ID2, user ID3...) that identifies the user. The user identifier may be unique to the user. Other users may have identifiers different from that of the user. The user identifier may uniquely identify and/or distinguish the user from other individuals. Each user may be assigned only a single user identifier. Alternatively, the user may be able to register multiple user identifiers. In some cases, a single user identifier may be assigned to only a single user. Alternatively, a single user identifier may be shared by multiple users. In a preferred embodiment, a one-to-one correspondence may be provided between users and corresponding user identifiers.
Optionally, the user may be authenticated as an authorized user of the user identifier. The authentication process may include verification of the identity of the user. Examples of authentication processes are described in more detail elsewhere herein.
The unmanned aerial vehicles 120a, 120b, 120c may be operable when energized. The unmanned aerial vehicle may be in flight or may be in a landed state. The unmanned aerial vehicle may use one or more sensors (alternatively, the payload may be a sensor) to collect the data. The UAV may operate in response to controls from a user (e.g., manually via a remote control), autonomously (e.g., without user input), or semi-autonomously (e.g., may include some user input but may also include aspects that do not rely on user input). The unmanned aerial vehicle may be capable of responding to commands from remote controls 115a, 115b, 115 c. The remote control may not be connected to the UAV, and the remote control may communicate wirelessly with the UAV from a distance. The remote control may accept and/or detect user input. The unmanned aerial vehicle may be capable of following a set of pre-programmed instructions. In some cases, the unmanned aerial vehicle may operate semi-autonomously by responding to one or more commands from a remote control, and otherwise operate autonomously. For example, one or more commands from the remote control may initiate a series of autonomous or semi-autonomous actions by the UAV based on one or more parameters. The unmanned aerial vehicle may be switched between manual operation, autonomous operation, and/or semi-autonomous operation. In some cases, the activities of the unmanned aerial vehicle may be governed by one or more sets of flight controls.
The unmanned aerial vehicle may have one or more sensors. The unmanned aerial vehicle may include one or more vision sensors, such as image sensors. For example, the image sensor may be a monocular camera, a stereo vision camera, a radar, a sonar, or an infrared camera. The unmanned aerial vehicle may also include other sensors that may be used to determine the position of the unmanned aerial vehicle, such as Global Positioning System (GPS) sensors, inertial sensors (e.g., accelerometers, gyroscopes, magnetometers) that may be used as part of or separate from an Inertial Measurement Unit (IMU), lidar, ultrasonic sensors, acoustic sensors, or WiFi sensors. Various examples of sensors may include, but are not limited to, a location sensor (e.g., a Global Positioning System (GPS) sensor, a mobile device transmitter supporting location triangulation), a visual sensor (e.g., an imaging device such as a camera capable of detecting visible, infrared, or ultraviolet light), a distance or range sensor (e.g., an ultrasonic sensor, a lidar, a time-of-flight camera, or a depth camera), an inertial sensor (e.g., an accelerometer, a gyroscope, an Inertial Measurement Unit (IMU)), an altitude sensor, an attitude sensor (e.g., a compass), a pressure sensor (e.g., a barometer), an audio sensor (e.g., a microphone), or a field sensor (e.g., a magnetometer, an electromagnetic sensor). Any suitable number and combination of sensors may be used, such as one, two, three, four, five, or more sensors.
Optionally, data may be received from different types (e.g., two, three, four, five, or more types) of sensors. Different types of sensors may measure different types of signals or information (e.g., position, orientation, velocity, acceleration, distance, pressure, etc.) and/or utilize different types of measurement techniques to acquire data. For example, the sensors may include any suitable combination of active sensors (e.g., sensors that generate and measure energy from their respective energy sources) and passive sensors (e.g., sensors that detect available energy). As another example, some sensors may generate absolute measurement data provided in terms of a global coordinate system (e.g., position data provided by a GPS sensor, attitude data provided by a compass or magnetometer), while other sensors may generate relative measurement data provided in terms of a local coordinate system (e.g., relative angular velocity provided by a gyroscope; relative translational acceleration provided by an accelerometer; relative attitude information provided by a vision sensor; relative distance information provided by an ultrasonic sensor, lidar, or time-of-flight camera). Sensors on-board or off-board the unmanned aerial vehicle may collect information such as the location of the unmanned aerial vehicle, the location of other objects, the orientation of the unmanned aerial vehicle, or environmental information. A single sensor may be capable of collecting a complete set of information in the environment, or a set of sensors may operate together to collect a complete set of information in the environment. The sensors may be used for mapping of locations, navigation between locations, detection of obstacles, or detection of targets. The sensors may be used for monitoring of the environment or a subject of interest. The sensors may be used to identify a target object. The target object may be distinguished from other objects in the environment.
The unmanned aerial vehicle may be an aircraft. An unmanned aerial vehicle may have one or more propulsion units that may allow the unmanned aerial vehicle to move about in the air. The one or more propulsion units may enable the unmanned aerial vehicle to move with respect to one or more, two or more, three or more, four or more, five or more, six or more degrees of freedom. In some cases, the unmanned aerial vehicle may be rotatable about one, two, three, or more axes of rotation. The axes of rotation may be orthogonal to each other. The axes of rotation may remain orthogonal to each other throughout the flight of the UAV. The axis of rotation may include a pitch axis, a roll axis, and/or a yaw axis. The unmanned aerial vehicle may be capable of moving along one or more dimensions. For example, the unmanned aerial vehicle may be able to move upward due to lift generated by one or more rotors. In some cases, the unmanned aerial vehicle may be movable along a Z-axis (which may be upward with respect to the orientation of the unmanned aerial vehicle), an X-axis, and/or a Y-axis (which may be lateral). The unmanned aerial vehicle may be capable of moving along one, two, or three axes that may be orthogonal to each other.
The unmanned aerial vehicle may be a rotorcraft. In some cases, the unmanned aerial vehicle may be a multi-rotor aircraft that may include multiple rotors. The plurality of rotors may be rotatable to generate lift for the unmanned aerial vehicle. The rotor may be a propulsion unit that may enable the unmanned aerial vehicle to move freely in the air. The rotors may rotate at the same rate and/or may generate an equal amount of lift or thrust. The rotors may optionally rotate at varying rates, which may generate unequal amounts of lift or thrust and/or allow the unmanned aerial vehicle to rotate. In some cases, one, two, three, four, five, six, seven, eight, nine, ten, or more rotors may be provided on an unmanned aerial vehicle. The rotors may be arranged so that their axes of rotation are parallel to each other. In some cases, the rotors may have axes of rotation at any angle relative to each other, which may affect the motion of the unmanned aerial vehicle.
The illustrated unmanned aerial vehicle can have multiple rotors. The rotors may be coupled to a fuselage of the unmanned aerial vehicle, which may include a control unit, one or more sensors, a processor, and a power source. The sensors may include visual sensors and/or other sensors that may gather information about the environment of the unmanned aerial vehicle. Information from the sensors may be used to determine the position of the UAV. The rotors may be connected to the fuselage via one or more arms or extensions that may branch from a central portion of the fuselage. For example, one or more arms may extend radially from a central fuselage of the UAV and may have rotors at or near the distal ends of the arms. In another example, the unmanned aerial vehicle can include one or more arms that include one or more additional support members that can have one, two, three, or more rotors attached thereto. For example, a T-shaped configuration may be used to support the rotors.
By maintaining and/or adjusting the output to one or more propulsion units of the unmanned aerial vehicle, the vertical position and/or speed of the unmanned aerial vehicle may be controlled. For example, increasing the rotational speed of one or more rotors of the unmanned aerial vehicle may help cause the unmanned aerial vehicle to increase in altitude or increase in altitude at a faster rate. Increasing the rotational speed of the one or more rotors may increase the thrust of the rotors. Reducing the rotational speed of one or more rotors of the unmanned aerial vehicle can help cause the unmanned aerial vehicle to lower altitude or to lower altitude at a faster rate. Reducing the rotational speed of the one or more rotors may reduce the thrust of the one or more rotors. When the unmanned aerial vehicle takes off, the output available to the propulsion unit may be increased from its previous landed state. When the UAV is landing, the output provided to the propulsion unit may be reduced from its previous flight state. The unmanned aerial vehicle may be configured to take off and/or land in a substantially vertical manner.
By maintaining and/or adjusting the output to one or more propulsion units of the unmanned aerial vehicle, the lateral position and/or speed of the unmanned aerial vehicle may be controlled. The altitude of the unmanned aerial vehicle and the rotational speed of one or more rotors of the unmanned aerial vehicle may affect the lateral movement of the unmanned aerial vehicle. For example, the unmanned aerial vehicle may be tilted in a particular direction to move in that direction, and the speed of the unmanned aerial vehicle's rotors may affect the speed and/or trajectory of the lateral movement. The lateral position and/or speed of the unmanned aerial vehicle can be controlled by varying or maintaining the rotational speed of one or more rotors of the unmanned aerial vehicle.
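The disclosure does not specify a particular control law, but the relationship described above between rotor output and vertical or lateral motion can be illustrated with a minimal motor-mixing sketch for a four-rotor configuration. The sketch below is illustrative only; the X-configuration geometry, the sign conventions, and the normalized command ranges are assumptions made for the example and are not part of any specific embodiment.

def mix_x_quadrotor(thrust, roll, pitch, yaw):
    # thrust is a normalized command in [0, 1]; roll, pitch, yaw are in [-1, 1].
    # Raising `thrust` raises all rotor speeds, increasing altitude or climb rate;
    # the differential roll/pitch terms tilt the vehicle, producing lateral motion.
    front_left = thrust + roll + pitch - yaw
    front_right = thrust - roll + pitch + yaw
    rear_right = thrust - roll - pitch - yaw
    rear_left = thrust + roll - pitch + yaw
    # Clamp each motor command to the achievable normalized range [0, 1].
    return [min(1.0, max(0.0, m)) for m in (front_left, front_right, rear_right, rear_left)]

# Example: climb while tilting slightly about the pitch axis to move laterally.
print(mix_x_quadrotor(thrust=0.6, roll=0.0, pitch=0.1, yaw=0.0))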
The unmanned aerial vehicle can have a small size. The unmanned aerial vehicle may be capable of being handled and/or carried by a human. The unmanned aerial vehicle may be capable of being carried by a human hand.
The unmanned aerial vehicle can have a maximum dimension (e.g., length, width, height, diagonal, diameter) of no more than 100 cm. In some cases, the maximum dimension can be less than or equal to 1mm, 5mm, 1cm, 3cm, 5cm, 10cm, 12cm, 15cm, 20cm, 25cm, 30cm, 35cm, 40cm, 45cm, 50cm, 55cm, 60cm, 65cm, 70cm, 75cm, 80cm, 85cm, 90cm, 95cm, 100cm, 110 cm, 120cm, 130cm, 140cm, 150cm, 160cm, 170cm, 180cm, 190cm, 200cm, 220cm, 250cm, or 300 cm. Alternatively, the maximum dimension of the unmanned aerial vehicle can be greater than or equal to any of the values described herein. The unmanned aerial vehicle can have a maximum dimension that falls within a range between any two values described herein.
The unmanned aerial vehicle may be lightweight. For example, the weight of the unmanned aerial vehicle may be less than or equal to 1mg, 5mg, 10mg, 50mg, 100mg, 500mg, 1g, 2g, 3g, 5g, 7g, 10g, 12g, 15g, 20g, 25g, 30g, 35g, 40g, 45g, 50g, 60g, 70g, 80g, 90g, 100g, 120g, 150g, 200g, 250g, 300g, 350g, 400g, 450g, 500g, 600g, 700g, 800g, 900g, 1kg, 1.1kg, 1.2kg, 1.3kg, 1.4kg, 1.5kg, 1.7kg, 2kg, 2.2kg, 2.5kg, 3kg, 3.5kg, 4kg, 4.5kg, 5.5kg, 6kg, 6.5kg, 7kg, 7.5kg, 8.5kg, 9kg, 9.5kg, 11kg, 14kg, or 15kg. The unmanned aerial vehicle can have a weight greater than or equal to any of the values described herein. The unmanned aerial vehicle can have a weight that falls within a range between any two values described herein.
The unmanned aerial vehicle may have an unmanned aerial vehicle identifier (e.g., unmanned aerial vehicle ID1, unmanned aerial vehicle ID2, unmanned aerial vehicle ID3...) that identifies the unmanned aerial vehicle. The UAV identifier may be unique to the UAV. Other unmanned aerial vehicles may have identifiers different from that of the unmanned aerial vehicle. The UAV identifier may uniquely identify and/or distinguish the UAV from other UAVs. Each unmanned aerial vehicle may be assigned only a single unmanned aerial vehicle identifier. Alternatively, multiple UAV identifiers may be registered for a single UAV. In some cases, a single UAV identifier may be assigned to only a single UAV. Alternatively, a single UAV identifier may be shared by multiple UAVs. In a preferred embodiment, a one-to-one correspondence may be provided between the unmanned aerial vehicle and the corresponding unmanned aerial vehicle identifier.
Optionally, the unmanned aerial vehicle may be certified as an authorized unmanned aerial vehicle for the unmanned aerial vehicle identifier. The authentication process may include verification of the identity of the unmanned aerial vehicle. Examples of authentication processes are described in more detail elsewhere herein.
In some implementations, the remote control may have a remote control identifier that identifies the remote control. The remote control identifier may be unique to the remote control. Other remote controls may have identifiers different from that of the remote control. The remote control identifier may uniquely identify and/or distinguish the remote control from other remote controls. Each remote control may be assigned only a single remote control identifier. Alternatively, a single remote control may be assigned multiple remote control identifiers. In some cases, a single remote control identifier may be assigned to only a single remote control. Alternatively, a single remote control identifier may be shared by multiple remote controls. In a preferred embodiment, a one-to-one correspondence may be provided between a remote control and a corresponding remote control identifier. The remote control identifier may or may not be associated with a corresponding user identifier.
Optionally, the remote control may be authenticated as an authorized remote control for the remote control identifier. The authentication process may include verification of the identity of the remote control. Examples of authentication processes are described in more detail elsewhere herein.
The remote control may be any type of device. The device may be a computer (e.g., personal computer, laptop, server), a mobile device (e.g., smartphone, cellular phone, tablet, personal digital assistant), or any other type of device. The device may be a network device capable of communicating over a network. The apparatus includes one or more memory storage units, which may include a non-transitory computer-readable medium that may store code, logic, or instructions for performing one or more of the steps described elsewhere herein. The apparatus may include one or more processors that may perform one or more steps individually or collectively according to code, logic, or instructions of a non-transitory computer readable medium as described herein. The remote control may be hand-held. The remote control may accept input from the user via any user interaction mechanism. In one example, the device may have a touch screen that may record user input when a user touches or slides the screen. The device may have any other type of user interaction component, such as a button, mouse, joystick, trackball, touch pad, stylus, inertial sensor, image capture device, motion capture device, or microphone. The device may sense when the device is tilted, which may affect the operation of the unmanned aerial vehicle. The remote control may be a single piece configured to perform the various functions of the remote control described elsewhere herein. Alternatively, the remote control may be provided as multiple parts or components that may individually or collectively perform the various functions of the remote control as provided elsewhere herein.
The authentication system may include a memory store 130 that may store information about the user, remote control, and/or unmanned aerial vehicle. The memory storage may include one or more memory storage units. The one or more memory storage units may be provided collectively or may be distributed over a network and/or at different locations. In some cases, the memory storage may be a cloud storage system. The memory storage may include one or more databases that store information.
The information may include identification information about the user, the remote control, and/or the UAV. For example, the identification information may include a user identifier (e.g., user ID1, user ID2, user ID3...) and/or an unmanned aerial vehicle identifier (e.g., unmanned aerial vehicle ID1, unmanned aerial vehicle ID2, unmanned aerial vehicle ID3...). The remote control identifier may also optionally be stored. The information may be stored in long-term memory storage or may be stored only for a short period of time. The information may be received and buffered.
Fig. 1 illustrates a scenario in which each user 110a, 110b, 110c may control a corresponding unmanned aerial vehicle 120a, 120b, 120 c. For example, the first user 110a may control the first UAV 120a via a remote control. The second user 110b may control the second unmanned aerial vehicle 120b by means of a remote control. The third user 110c may control the third unmanned aerial vehicle 120c by means of a remote control. The users may be remote from each other. Alternatively, the user may operate the unmanned aerial vehicle in the same region. The user may operate their respective unmanned aerial vehicles simultaneously, or may operate them at different times. The usage times may overlap. The users and unmanned aerial vehicles may be individually identifiable such that instructions from each user may be accepted only by the corresponding unmanned aerial vehicle and not by the other unmanned aerial vehicles. This may reduce the likelihood of interfering signals when multiple unmanned aerial vehicles are operating simultaneously.
Each user may control the corresponding user's unmanned aerial vehicle. The user may pre-register the unmanned aerial vehicle such that only authorized users may control the corresponding unmanned aerial vehicle. The unmanned aerial vehicle may be pre-registered so that the user may only control authorized unmanned aerial vehicles. The relationship and/or association between the user and the UAV may be known. Optionally, the relationship and/or association between the user and the UAV may be stored in memory storage 130. The user identifier may be associated with an unmanned aerial vehicle identifier of a corresponding unmanned aerial vehicle.
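As a purely illustrative aid (not part of any specific embodiment), the association between user identifiers and unmanned aerial vehicle identifiers described above might be kept in a simple registry such as the following sketch. A hypothetical in-memory store is assumed; a deployed system could instead use the memory storage 130 or a database.

class RegistrationStore:
    """Hypothetical registry binding each UAV identifier to the user identifier
    of its registered (authorized) user."""

    def __init__(self):
        self._registered_user_for_uav = {}  # uav_id -> user_id

    def register(self, user_id: str, uav_id: str) -> None:
        # Enforce a one-to-one correspondence: a UAV has only one registered user here.
        if uav_id in self._registered_user_for_uav:
            raise ValueError(f"UAV {uav_id} is already registered")
        self._registered_user_for_uav[uav_id] = user_id

    def is_authorized(self, user_id: str, uav_id: str) -> bool:
        # A command is accepted only if the issuing user is registered for this UAV.
        return self._registered_user_for_uav.get(uav_id) == user_id

store = RegistrationStore()
store.register("user-ID1", "uav-ID1")
assert store.is_authorized("user-ID1", "uav-ID1")
assert not store.is_authorized("user-ID2", "uav-ID1")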
The memory storage unit may track user commands to the UAV. The stored commands may be associated with a corresponding user identifier of the user and/or a corresponding unmanned aerial vehicle identifier of the unmanned aerial vehicle. Optionally, an identifier of the corresponding remote control may also be stored.
The identity of the devices or parties involved in the operation of the UAV may be authenticated. For example, the identity of the user may be authenticated. The user may be authenticated as the user associated with the user identifier. The identity of the unmanned aerial vehicle can be authenticated. The unmanned aerial vehicle can be verified as the unmanned aerial vehicle associated with the unmanned aerial vehicle identifier. The identity of the remote control may optionally be authenticated. The remote control may be verified as the remote control associated with the remote control identifier.
Fig. 2 shows an example of an authentication system according to an embodiment of the present invention. The authentication system may be an unmanned aircraft safety system or may operate as part of an unmanned aircraft safety system. The authentication system may provide improved unmanned aerial vehicle security. The authentication system may authenticate the user, the UAV, the remote control, and/or the geofencing device.
The authentication system may include an Identification (ID) registration database 210. The ID registration database may be in communication with the authentication center 220. The authentication system may be in communication with an air traffic management system 230, which may include a flight supervision module 240, a flight control module 242, a traffic management module 244, a user access control module 246, and an unmanned aerial vehicle access control module 248.
The ID registration database 210 may maintain identity information for the users 250a, 250b, 250c and the unmanned aerial vehicles 260a, 260b, 260 c. The ID registration database may assign a unique identifier (connection 1) to each user and each unmanned aerial vehicle. The unique identifier may optionally be a randomly generated alphanumeric string or any other type of identifier that may uniquely identify the user from other users or the unmanned aerial vehicle from other unmanned aerial vehicles. The unique identifier may be generated by an ID registration database or may be selected from a list of possible identifiers that remain unassigned. The ID registration database may optionally assign unique identifiers to the geofence devices and/or remote controls or any other devices that may be involved in an unmanned aircraft security system. The identifier may be used to authenticate the user, the UAV, and/or another device. The ID registration database may or may not interact with one or more users or one or more unmanned aerial vehicles.
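One possible way to realize the assignment of globally unique, randomly generated alphanumeric identifiers is sketched below for illustration only; the identifier length and hexadecimal encoding are assumptions, not requirements of the disclosure.

import secrets

def assign_identifier(existing_ids: set) -> str:
    """Draw a random 128-bit hexadecimal identifier that is not already assigned."""
    while True:
        candidate = secrets.token_hex(16)
        if candidate not in existing_ids:  # re-draw in the unlikely event of a collision
            existing_ids.add(candidate)
            return candidate

assigned = set()
user_id = assign_identifier(assigned)
uav_id = assign_identifier(assigned)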
The authentication center 220 may provide authentication of the identity of the user 250a, 250b, 250c or the UAV 260a, 260b, 260 c. The authentication center may optionally provide authentication for the identity of the geo-fence device and/or the remote control or any other device that may be involved in the unmanned aircraft security system. The certification authority may obtain information about the user and the unmanned aerial vehicle (and/or any other devices involved in the unmanned aerial vehicle security system) from the ID registration database 210 (connection 2). Further details regarding the authentication process are provided elsewhere herein.
The air traffic system 230 may interact with the authentication center 220. The air traffic system may obtain information about the user and the unmanned aerial vehicle (and/or any other devices involved in the unmanned aerial vehicle safety system) from the authentication center (connection 4). The information may include a user identifier and an unmanned aerial vehicle identifier. The information may relate to a confirmation or identification of the identity of the user and/or the identity of the unmanned aerial vehicle. The air management system may be a management cluster that may include one or more subsystems such as a flight supervision module 240, a flight control module 242, a traffic management module 244, a user access control module 246, and an unmanned aerial vehicle access control module 248. The one or more subsystems may be used for flight control, air traffic control, related authorization, user and unmanned aerial vehicle access management, and other functions.
In one example, the flight supervision module/subsystem 240 may be used to monitor the flight of the unmanned aerial vehicle within the allotted airspace. The flight supervision module may be configured to detect when one or more unmanned aerial vehicles deviate from a predetermined course. The flight supervision module may detect when one or more unmanned aerial vehicles perform an unauthorized action or an action that is not entered by the user. The flight supervision module may also detect when one or more unauthorized unmanned aircraft enter the allotted airspace. The flight supervision module may issue alerts or warnings to the unauthorized unmanned aerial vehicle. A warning can be provided to a remote control of a user operating the unauthorized unmanned aerial vehicle. The warning can be issued visually, audibly, or tactilely.
The flight supervision module may utilize data collected by one or more sensors on the UAV. The flight supervision module may utilize data collected by one or more sensors off-board the UAV. Data may be collected by radar, photoelectric sensors, or acoustic sensors that may monitor unmanned aerial vehicles or other activities within the allotted airspace. The data may be collected by one or more base stations, docking stations, battery stations, geo-fencing devices, or networks. The data may be collected by stationary devices. The stationary device may or may not be configured for physical interaction with the UAV (e.g., to replenish energy for the UAV, to accept delivery from the UAV, or to provide repairs to the UAV). The data may be provided via wired or wireless communication.
The air traffic management system may also include a flight control module/subsystem 242. The flight control module may be configured to generate and store one or more sets of flight controls. Air traffic may be regulated based on a set of flight controls. The generation of the flight controls may include creating the flight controls from scratch, or may include selecting one or more sets of flight controls from a plurality of sets of flight controls. The generation of the flight controls may include combining selected sets of flight controls.
The unmanned aerial vehicle may operate according to one or more sets of imposed flight controls. Flight controls may regulate any aspect of the operation of the unmanned aerial vehicle (e.g., flight, sensors, communications, payload, navigation, power usage, items carried). For example, flight controls may indicate where an unmanned aerial vehicle may or may not be flying. Flight controls may indicate when an unmanned aerial vehicle may or may not be flying in a particular region. Flight control may indicate when data may be collected, transmitted, and/or recorded by one or more sensors on the UAV. Flight controls may indicate when the payload may be operational. For example, the payload may be an image capture device, and flight controls may indicate when and where the image capture device may capture images, transmit images, and/or store images. Flight regulations may specify how communication may occur (e.g., channels or methods that may be used) or what type of communication may occur.
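For illustration only, a set of flight regulations of the kinds listed above could be represented as a structured record. The field names below are hypothetical and merely show that geographic, altitude, temporal, kinematic, and payload-related rules can be bundled and looked up together; they are not the data model of the disclosure.

from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class FlightRegulation:
    # Lateral boundary expressed as a circle in a local frame: (center_x_m, center_y_m, radius_m).
    boundary: Optional[Tuple[float, float, float]] = None
    allow_inside: bool = True          # True: flight only inside the boundary; False: keep out
    min_altitude_m: Optional[float] = None
    max_altitude_m: Optional[float] = None
    max_speed_mps: Optional[float] = None
    allowed_hours: Optional[Tuple[int, int]] = None  # (start_hour, end_hour), local time
    camera_allowed: bool = True

# A set of flight regulations is simply a collection of such records.
near_airport_rules = [
    FlightRegulation(boundary=(0.0, 0.0, 5000.0), allow_inside=False),  # keep-out zone
    FlightRegulation(max_altitude_m=120.0, max_speed_mps=18.0, camera_allowed=False),
]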
The flight control module may include one or more databases that store information regarding flight control. For example, the one or more databases may store one or more locations where flight of the unmanned aerial vehicle is restricted. The flight control module may store multiple sets of flight controls for multiple types of unmanned aerial vehicles, and the multiple sets of flight controls may be associated with a particular unmanned aerial vehicle. It may be possible to access a set of flight controls associated with a particular type of unmanned aerial vehicle of multiple types of unmanned aerial vehicles.
The flight control module may approve or reject one or more flight plans of the unmanned aerial vehicle. In some cases, a flight plan may be assigned that includes a proposed flight path for the unmanned aerial vehicle. A flight path may be provided with respect to the unmanned aerial vehicle and/or the environment. The flight path may be fully defined (defining all points along the path), semi-defined (e.g., may include one or more waypoints, but the path to the waypoints may be variable), or less defined (e.g., may include a final destination or other parameters, but the path to the final destination may be undefined). The flight control module may receive the flight plan and may approve or reject the flight plan. The flight control module may reject the flight plan if the flight plan contradicts a set of flight controls for the UAV. The flight control module may suggest modifications to the flight plan that may cause the flight plan to comply with the set of flight controls. The flight control module may generate or suggest a set of flight plans for the unmanned aerial vehicle that may comply with the set of flight controls. The user may input one or more parameters or goals for the unmanned aerial vehicle mission, and the flight control module may generate or suggest a set of flight plans that may satisfy the one or more parameters while adhering to the set of flight controls. Examples of parameters or targets for the UAV mission may include a destination, one or more waypoints, timing requirements (e.g., overall time limit, time to be at certain locations), maximum speed, maximum acceleration, type of data to collect, type of image to capture, any other parameter or target.
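A hedged sketch of how a flight control module might approve or reject a fully defined flight plan against a set of restrictions is given below. Circular keep-out zones and a single altitude ceiling are assumed purely to keep the example short; real restriction geometries could be arbitrary, and the function names are hypothetical.

import math

def approve_flight_plan(waypoints, keep_out_zones, max_altitude_m):
    """waypoints: list of (x_m, y_m, altitude_m) in a local frame.
    keep_out_zones: list of (center_x_m, center_y_m, radius_m).
    Returns (approved, reason)."""
    for x, y, alt in waypoints:
        if alt > max_altitude_m:
            return False, f"waypoint exceeds the altitude ceiling ({alt} m > {max_altitude_m} m)"
        for cx, cy, radius in keep_out_zones:
            if math.hypot(x - cx, y - cy) <= radius:
                return False, "waypoint falls inside a restricted zone"
    return True, "flight plan complies with the applicable flight regulations"

plan = [(0.0, 0.0, 30.0), (400.0, 250.0, 60.0), (900.0, 500.0, 90.0)]
print(approve_flight_plan(plan, keep_out_zones=[(1000.0, 500.0, 50.0)], max_altitude_m=120.0))

A rejected plan could be returned to the user together with the reason, supporting the suggested-modification behavior described above.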
A traffic management module/subsystem 244 may be provided for the air traffic management system. The traffic management module may be configured to receive a request for a resource from a user. Examples of resources may include, but are not limited to, wireless resources (e.g., bandwidth, access to communication devices), location or space (e.g., location or space for a flight plan), time (e.g., time for a flight plan), access to a base station, access to a docking station, access to a battery station, access to a delivery point or a pick-up point, or any other type of resource. The traffic management module may be configured to plan a flight path for the unmanned aerial vehicle in response to the request. The flight path can utilize the assigned resources. The traffic management module may be configured to plan missions for the unmanned aerial vehicle, which may optionally include a flight route and the operation of any sensors or other devices on the unmanned aerial vehicle. The mission may utilize any assigned resources.
The traffic management module may be configured to adjust the mission based on conditions detected in the assigned airspace. For example, the traffic management module may adjust the predetermined flight path based on the detected condition. Adjusting the flight path may include adjusting the entire predetermined flight path, adjusting waypoints of a semi-defined flight path, or adjusting a destination of the flight path. The detected conditions may include a change in weather, available airspace, an accident, the establishment of a geo-fencing device, or a change in flight controls. The traffic management module may inform the user of adjustments to the mission, such as adjustments to the flight path.
The users 250a, 250b, 250c may be individuals associated with the unmanned aerial vehicles 260a, 260b, 260c, such as individuals operating the unmanned aerial vehicles. Examples of users and unmanned aerial vehicles are described elsewhere herein. A communication channel may be provided between the user and the corresponding unmanned aerial vehicle, which may be used to control the operation of the unmanned aerial vehicle (connection 3). Controlling operation of the unmanned aerial vehicle may include controlling flight of the unmanned aerial vehicle or any other portion of the unmanned aerial vehicle as described elsewhere herein.
A communication channel (connection 5) may be provided between the unmanned aerial vehicle and the air traffic system, so that the air traffic system may identify a condition, alert a user regarding the condition, and/or take over the unmanned aerial vehicle to remedy the condition. The communication channel may also facilitate identity authentication when the user and/or the UAV is undergoing an authentication process. Alternatively, a communication channel may be established between the air traffic system and the user's remote control, and some similar functionality may be provided. In a system including geo-fencing devices, a communication channel may likewise be provided for the geo-fencing devices for identification/authentication and/or for condition identification, warning, and/or takeover.
A communication channel (connection 6) may be provided between the user and the air traffic system, so that the air traffic system may identify conditions, alert the user regarding the conditions, and/or take over the unmanned aerial vehicle to remedy the conditions. The communication channel may also facilitate identity authentication when the user and/or the UAV is undergoing an authentication process.
Alternatively, connection 1 may be a logical channel. Connection 2 and connection 4 may be network connections. For example, connections 2 and 4 may be provided through a Local Area Network (LAN), a Wide Area Network (WAN), such as the internet, a telecommunications network, a data network, a cellular network, or any other type of network. Connection 2 and connection 4 may be provided through indirect communication (e.g., through a network). Alternatively, they may be provided through a direct communication channel. The connections 3, 5 and 6 may be network connections provided via a remote control or ground station, mobile access network connections, or any other type of connection. They may be provided via an indirect communication channel or a direct communication channel.
An authorized third party (such as an air traffic control system, a geo-fencing system, etc.) may identify a corresponding unmanned aerial vehicle through the certification center according to its unmanned aerial vehicle Identifier (ID) and obtain relevant information (such as the configuration of the unmanned aerial vehicle, its capability level, and its security level). The safety system may be capable of handling different types of unmanned aerial vehicles. Different types of unmanned aerial vehicles may have different physical characteristics (e.g., model, shape, size, engine power, range, battery life, sensors, performance, payload rating, or capabilities) or may be used to perform different tasks (e.g., surveillance, photography, communication, delivery). Different types of unmanned aerial vehicles may have different levels of security or priority. For example, different types of unmanned aerial vehicles may be authorized to perform different activities. For example, an unmanned aerial vehicle having a first authorization type may be authorized to enter an area that an unmanned aerial vehicle having a second authorization type may not be authorized to enter. The types of unmanned aerial vehicles may include different types of unmanned aerial vehicles created by the same manufacturer or designer, or by different manufacturers or designers.
An authorized third party, such as an air traffic management system, a geo-fencing system, etc., can identify the corresponding user through the authentication center according to a user Identifier (ID) and obtain related information. The security system may be capable of handling different types of users. Different types of users may have different skill levels, experience levels, associations with different types of unmanned aerial vehicles, authorization levels, or different demographic information. For example, users with different skill levels may be considered different types of users. The user may undergo certification or testing to verify the user's skill level. One or more other users may verify or certify the skill level of the user. For example, a mentor to a user may verify the skill level of the user. The user may alternatively self-identify the user's skill level. Users with different degrees of experience may be considered different types of users. For example, a user may log or certify a particular number of hours of operation of the unmanned aerial vehicle or a number of missions flown using the unmanned aerial vehicle. Other users may verify or certify the user's experience level. The user may self-identify the user's amount of experience. The user type may indicate a training level of the user. The skill level and/or experience of the user may be generic across types of UAVs. Alternatively, the skill level and/or experience of the user may be specific to the type of UAV. For example, a user may have a high skill level or a large amount of experience for a first type of unmanned aerial vehicle, while having a low skill level or less experience for a second type of unmanned aerial vehicle. Users of different types may include users with different authorization types. Different authorization types may mean that different sets of flight controls may be imposed for different users. In some cases, some users may have a higher security level than other users, which may mean that fewer flight restrictions or limitations may be placed on those users. In some cases, regular users may be distinguished from administrative users, who may be able to take over control from the regular users. Regular users may be distinguished from control entity users (e.g., members of government agencies, members of emergency services such as law enforcement). In some embodiments, the administrative user may be a control entity user or may be distinct from a control entity user. In another example, a parent may be able to take over flight control from the parent's child, or a mentor may be able to take over flight control from a student. The user type may indicate a category or class of users operating one or more types of unmanned aerial vehicles. Other user type information may be based on user demographics (e.g., address, age, etc.).
Similarly, any other device or party involved in the security system may be of its own type. For example, the geo-fence identifier may indicate a geo-fence device type, or the remote controller identifier may indicate a remote controller type.
Unmanned aerial vehicles operating within the security system may be assigned an unmanned aerial vehicle ID and a key. The ID and key may be assigned from an ID registration database. The ID and key may be globally unique and optionally may not be duplicated. A user operating an unmanned aerial vehicle within a security system may be assigned a user ID and a key. The ID and key may be assigned from an ID registration database. The ID and key may be globally unique and optionally may not be duplicated.
The unmanned aerial vehicle and the air traffic control system may have mutual authentication using an ID and a key, thereby allowing the unmanned aerial vehicle to be operated. In some cases, the authentication may include obtaining permission to fly in a restricted area. The user and the air traffic system may have mutual authentication using the ID and the key, thereby allowing the user to operate the unmanned aerial vehicle.
The key can be provided in various forms. In some embodiments, the key of the UAV may be non-separable from the UAV. The key may be designed to prevent the key from being stolen. The key may be implemented by a write-once and externally unreadable memory (e.g., cryptographic chip) or by a hardened Universal Subscriber Identity Module (USIM). In some cases, the user key or the remote control key may not be separable from the user's remote control. The key may be used by the authentication center to authenticate the UAV, the user, and/or any other device.
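The disclosure does not fix a particular cryptographic protocol for the mutual authentication using an ID and a key described above. One conventional way to realize it with a shared, non-extractable key is a nonce-based challenge-response exchange, sketched below for illustration using the Python standard library; the identifiers, message framing, and key handling are assumptions made for the example.

import hmac, hashlib, secrets

def tag(key: bytes, message: bytes) -> bytes:
    # Keyed message authentication code over the message.
    return hmac.new(key, message, hashlib.sha256).digest()

uav_id = b"UAV-ID1"                   # hypothetical identifier
shared_key = secrets.token_bytes(32)  # key held by the UAV and known to the authentication center

# The authentication center challenges the UAV with a fresh nonce.
center_nonce = secrets.token_bytes(16)
# The UAV proves possession of the key and issues its own challenge.
uav_nonce = secrets.token_bytes(16)
uav_proof = tag(shared_key, uav_id + center_nonce + uav_nonce)
# The center recomputes the expected proof and compares in constant time.
assert hmac.compare_digest(uav_proof, tag(shared_key, uav_id + center_nonce + uav_nonce))
# The center then answers the UAV's challenge, which the UAV verifies in turn,
# completing mutual authentication.
center_proof = tag(shared_key, b"auth-center" + uav_nonce)
assert hmac.compare_digest(center_proof, tag(shared_key, b"auth-center" + uav_nonce))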
An authentication system as provided herein may include: an identification registry database configured to store one or more unmanned aerial vehicle identifiers that uniquely identify unmanned aerial vehicles with respect to each other and one or more user identifiers that uniquely identify users with respect to each other; an authentication center configured to authenticate an identity of the unmanned aerial vehicle and an identity of the user; and an air management system configured to receive an unmanned aerial vehicle identifier of the certified unmanned aerial vehicle and a user identifier of the certified user, and provide a set of flight controls based on at least one of: a certified unmanned aerial vehicle identifier and a certified user identifier.
The authentication system may be implemented using any hardware configuration or arrangement known in the art or later developed. For example, one or more servers may be used to operate an ID registration database, an authentication center, and/or an air management system, individually or collectively. One or more subsystems of the air management system, such as a flight surveillance module, a flight control module, a traffic management module, a user access control module, an unmanned aerial vehicle access control module, or any other module, may be implemented individually or collectively using one or more servers. Any description of a server may be applicable to any other type of device. The device may be a computer (e.g., personal computer, laptop, server), a mobile device (e.g., smartphone, cellular phone, tablet, personal digital assistant), or any other type of device. The device may be a network device capable of communicating over a network. The apparatus includes one or more memory storage units, which may include a non-transitory computer-readable medium that may store code, logic, or instructions for performing one or more of the steps described elsewhere herein. The apparatus may include one or more processors that may perform one or more steps, individually or collectively, in accordance with code, logic, or instructions of a non-transitory computer readable medium as described herein.
The various components, such as the ID registration database, the authentication center, and/or the air management system, may be implemented at the same location on the hardware or may be implemented at different locations. The authentication system components may be implemented using the same device or multiple devices. In some cases, a cloud computing infrastructure may be implemented to provide an authentication system. Alternatively, the authentication system may utilize a peer-to-peer (P2P) relationship.
The components may be provided off-board the UAV, on-board the UAV, or some combination thereof. The components may be provided external to the remote control, on the remote control, or some combination thereof. In some preferred embodiments, the components may be provided off-board the UAV or off-board the remote control, and may communicate with the UAV (and/or other UAVs) and the remote control (and/or other remotes). The components may communicate directly or indirectly with the unmanned aerial vehicle. In some cases, the communication may be relayed via another apparatus. The other device may be a remote control or another unmanned aerial vehicle.
Flight control
The activities of the unmanned aerial vehicle may be governed according to a set of flight regulations. The set of flight regulations may include one or more flight regulations. Various types and examples of flight regulations are described herein.
Flight controls may govern the physical disposition of the unmanned aerial vehicle. For example, flight controls may govern the flight of the unmanned aerial vehicle, takeoff of the unmanned aerial vehicle, and/or landing of the unmanned aerial vehicle. The flight control may indicate a surface area over which the unmanned aerial vehicle may or may not fly, or a volume of space in which the unmanned aerial vehicle may or may not fly. Flight control may relate to the position of the unmanned aerial vehicle (e.g., where the unmanned aerial vehicle is in space or above an underlying surface) and/or the orientation of the unmanned aerial vehicle. In some examples, flight controls may prevent the unmanned aerial vehicle from flying within an assigned volume (e.g., airspace) and/or above an assigned region (e.g., of underlying land or water). The flight control may include one or more boundaries within which the UAV is not permitted to fly. In other examples, flight controls may only allow the unmanned aerial vehicle to fly within the allotted volume and/or over the allotted area. The flight control may include one or more boundaries within which the unmanned aerial vehicle is allowed to fly. Alternatively, flight control may prevent the UAV from flying above an upper altitude limit, which may be fixed or variable. In another case, flight controls may prevent the unmanned aerial vehicle from flying below a lower altitude limit, which may be fixed or variable. The unmanned aerial vehicle may be required to fly at an altitude between a lower altitude limit and an upper altitude limit. In another example, the unmanned aerial vehicle may not be permitted to fly within one or more ranges of altitudes. Flight regulations may also only allow unmanned aerial vehicle orientations within a certain range, or may not allow unmanned aerial vehicle orientations within a certain range. The range of unmanned aerial vehicle orientations may be about one, two, or three axes. The axes may be orthogonal axes, such as a yaw axis, a pitch axis, and/or a roll axis.
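Purely as an illustration of the boundary and altitude rules above (and not as the method of the disclosure), a containment check in a local planar frame could look like the following sketch; a cylindrical permitted volume is assumed only to keep the geometry simple.

import math

def within_permitted_airspace(position, center_xy, radius_m, floor_m, ceiling_m):
    """position: (x_m, y_m, altitude_m); center_xy: (x_m, y_m) of the permitted cylinder."""
    x, y, altitude = position
    lateral_ok = math.hypot(x - center_xy[0], y - center_xy[1]) <= radius_m
    vertical_ok = floor_m <= altitude <= ceiling_m
    return lateral_ok and vertical_ok

# Example: a UAV 300 m from the reference point at 80 m altitude,
# checked against a 1 km radius and a 20 m to 120 m altitude band.
print(within_permitted_airspace((300.0, 0.0, 80.0), (0.0, 0.0), 1000.0, 20.0, 120.0))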
Flight controls may govern the movement of the unmanned aerial vehicle. For example, the flight control may govern translational velocity of the unmanned aerial vehicle, translational acceleration of the unmanned aerial vehicle, angular velocity of the unmanned aerial vehicle (e.g., angular velocity about one, two, or three axes), or angular acceleration of the unmanned aerial vehicle (e.g., angular acceleration about one, two, or three axes). The flight control may set a maximum limit for the unmanned aerial vehicle translational velocity, translational acceleration, angular velocity, or angular acceleration. Thus, the set of flight controls may include limits on the airspeed and/or acceleration of the unmanned aerial vehicle. The flight control may set a minimum threshold for the unmanned aerial vehicle translational velocity, translational acceleration, angular velocity, or angular acceleration. Flight control may require that the unmanned aerial vehicle move between a minimum threshold and a maximum limit. Alternatively, flight controls may prevent the unmanned aerial vehicle from moving within one or more translational velocity ranges, translational acceleration ranges, angular velocity ranges, or angular acceleration ranges. In one example, the unmanned aerial vehicle may not be allowed to hover within the assigned airspace; the UAV may be required to maintain a translational velocity above 0 mph. In another example, the UAV may not be allowed to exceed a maximum speed limit (e.g., 40 mph). Movement of the unmanned aerial vehicle may be governed with respect to the assigned volume and/or over the assigned region.
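A speed limit of this kind might, for example, be enforced by clamping the commanded speed to the regulated band; the following is a minimal sketch, with the 40 mph ceiling from the example above converted to roughly 17.9 m/s (the function name and defaults are assumptions).

def clamp_speed(commanded_mps: float, min_mps: float = 0.0, max_mps: float = 17.9) -> float:
    """Limit a commanded translational speed to the regulated band [min_mps, max_mps].

    17.9 m/s corresponds roughly to the 40 mph example given above.
    """
    return max(min_mps, min(commanded_mps, max_mps))

# Example: a 25 m/s command is reduced to the 17.9 m/s ceiling.
assert clamp_speed(25.0) == 17.9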
Flight control may govern the takeoff and/or landing processes of an unmanned aerial vehicle. For example, an unmanned aerial vehicle may be allowed to fly in the assigned region but not to land in the region. In another example, the unmanned aerial vehicle may only be able to take off from the assigned region in a certain manner or at a certain speed. In another example, manual takeoff or landing may not be permitted within the assigned region, and an autonomous landing or takeoff procedure must be used. Flight controls may govern whether takeoff is permitted, whether landing is permitted, and any rules (e.g., speed, acceleration, direction, orientation, flight pattern) that must be followed during takeoff or landing. In some embodiments, only an automated sequence for takeoff and/or landing is allowed and manual landing or takeoff is not allowed, or vice versa. The takeoff and/or landing process of the unmanned aerial vehicle may be governed with respect to the assigned volume and/or over the assigned region.
In some cases, flight control may govern the operation of the payload of the unmanned aerial vehicle. The payload of the unmanned aerial vehicle may be a sensor, an emitter, or any other object that may be carried by the unmanned aerial vehicle. The payload may be powered on or off. The payload may be placed in an operational state (e.g., powered on) or a non-operational state (e.g., powered off). Flight controls may include conditions under which the UAV is not allowed to operate the payload. For example, in the assigned airspace, flight control may require the payload to be powered off. The payload may emit a signal, and flight controls may dictate the nature of the signal, the amplitude of the signal, the range of the signal, the direction of the signal, or any mode of operation. For example, if the payload is a light source, flight regulations may require that the light emitted within the assigned airspace be no brighter than a threshold light intensity. In another example, if the payload is a speaker for emitting sound, flight regulations may require that the speaker not emit any sound outside of the assigned airspace. The payload may be a sensor that collects information, and flight regulations may govern the mode in which information is collected, the manner in which information is pre-processed or processed, at what resolution information is collected, at what frequency or sampling rate information is collected, from what range information is collected, or from what direction information is collected. For example, the payload may be an image capture device. The image capture device may be capable of capturing still images (e.g., photographs) or dynamic images (e.g., video). Flight regulations may govern the zoom of the image capture device, the resolution of the images captured by the image capture device, the sampling rate of the image capture device, the shutter speed of the image capture device, the aperture of the image capture device, whether a flash is used, the mode of the image capture device (e.g., lighting mode, color mode, still or video mode), or the focus of the image capture device. In one example, the camera may not be allowed to capture images over the assigned region. In another example, the camera may be allowed to capture images but not sound over the assigned region. In another example, the camera may be allowed to capture only high-resolution photographs within the assigned region and only low-resolution photographs outside the assigned region. In another example, the payload may be an audio capture device. Flight regulations may govern whether the audio capture device is allowed to be powered on, the sensitivity of the audio capture device, the decibel range that the audio capture device is capable of picking up, the directionality of the audio capture device (e.g., the directionality of a parabolic microphone), or any other quality of the audio capture device. In one example, the audio capture device may or may not be allowed to capture sound within the assigned region. In another example, the audio capture device may only be allowed to capture sound within a particular frequency range within the assigned region. The operation of the payload may be governed with respect to the assigned volume and/or over the assigned region.
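As a minimal, non-limiting illustration, a zone-dependent payload rule of this kind might be modeled as follows; the class name PayloadPolicy and its fields (camera_allowed, audio_allowed, max_resolution) are hypothetical and chosen only for the sketch.

from dataclasses import dataclass
from typing import Dict, Tuple

@dataclass
class PayloadPolicy:
    camera_allowed: bool = True
    audio_allowed: bool = True
    max_resolution: Tuple[int, int] = (4000, 3000)      # width x height, in pixels

def apply_payload_policy(policy: PayloadPolicy, requested_resolution: Tuple[int, int]) -> Dict[str, object]:
    """Derive payload settings that comply with the policy of the current zone."""
    if not policy.camera_allowed:
        return {"camera_power": "off", "audio_power": "on" if policy.audio_allowed else "off"}
    width = min(requested_resolution[0], policy.max_resolution[0])
    height = min(requested_resolution[1], policy.max_resolution[1])
    return {
        "camera_power": "on",
        "resolution": (width, height),                  # capped to the permitted resolution
        "audio_power": "on" if policy.audio_allowed else "off",
    }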
Flight control may dictate whether the payload may transmit or store information. For example, if the payload is an image capture device, flight regulations may dictate whether images (static or dynamic) may be recorded. Flight controls may govern whether the images may be recorded in memory on board the image capture device or in memory on the UAV. For example, an image capture device may be allowed to power on and show a captured image on a local display, but may not be allowed to record any image. Flight controls may govern whether images may be streamed from the image capture device or transmitted off board the unmanned aerial vehicle. For example, flight regulations may specify that when an unmanned aerial vehicle is located within an assigned airspace, an image capture device on the unmanned aerial vehicle may be allowed to stream video down to a terminal outside of the unmanned aerial vehicle, while when the unmanned aerial vehicle is located outside of the assigned airspace, the image capture device may not be permitted to stream video. Similarly, if the payload is an audio capture device, flight regulations may govern whether sound may be recorded in memory on board the audio capture device or in memory on the UAV. For example, an audio capture device may be allowed to power on and play back captured sound on a local speaker, but may not be allowed to record any sound. Flight regulations may likewise dictate whether sound may be streamed from the audio capture device, or whether data may be streamed from any other payload. Storage and/or transmission of collected data may be governed with respect to the assigned volume and/or over the assigned region.
In some cases, the payload may be an item carried by the unmanned aerial vehicle, and the flight control may specify a characteristic of the payload. Examples of payload characteristics may include a size of the payload (e.g., height, width, length, diameter, diagonal), a weight of the payload, a stability of the payload, a material of the payload, a fragility of the payload, or a type of the payload. For example, flight regulations may specify that an unmanned aerial vehicle may carry no more than 3 lbs of packages while flying over an allotted region. In another example, flight control may allow an unmanned aerial vehicle to carry only packages having a size greater than 1 foot within the allotted volume. Another flight control may allow an unmanned aerial vehicle carrying a package of 1 lb or more to fly within the allotted volume for only 5 minutes, and may cause the unmanned aerial vehicle to land automatically if it has not left the allotted volume within 5 minutes. A limitation on the type of payload itself may be provided. For example, an unstable or potentially explosive payload may not be carried by an unmanned aerial vehicle. Flight restrictions may prevent fragile objects from being carried by the unmanned aerial vehicle. Characteristics of the payload may be regulated with respect to the assigned volume and/or over the assigned region.
Flight control may also specify activities that may be performed with respect to items carried by the unmanned aerial vehicle. For example, flight controls may indicate whether items may be air-dropped within the assigned region. Similarly, flight controls may indicate whether items may be picked up from the assigned region. The unmanned aerial vehicle may have robotic arms or other mechanical structures that may facilitate aerial delivery or pickup of items. The unmanned aerial vehicle may have a load-bearing compartment that may allow the unmanned aerial vehicle to carry items. Activity involving the payload may be regulated with respect to the assigned volume and/or the assigned region.
The positioning of the payload relative to the UAV may be governed by flight controls. The position of the payload relative to the UAV may be adjustable. The translational position of the payload relative to the UAV and/or the orientation of the payload relative to the UAV may be adjustable. The translational position may be adjustable about one, two, or three orthogonal axes. The orientation of the payload may be adjustable about one, two, or three orthogonal axes (e.g., a pitch axis, a yaw axis, or a roll axis). In some embodiments, the payload may be coupled to the unmanned aerial vehicle by a carrier that may control the positioning of the payload relative to the unmanned aerial vehicle. The carrier may support the weight of the payload on the unmanned aerial vehicle. The carrier may optionally be a gimbal (pan-tilt) that may allow the payload to rotate about one, two, or three axes relative to the UAV. One or more frame members and one or more actuators may be provided that may enable adjustment of the payload positioning. Flight controls may govern the carrier or any other mechanism that adjusts the position of the payload relative to the UAV. In one example, flight regulations may not allow the payload to face downward when the unmanned aerial vehicle is flying above an assigned region. For example, the region may contain sensitive sites whose capture by the payload is not desired. In another example, flight controls may require the payload to be translated downward relative to the UAV when located within the assigned airspace, which may allow for a wider field of view, such as for panoramic image capture. The positioning of the payload may be governed with respect to the assigned volume and/or over the assigned region.
Flight control may govern the operation of one or more sensors of the unmanned aerial vehicle. For example, flight controls may govern whether sensors are turned on or off (or which sensors are turned on or off), the mode in which information is collected, the manner in which information is pre-processed or processed, at what resolution information is collected, at what frequency or sampling rate information is collected, from what range information is collected, or from what direction information is collected. Flight control may dictate whether the sensors may store or transmit information. In one example, when the unmanned aerial vehicle is within the allotted volume, the GPS sensor may be turned off while a visual sensor or inertial sensor is turned on for navigation purposes. In another example, the audio sensor of the unmanned aerial vehicle may be turned off when the unmanned aerial vehicle flies above the assigned region. Operation of one or more sensors may be governed with respect to the allotted volume and/or over the allotted region.
Communications of the unmanned aerial vehicle may be controlled according to one or more sets of flight controls. For example, an unmanned aerial vehicle may be capable of remote communication with one or more remote devices. Examples of remote devices may include a remote control that may control operation of the unmanned aerial vehicle or of a payload, carrier, sensor, or any other component of the unmanned aerial vehicle; a display terminal that may show information received from the unmanned aerial vehicle; a database that may collect information from the unmanned aerial vehicle; or any other external device. The remote communication may be wireless communication. The communication may be a direct communication between the UAV and the remote device. Examples of direct communication may include WiFi, WiMax, radio frequency, infrared, visual, or other types of direct communication. The communication may be an indirect communication between the UAV and a remote device, which may include one or more intermediary devices or networks. Examples of indirect communication may include 3G, 4G, LTE, satellite, or other types of communication. Flight control may indicate whether remote communication is to be turned on or off. Flight controls may include conditions under which the UAV is not allowed to communicate wirelessly. For example, communication may not be allowed when the unmanned aerial vehicle is within the allotted airspace volume. Flight regulations may specify communication modes that may or may not be allowed. For example, flight regulations may specify whether a direct communication mode is allowed, whether an indirect communication mode is allowed, or whether a preference is established between a direct communication mode and an indirect communication mode. In one example, only direct communication is allowed within the assigned volume. In another example, over the assigned region, a preference for direct communication may be established as long as direct communication is available, otherwise indirect communication may be used, and no communication is allowed outside the assigned region. Flight regulations may specify characteristics of the communication, such as the bandwidth used, the frequency used, the protocol used, the encryption used, or the devices that may be used to assist the communication. For example, flight controls may only allow communication using existing networks when the unmanned aerial vehicle is located within a predetermined volume. Communications of the unmanned aerial vehicle may be governed with respect to the assigned volume and/or over the assigned region.
Other functions of the UAV may be governed by flight controls, such as navigation as well as power usage and monitoring. Examples of power usage and monitoring may include the remaining flight time estimated from the battery state of charge and power usage information, the state of charge of the battery, or the remaining flight distance estimated from the battery state of charge and power usage information. For example, flight regulations may require that an unmanned aerial vehicle operating within the allotted volume have a remaining battery life of at least 3 hours. In another example, flight regulations may require that an unmanned aerial vehicle have a state of charge of at least 50% when outside of an assigned region. Such additional functionality may be governed by flight controls with respect to the assigned volume and/or over the assigned region.
The assigned volume and/or the assigned region may be static for a set of flight controls. For example, the boundaries of the assigned volume and/or the assigned region may remain the same for the set of flight controls at all times. Alternatively, the boundaries may change over time. For example, the assigned region may be a school, and the boundary of the assigned region may contain the school during class time. After school, the boundary may be narrowed or the assigned region may be removed. During the after-school period, an assigned region may be created at a nearby park where children attend after-school activities. The rules regarding the assigned volumes and/or assigned regions may remain the same at all times for the set of flight controls or may change over time. The change may be indicated by a time of day, a day of the week, a week of the month, a quarter, a season, a year, or any other time-related factor. Information from a clock (which may provide time of day, date, or other time-related information) may be used to implement the change in boundaries or rules. A set of flight controls may have a dynamic component that is responsive to factors other than time. Examples of other factors may include climate, temperature, detected light levels, detected presence of individuals or machines, environmental complexity, physical traffic flow (e.g., ground traffic flow, pedestrian traffic flow, aircraft traffic flow), wireless or network traffic flow, detected noise levels, detected movement, detected heat signatures, or any other factor.
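A time-dependent assigned region such as the school example above might be sketched as follows; the schedule, coordinates, and radii are invented purely for illustration, and the function name active_assigned_region is hypothetical.

import datetime
from typing import Optional

def active_assigned_region(now: datetime.datetime) -> Optional[dict]:
    """Return the currently active assigned region, or None when no region applies."""
    school = {"name": "school", "center": (22.540, 113.950), "radius_m": 500}
    park = {"name": "park", "center": (22.550, 113.960), "radius_m": 300}
    if now.weekday() < 5:                 # Monday through Friday
        if 8 <= now.hour < 15:
            return school                 # class time: the region covers the school
        if 15 <= now.hour < 18:
            return park                   # after-school activities at the nearby park
    return None                           # boundary removed at other times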
The assigned volume and/or the assigned region may or may not be associated with a geo-fencing device. The geofencing device may be a reference point for the assigned volume and/or the assigned region. As described elsewhere herein, the assigned volume and/or the location of the assigned region may be provided based on the location of the geo-fencing device. Alternatively, the assigned volume and/or region may be provided without the presence of a geo-fencing device. For example, without requiring an actual geo-fencing device to be at an airport, known coordinates of the airport may be provided and used as a reference for the assigned volume and/or the assigned region. Any combination of assigned volumes and/or regions may be provided, some of which may be dependent on the geo-fencing device and some of which may be independent of the device.
Flight control may cause any type of flight response measure for the UAV. For example, the unmanned aerial vehicle may change course. The UAV may automatically enter an autonomous or semi-autonomous flight control mode from the manual mode, or may not respond to some user inputs. The UAV may allow another user to take over control of the UAV. The unmanned aerial vehicle may automatically land or take off. The UAV may send a warning to the user. The unmanned aerial vehicle may automatically decelerate or accelerate. The unmanned aerial vehicle may adjust the operation (which may include suspending operation or changing operating parameters) of the payload, the carrier, the sensors, the communication unit, the navigation unit, or the power unit. The flight response measure may take effect immediately or may take effect after a period of time (e.g., 1 minute, 3 minutes, 5 minutes, 10 minutes, 15 minutes, 30 minutes). The time period may be a grace period for the user to react and exercise some control of the unmanned aerial vehicle before the flight response measure begins to take effect. For example, if the unmanned aerial vehicle approaches a flight-restricted zone, the user may be alerted and given the opportunity to change the course of the unmanned aerial vehicle to leave the flight-restricted zone. If the user does not respond within the grace period, the unmanned aerial vehicle may automatically land within the flight-restricted zone. The UAV may normally operate in accordance with one or more flight commands from a remote control operated by a remote user. When the set of flight controls and the one or more flight commands conflict, the flight response measure may override the one or more flight commands. For example, if a user commands the unmanned aerial vehicle to enter a no-fly zone, the unmanned aerial vehicle may automatically change course to avoid the no-fly zone.
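A grace-period flight response measure of the kind described above might be sketched as follows; the callbacks (in_restricted_zone, warn_user, auto_land) and the 60-second default grace period are assumptions made only for the example.

import time

def enforce_with_grace(in_restricted_zone, warn_user, auto_land,
                       grace_s: float = 60.0, poll_s: float = 1.0) -> str:
    """Warn the user, wait up to grace_s seconds for compliance, then auto-land."""
    if not in_restricted_zone():
        return "no_action"
    warn_user("Approaching a flight-restricted zone; please change course")
    deadline = time.monotonic() + grace_s
    while time.monotonic() < deadline:
        if not in_restricted_zone():
            return "user_complied"        # the user steered the UAV out of the zone
        time.sleep(poll_s)
    auto_land()                           # the flight response measure overrides manual control
    return "auto_landed"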
The set of flight controls may include information regarding one or more of: (1) an assigned volume and/or region over which a set of flight controls may be imposed, (2) one or more rules (e.g., operation of an unmanned aerial vehicle, a payload, a carrier, a sensor, a communication module, a navigation unit, a power unit), (3) one or more flight response measures that cause the unmanned aerial vehicle to comply with the rules (e.g., response of the unmanned aerial vehicle, the payload, the carrier, the sensor, the communication module, the navigation unit, the power unit), or (4) time or any other factor that may affect the assigned volume and/or region, the rules or the flight response measures. A set of flight controls may include a single flight control, which may include information regarding (1), (2), (3), and/or (4). The set of flight controls may include a plurality of flight controls, which may each include information regarding (1), (2), (3), and/or (4). Any type of flight control may be incorporated, and any combination of flight response measures may occur as a function of the flight control. One or more assigned volumes and/or zones may be provided for a set of flight controls. For example, a set of flight controls may be provided for the unmanned aerial vehicle that do not allow the unmanned aerial vehicle to fly within the assigned first volume, that allow the unmanned aerial vehicle to fly within the assigned second volume below the upper altitude limit but do not allow operation of a camera on the unmanned aerial vehicle, and that only allow the unmanned aerial vehicle to record audio data within the assigned third volume. The unmanned aerial vehicle may have flight response measures that may enable the unmanned aerial vehicle to comply with flight regulations. The manual operation of the unmanned aerial vehicle may be overridden to cause the unmanned aerial vehicle to comply with rules of flight regulations. One or more flight response actions may occur automatically to override the manual input by the user.
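By way of non-limiting illustration, elements (1) through (4) above might be carried in a simple data structure such as the following sketch; the class and field names are hypothetical and the rule strings are placeholders.

from dataclasses import dataclass, field
from typing import Callable, List, Optional

@dataclass
class FlightRegulation:
    region: object                                               # element (1): assigned volume and/or region
    rules: List[str] = field(default_factory=list)               # element (2): e.g. "no_camera", "max_alt_120m"
    response_measures: List[str] = field(default_factory=list)   # element (3): e.g. "warn_user", "auto_land"
    active_when: Optional[Callable[[], bool]] = None             # element (4): time or other dynamic factor

@dataclass
class FlightRegulationSet:
    regulations: List[FlightRegulation] = field(default_factory=list)

    def active(self) -> List[FlightRegulation]:
        """Regulations whose dynamic condition (if any) currently holds."""
        return [r for r in self.regulations if r.active_when is None or r.active_when()]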
A set of flight controls may be generated for the unmanned aerial vehicle. The generation of the set of flight controls may include creating the flight controls from scratch. The generation of the set of flight controls may include selecting a set of flight controls from among a plurality of available sets of flight controls. The generation of the set of flight controls may include combining characteristics of one or more sets of flight controls. For example, generation of a set of flight controls may include determining elements, such as determining an assigned volume and/or region, determining one or more rules, determining one or more flight response measures, and/or determining any factors that may make any of the elements a dynamic element. These elements may be generated from scratch or may be selected from one or more pre-existing element options. In some cases, flight controls may be manually selected by a user. Alternatively, flight controls may be automatically selected, without user intervention, by way of one or more processors. In some cases, some user input may be provided, but the one or more processors may make the final determination of the flight controls, taking the user input into account.
FIG. 3 illustrates an example of one or more factors that may be taken into account in generating a set of flight controls. For example, user information 310, unmanned aerial vehicle information 320, and/or geo-fencing device information 330 may be used in generating a set of flight controls 340. In some cases, only user information, only unmanned aerial vehicle information, only geo-fencing device information, only remote control information, or any number or combination of these factors may be considered in generating the set of flight controls.
Additional factors may be considered in generating the set of flight controls. These factors may include information about the local environment (e.g., environmental complexity, urban and rural areas, traffic flow information, climate information), information from one or more third party sources (e.g., government sources such as FAA), time-related information, user-entered preferences, or any other factor.
In some embodiments, a set of flight controls for a particular area (e.g., assigned volume, assigned region) may be the same regardless of user information, unmanned aerial vehicle information, geo-fencing device information, or any other information. For example, all users may receive the same set of flight controls. In another case, all unmanned aerial vehicles may receive the same set of flight controls.
Alternatively, a set of flight controls for a particular area (e.g., assigned volume, assigned region) may differ based on user information, unmanned aerial vehicle information, and/or geo-fencing device information. The user information may include information specific to an individual user (e.g., user flight history, records of previous user flights) and/or may include a user type as described elsewhere herein (e.g., user skill category, user experience category). The UAV information may include information specific to a single UAV (e.g., UAV flight history, maintenance or accident records, unique serial number) and/or may include a type of UAV (e.g., UAV model number, characteristics) as described elsewhere herein.
A set of flight controls may be generated based on a user identifier indicating a user type. A system for controlling an Unmanned Aerial Vehicle (UAV) may be provided. The system may include: a first communication module; one or more processors operatively coupled to the first communication module and individually or collectively configured to: receiving, using the first communication module or a second communication module, a user identifier indicating a user type; generating a set of flight controls for the UAV based on the user identifier; and transmitting the set of flight restrictions to the UAV using the first communication module or the second communication module.
A method for controlling an Unmanned Aerial Vehicle (UAV) may comprise: receiving a user identifier indicating a user type; generating, by way of one or more processors, a set of flight controls for the UAV based on the user identifier; and transmitting, by means of a communication module, the set of flight controls to the UAV. Similarly, a non-transitory computer readable medium containing program instructions for controlling an Unmanned Aerial Vehicle (UAV) may be provided, the computer readable medium comprising: program instructions for receiving a user identifier indicating a user type; program instructions for generating a set of flight controls for the UAV based on the user identifier; and program instructions for generating a signal to transmit the set of flight controls to the UAV via a communication module.
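A minimal, assumed sketch of this flow is given below; the lookup tables mapping user identifiers to user types and user types to pre-built regulation sets, and the send_to_uav callable standing in for the communication module, are hypothetical and not part of the disclosure.

from typing import Callable, Dict

USER_TYPE_BY_IDENTIFIER: Dict[str, str] = {"user-001": "beginner", "user-002": "certified"}

REGULATION_SETS: Dict[str, dict] = {
    "beginner": {"max_altitude_m": 60, "camera_allowed": False},
    "certified": {"max_altitude_m": 120, "camera_allowed": True},
}

def generate_and_transmit(user_identifier: str, send_to_uav: Callable[[dict], None]) -> dict:
    """Select a regulation set based on the user type and transmit it to the UAV."""
    user_type = USER_TYPE_BY_IDENTIFIER.get(user_identifier, "beginner")  # unknown users get the most restrictive set
    regulation_set = REGULATION_SETS[user_type]
    send_to_uav(regulation_set)           # e.g. via the first or second communication module
    return regulation_set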
An unmanned aerial vehicle may include: one or more propulsion units that enable flight of the UAV; a communication module configured to receive one or more flight commands from a remote user; and a flight control unit configured to generate flight control signals for delivery to the one or more propulsion units, wherein the flight control signals are generated according to a set of flight regulations for the UAV, wherein the flight regulations are generated based on a user identifier indicating a user type of the remote user.
The user type may have any of the characteristics as described elsewhere herein. For example, the user type may indicate a level of experience of a user operating the unmanned aerial vehicle, a level of training or certification of a user operating the unmanned aerial vehicle, or a class of users operating one or more types of unmanned aerial vehicles. The user identifier may uniquely identify the user from among other users. The user identifier may be received from a remote control remote from the UAV.
The set of flight controls may be generated by selecting the set of flight controls from among a plurality of sets of flight controls based on the user identifier. The set of flight controls may be generated by an air traffic system off-board the UAV. The unmanned aerial vehicle may communicate with the air traffic system via a direct communication channel. The unmanned aerial vehicle may communicate with the air traffic system by relaying through a user or a remote control operated by the user. The unmanned aerial vehicle may communicate with the air traffic system by relaying through one or more other unmanned aerial vehicles.
A set of flight controls may be generated based on an unmanned aerial vehicle identifier that indicates a type of unmanned aerial vehicle. A system for controlling an Unmanned Aerial Vehicle (UAV) may be provided. The system may include: a first communication module; one or more processors operatively coupled to the first communication module and individually or collectively configured to: receiving, using the first or second communication module, an unmanned aerial vehicle identifier indicating a type of unmanned aerial vehicle; generating a set of flight controls for the UAV based on the UAV identifier; and transmitting the set of flight controls to the UAV using the first or second communication module.
In some embodiments, a method for controlling an Unmanned Aerial Vehicle (UAV) may comprise: receiving an unmanned aerial vehicle identifier indicating a type of unmanned aerial vehicle; generating, by way of one or more processors, a set of flight controls for the UAV based on the UAV identifier; and transmitting, by means of a communication module, the set of flight controls to the UAV. Similarly, a non-transitory computer readable medium containing program instructions for controlling an Unmanned Aerial Vehicle (UAV) may comprise: program instructions for receiving an unmanned aerial vehicle identifier indicating a type of unmanned aerial vehicle; program instructions for generating a set of flight controls for the UAV based on the UAV identifier; and program instructions for generating a signal to transmit the set of flight controls to the UAV via a communication module.
An Unmanned Aerial Vehicle (UAV) may be provided, comprising: one or more propulsion units that enable flight of the unmanned aerial vehicle; a communication module configured to receive one or more flight commands from a remote user; and a flight control unit configured to generate flight control signals for delivery to the one or more propulsion units, wherein the flight control signals are generated according to a set of flight regulations for the UAV, wherein the flight regulations are generated based on a UAV identifier indicating a type of the UAV.
The unmanned aerial vehicle type may have any of the characteristics described elsewhere herein. For example, the type of UAV may indicate a model of the UAV, a performance characteristic of the UAV, or a payload of the UAV. The UAV identifier may uniquely identify the UAV from other UAVs. The UAV identifier may be received from a remote control remote from the UAV.
A set of flight controls may be generated taking into account one or more additional factors, such as those described elsewhere herein. For example, environmental conditions may be considered. More restrictions may be provided if the environmental complexity is high, and fewer restrictions may be provided if the environmental complexity is low. More restrictions may be provided if the population density is high, and fewer restrictions may be provided if the population density is low. More restrictions may be provided if there is a higher degree of traffic flow (e.g., air traffic flow or surface-based traffic flow), and fewer restrictions may be provided if there is a lower degree of traffic flow. In some embodiments, more restrictions may be provided if the ambient climate has extreme temperatures, is windy, or includes a possibility of precipitation or lightning than if the ambient climate has a more moderate temperature, less wind, no precipitation, and little or no possibility of lightning.
The set of flight controls may be generated by selecting the set of flight controls from among a plurality of sets of flight controls based on the UAV identifier. The set of flight controls may be generated by an air traffic system off-board the UAV. The unmanned aerial vehicle may communicate with the air traffic system via a direct communication channel. The unmanned aerial vehicle may communicate with the air traffic system by relaying through a user or a remote control operated by the user. The unmanned aerial vehicle may communicate with the air traffic system by relaying through one or more other unmanned aerial vehicles.
As previously described, various types of flight controls may be provided in a set of flight controls. Flight control may or may not be specific to the unmanned aerial vehicle or the user.
FIG. 7 illustrates a diagram of a scenario involving multiple types of flight controls. Various regions may be provided. Boundaries may be provided to define the region. A set of flight controls may affect one or more regions (e.g., airspace or airspace volume above a two-dimensional surface region). The set of flight controls may include one or more rules associated with one or more zones.
In one example, a flight regulation zone 710, a communication regulation zone 720, and a payload regulation zone 730 may be provided. A payload and communication regulation zone 750 may be provided, as well as a non-regulated zone 760. The zones may have boundaries of any shape or size. For example, a zone may have a regular shape, such as a circle, ellipse, oval, square, rectangle, any other type of quadrilateral, triangle, pentagon, hexagon, octagon, bar, curve, and the like. The zones may have an irregular shape, which may include convex or concave components.
The flight regulation zone 710 may enforce one or more rules regarding the positioning or movement of the UAV. The flight regulation zone may impose flight response measures that may affect the flight of the unmanned aerial vehicle. For example, when located within the flight regulation zone, an unmanned aerial vehicle may only be able to fly at an altitude between a lower altitude limit and an upper altitude limit, while no such flight restriction is imposed outside of the flight regulation zone.
The payload regulation zone 720 may enforce one or more rules regarding the operation or positioning of the payload of the UAV. The payload regulation zone may impose flight response measures that may affect the payload of the UAV. For example, when located within the payload regulation zone, the unmanned aerial vehicle may not be able to capture images using an image capture device payload, while no such payload restriction is imposed outside of the payload regulation zone.
The communication regulation zone 730 may impose one or more rules regarding the operation of the communication unit of the unmanned aerial vehicle. The communication regulation zone may impose flight response measures that affect the operation of the communication unit of the unmanned aerial vehicle. For example, when located within the communication regulation zone, the unmanned aerial vehicle may not be able to transmit captured data but may be allowed to receive flight control signals, while no such communication restriction is imposed outside of the communication regulation zone.
The payload and communication regulation zone 750 may enforce one or more rules regarding the operation or positioning of the payload of the unmanned aerial vehicle and the operation of the communication unit of the unmanned aerial vehicle. For example, when located within the payload and communication regulation zone, an unmanned aerial vehicle may not be able to store images captured by an image capture device payload on board the unmanned aerial vehicle, and may also not be able to stream or transmit images off board the unmanned aerial vehicle, while no such restrictions are imposed outside of the payload and communication regulation zone.
One or more non-regulated zones may be provided. The non-regulated zone may be outside of one or more boundaries, or may be within one or more boundaries. When located within the non-regulated zone, the user may retain control of the unmanned aerial vehicle, and no flight response measures are automatically initiated. The user may be able to freely operate the UAV within the physical limits of the UAV.
One or more of the zones may overlap. For example, the flight regulation zone may have an overlap 715 with the communication regulation zone. As another example, the communication regulation zone may have an overlap 725 with the payload regulation zone. As another example, the flight regulation zone may have an overlap 735 with the payload regulation zone. In some cases, the flight regulation zone, the communication regulation zone, and the payload regulation zone may all share an overlap 740.
When multiple zones overlap, rules from multiple zones may remain in place. For example, both flight restrictions and communication restrictions may remain in place in the overlap region. In some cases, rules from multiple zones may remain in place as long as they do not conflict with each other.
If there are conflicts between the rules, various rule responses may be applied. For example, the most restrictive set of rules may be applied. For example, if a first zone requires that the unmanned aerial vehicle fly below an altitude of 400 feet, and a second zone requires that the unmanned aerial vehicle fly below an altitude of 200 feet, in the overlap zone, rules regarding flying below an altitude of 200 feet may be applied. This may include mixing and matching a set of rules to form the most restrictive set. For example, if a first zone requires that the unmanned aerial vehicle fly above 100 feet and below 400 feet, and a second zone requires that the unmanned aerial vehicle fly above 50 feet and below 200 feet, then when the unmanned aerial vehicle is located in the overlap zone, it may use the lower flight limit from the first zone and the upper flight limit from the second zone to fly between 100 feet and 200 feet.
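The "most restrictive" combination described above might be computed as follows; this minimal sketch reproduces the 100-200 foot example from the text, and the function name most_restrictive_band is hypothetical.

from typing import Iterable, Tuple

def most_restrictive_band(bands: Iterable[Tuple[float, float]]) -> Tuple[float, float]:
    """Combine overlapping (lower_ft, upper_ft) altitude bands into the most restrictive band."""
    bands = list(bands)
    lower = max(b[0] for b in bands)      # the highest lower limit wins
    upper = min(b[1] for b in bands)      # the lowest upper limit wins
    if lower > upper:
        raise ValueError("conflicting bands leave no permissible altitude")
    return lower, upper

# The example from the text: (100, 400) and (50, 200) combine to (100, 200).
assert most_restrictive_band([(100, 400), (50, 200)]) == (100, 200)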
In another case, a hierarchy level may be provided for the zones. Rules from a higher-level zone may take precedence regardless of whether they are more or less restrictive than rules from a lower-level zone. The level may be specified according to the type of regulation. For example, flight regulations governing the location of the unmanned aerial vehicle may be of a higher level than communication regulations, which may be of a higher level than payload regulations. In other cases, a rule that an unmanned aerial vehicle is not allowed to fly within a particular zone may override other regulations for that zone. The level may be preselected or pre-entered. In some cases, a user providing a set of rules for the zones may indicate which zones are hierarchically higher than other zones. For example, a first zone may require that the unmanned aerial vehicle fly below 400 feet and that the payload be shut down. A second zone may require the unmanned aerial vehicle to fly below 200 feet and have no payload limitations. If the level of the first zone is higher, the rules from the first zone may be applied without applying any rule from the second zone. For example, the unmanned aerial vehicle may fly below 400 feet and have the payload shut down. If the level of the second zone is higher, the rules from the second zone may be applied without applying any rule from the first zone. For example, the unmanned aerial vehicle may fly below 200 feet and not have any payload restrictions.
As previously described, a set of flight controls may impose different types of rules on an unmanned aerial vehicle when the unmanned aerial vehicle is located in a zone. This may include restricting payload usage based on the position of the UAV or restricting wireless communication based on the position of the UAV.
Aspects of the invention may relate to an unmanned aerial vehicle payload control system comprising: a first communication module; and one or more processors operatively coupled to the first communication module and individually or collectively configured to: receiving, using the first or second communication module, a signal indicative of a location-dependent payload usage parameter; and generating one or more UAV operation signals that effect payload operation in compliance with the payload usage parameters.
A method for constraining payload usage of an unmanned aerial vehicle, the method comprising: receiving a signal indicative of a location dependent payload usage parameter; and generating, by means of one or more processors, one or more UAV operation signals that effect payload operation in compliance with the payload usage parameters. Similarly, a non-transitory computer-readable medium containing program instructions for constraining payload usage of an unmanned aerial vehicle may be provided, the computer-readable medium comprising: program instructions for receiving a signal indicative of a location dependent payload usage parameter; and program instructions for generating one or more UAV operation signals that effect payload operation in compliance with the payload usage parameters.
According to an embodiment of the system, the unmanned aerial vehicle may comprise: a payload; a communication module configured to receive one or more payload commands from a remote user; and a flight control unit configured to generate payload control signals delivered to the payload or a carrier supporting the payload, wherein the payload control signals are generated in accordance with one or more UAV operation signals, wherein the UAV operation signals are generated based on a location-dependent payload usage parameter.
The payload usage parameter may limit payload usage at one or more predetermined locations. As previously described, the payload may be an image capture device and the payload usage parameters may limit the operation of the image capture device at one or more predetermined locations. The payload usage parameter may limit recording of one or more images using the image capture device at one or more predetermined locations. The payload usage parameter may limit transmission of one or more images using the image capture device at one or more predetermined locations. In other embodiments, the payload may be an audio capture device, and the payload usage parameters limit operation of the audio capture device at one or more predetermined locations.
Alternatively or in combination, the payload usage parameters may allow payload usage at one or more predetermined locations. When the payload is an image capture device, the payload usage parameters may allow operation of the image capture device at one or more predetermined locations. The payload usage parameter may allow one or more images to be recorded using the image capture device at one or more predetermined locations. The payload usage parameters may allow transmission of one or more images using the image capture device at one or more predetermined locations. The payload may be an audio capture device and the payload usage parameters may allow operation of the audio capture device at one or more predetermined locations.
The one or more processors may also be configured, individually or collectively, to: receiving, using the first or second communication module, a signal indicative of a position of the unmanned aerial vehicle; and comparing the position of the UAV to the location-dependent payload usage parameter and determining whether the UAV is at a location that limits or allows payload operation. The location may be a restricted flight zone. The restricted flight zone may be determined by a regulator. The restricted flight zone may be within a predetermined distance from an airport, a public gathering place, government property, military property, a school, a private residence, a power plant, or any other area that may be designated as a restricted flight zone. The location may remain fixed at all times or may change over time.
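A position comparison of this kind might be sketched as follows, assuming a simple circular restricted zone and an equirectangular distance approximation; the function name payload_operation_allowed and the zone geometry are assumptions for the example.

import math

def payload_operation_allowed(uav_lat: float, uav_lon: float,
                              zone_lat: float, zone_lon: float, zone_radius_m: float) -> bool:
    """Return False when the UAV is inside the restricted zone, i.e. payload operation is limited."""
    metres_per_deg_lat = 111_320.0                        # equirectangular approximation
    dx = (uav_lon - zone_lon) * metres_per_deg_lat * math.cos(math.radians(zone_lat))
    dy = (uav_lat - zone_lat) * metres_per_deg_lat
    return math.hypot(dx, dy) > zone_radius_m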
A signal indicative of a location-dependent payload usage parameter may be received from a control entity. The control entity may be a regulator, an international organization, a corporation, or any other type of control entity as described elsewhere herein. The control entity may be a global organization, such as any of the organizations described elsewhere herein. The control entity may be a source off-board or on-board the unmanned aerial vehicle. The control entity may be an air management system off-board the unmanned aerial vehicle or any other part of an authentication system off-board the unmanned aerial vehicle. The control entity may be a database that is stored in a memory of the unmanned aerial vehicle or may be stored off-board the unmanned aerial vehicle. The database may be configured to be updatable. The control entity may be a transport device which may be positioned at a location where payload operation is restricted or allowed. In some cases, the control entity may be a geo-fencing device as described elsewhere herein. In some cases, the signal may be transmitted based on a user identifier indicating a user of the UAV and/or a UAV identifier indicating a type of the UAV.
One aspect of the present invention may relate to an unmanned aerial vehicle communication control system including: a first communication module; and one or more processors operatively coupled to the first communication module and individually or collectively configured to: receiving, using the first or second communication module, a signal indicative of a location-dependent communication usage parameter; and generating one or more UAV operation signals that enable operation of the UAV communication unit in compliance with the communication usage parameters.
Further, a method for wireless communication of a constrained unmanned aerial vehicle is provided, comprising: receiving a signal indicative of a location dependent communication usage parameter; and generating, by means of one or more processors, one or more UAV operation signals that enable operation of the communication unit in compliance with the communication usage parameters. Similarly, a non-transitory computer-readable medium containing program instructions for constraining wireless communication of an Unmanned Aerial Vehicle (UAV) may be provided, the computer-readable medium comprising: program instructions for receiving a signal indicative of a location dependent communication usage parameter; and program instructions for generating one or more UAV operation signals that enable operation of the communication unit in compliance with the communication usage parameters.
Additional aspects of the invention may include an unmanned aerial vehicle comprising: a communication unit configured to receive or transmit wireless communications; and a flight control unit configured to generate communication control signals delivered to a communication unit to enable operation of the communication unit, wherein the communication control signals are generated from one or more unmanned aerial vehicle operation signals, wherein the unmanned aerial vehicle operation signals are generated based on location-dependent communication usage parameters.
The communication usage parameter may limit wireless communication usage at one or more predetermined locations. The wireless communication may be a direct communication. The wireless communication may include radio frequency communication, WiFi communication, Bluetooth communication, or infrared communication. The wireless communication may be an indirect communication. The wireless communication may include 3G, 4G, or LTE communication. The communication usage may be limited by not allowing any wireless communication. The communication usage may be limited by allowing wireless communication usage within only selected frequency bands. The communication usage may be limited by allowing wireless communication usage only when the wireless communication usage does not interfere with higher-priority communications. In some cases, all other pre-existing wireless communications may have a higher priority than the unmanned aerial vehicle communications. For example, various wireless communications occurring near a flying unmanned aerial vehicle may be considered to have a higher priority than the communications of the unmanned aerial vehicle. In some cases, certain types of communications may be considered higher-priority communications, e.g., emergency services communications, government or official communications, medical device or service communications, etc. Alternatively or in combination, the communication usage parameters may allow wireless communication usage at one or more predetermined locations. For example, indirect communication may be allowed within a specified region, while direct communication is not allowed.
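As a non-limiting illustration, a location-dependent communication usage parameter might be applied as in the following sketch; the parameter dictionary keys (direct, indirect, prefer) and the function name select_communication_mode are assumptions made only for this example.

def select_communication_mode(param: dict, direct_available: bool) -> str:
    """param example: {"direct": True, "indirect": True, "prefer": "direct"}."""
    if not param.get("direct") and not param.get("indirect"):
        return "none"                                    # no communication allowed at this location
    if param.get("prefer") == "direct" and param.get("direct") and direct_available:
        return "direct"                                  # e.g. WiFi or radio frequency link
    if param.get("indirect"):
        return "indirect"                                # e.g. 3G, 4G, LTE or satellite link
    return "direct" if (param.get("direct") and direct_available) else "none"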
The one or more processors may also be configured, individually or collectively, to: receiving, using the first or second communication module, a signal indicative of a position of the unmanned aerial vehicle; and comparing the position of the unmanned aerial vehicle to the location-dependent communication usage parameter and determining whether the unmanned aerial vehicle is located at a location that limits or allows operation of the communication unit. The location may be a communication restricted zone. The communication restricted zone may be determined by a regulator or by a private party. The communication restricted zone may be within a predetermined distance from a private residence, an airport, a public gathering place, government property, military property, a school, a power plant, or any other area that may be designated as a communication restricted zone. The location may remain fixed at all times or may change over time.
The location may depend on the wireless communications existing within the region. For example, a particular region may be identified as a communication restricted region if operation of the communication unit would interfere with one or more existing wireless communications within the region. Operation of the UAV communication unit in compliance with the communication usage parameters may reduce electromagnetic interference or audio interference. For example, if surrounding electronic equipment is in use, certain operations of the unmanned aerial vehicle communication unit may interfere with that equipment, for example by interfering with its wireless signals. Operation of the UAV communication unit in compliance with the communication usage parameters may reduce or eliminate such interference. For example, operation of the unmanned aerial vehicle communication unit within a limited frequency band may not interfere with the operation or communications of surrounding electronic devices. In another case, suspension of operation of the unmanned aerial vehicle communication unit within the region may prevent interference with the operation or communications of surrounding electronic devices.
A signal indicative of a location-dependent communication usage parameter may be received from a control entity. The control entity may be a regulator, an international organization, a corporation, or any other type of control entity as described elsewhere herein. The control entity may be a global organization, such as any of the organizations described elsewhere herein. The control entity may be a source off-board or on-board the unmanned aerial vehicle. The control entity may be an air management system off-board the unmanned aerial vehicle or any other part of an authentication system off-board the unmanned aerial vehicle. The control entity may be a database, which may be stored in a memory of the unmanned aerial vehicle or may be stored off-board the unmanned aerial vehicle. The database may be configured to be updatable. The control entity may be a transport device positioned at a location where operation of the communication unit is restricted or allowed. In some cases, the control entity may be a geo-fencing device as described elsewhere herein. In some cases, the signal may be transmitted based on a user identifier indicating a user of the UAV and/or a UAV identifier indicating a type of the UAV.
Identification module
The unmanned aerial vehicle may include one or more propulsion units that may propel the unmanned aerial vehicle. In some cases, the propulsion unit may include a rotor assembly, which may include one or more motors that drive rotation of one or more rotor blades. The unmanned aerial vehicle may be a multi-rotor unmanned aerial vehicle that may include a plurality of rotor assemblies. The rotor blades, when rotating, may provide propulsion, such as lift, to the unmanned aerial vehicle. Each rotor blade of the unmanned aerial vehicle can rotate at the same speed or at different speeds. The operation of the rotor blades may be used to control the flight of the unmanned aerial vehicle. The operation of the rotor blades may be used to control takeoff and/or landing of the unmanned aerial vehicle. The operation of the rotor blades may be used to control the maneuvering of the unmanned aerial vehicle in the airspace.
The unmanned aerial vehicle may include a flight control unit. The flight control unit may generate one or more signals that may control the operation of the rotor assemblies. The flight control unit may generate one or more signals that control operation of one or more motors of a rotor assembly, which may in turn affect the rotational speed of the rotor blades. The flight control unit may receive data from one or more sensors. Data from the sensors may be used to generate one or more flight control signals to the rotor assemblies. Examples of sensors may include, but are not limited to, GPS units, inertial sensors, visual sensors, ultrasonic sensors, thermal sensors, magnetometers, or other types of sensors. The flight control unit may receive data from the communication unit. The data from the communication unit may comprise commands from a user. Commands may be input via a remote control and transmitted to the UAV. The data from the communication unit and/or the sensors may include detection of a geo-fencing device or information transmitted from a geo-fencing device. Data from the communication unit may be used to generate one or more flight control signals to the rotor assemblies.
In some embodiments, the flight control unit may control other functions of the UAV instead of or in addition to flight. The flight control unit may control operation of the payload on the UAV. For example, the payload may be an image capture device and the flight control unit may control operation of the image capture device. The flight control unit may control the positioning of the payload on the UAV. For example, the carrier may support a payload, such as an image capture device. The flight control unit may control operation of the carrier to control positioning of the payload. The flight control unit may control operation of one or more sensors on the UAV. This may include any of the sensors described elsewhere herein. The flight control unit may control communications of the unmanned aerial vehicle, navigation of the unmanned aerial vehicle, power usage of the unmanned aerial vehicle, or any other function on board the unmanned aerial vehicle.
FIG. 4 shows an example of a flight control unit according to an embodiment of the invention. The flight control unit 400 may include an identification module 410, one or more processors 420, and one or more communication modules 430. In some embodiments, the flight control unit of the unmanned aerial vehicle may be a circuit board, which may include one or more chips, such as one or more identification chips, one or more processor chips, and/or one or more communication chips.
The identification module 410 may be unique to the UAV. The identification module may be capable of uniquely identifying and distinguishing the UAV from other UAVs. The identification module may include an unmanned aerial vehicle identifier and a key for the unmanned aerial vehicle.
The UAV identifier stored in the identification module may not be alterable. The UAV identifier may be stored in the identification module in an unalterable state. The identification module may be a hardware component capable of storing the unique identifier in a manner that prevents a user from altering the unique UAV identifier of the UAV.
The UAV key may be configured to provide authentication verification of the UAV. The UAV key may be unique to the UAV. The UAV key may be an alphanumeric string that may be unique to the UAV and may be stored in the identification module. The UAV key may be randomly generated.
A UAV identifier and a UAV key may be used in combination to authenticate the UAV and to allow operation of the UAV. The unmanned aerial vehicle identifier and the unmanned aerial vehicle key may be authenticated using an authentication center. The authentication center may be off-board the unmanned aerial vehicle. The authentication center may be part of an authentication system as described elsewhere herein (e.g., authentication center 220 in fig. 2).
The UAV identifier and UAV key may be issued by an ID registration database (e.g., ID registration module 210 of FIG. 2) as described elsewhere herein. The ID registration database may be off-board the unmanned aerial vehicle. The identification module may be configured to receive the UAV identifier and UAV key once and no longer change after the initial receipt. Thus, the UAV identifier and UAV key may not be modifiable once they have been determined. In other cases, the UAV identifier and key may be fixed after receipt and may no longer be writable. Alternatively, the UAV identifier and UAV key may only be modifiable by an authorized party. A conventional handler of an unmanned aerial vehicle may not be able to alter or modify the unmanned aerial vehicle identifier and unmanned aerial vehicle key in the identification module.
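One possible way an ID registration database might issue a fixed identifier/key pair and an off-board authentication center might later verify it is sketched below. The HMAC-based challenge-response scheme, the class names, and the key format are illustrative assumptions, not the mechanism specified in this disclosure.

```python
# Sketch: issuing a UAV identifier/key pair once and authenticating the UAV off-board.
# The HMAC challenge-response scheme shown here is illustrative only.
import hashlib
import hmac
import os
import secrets
import string
from typing import Optional, Tuple

class IDRegistrationDatabase:
    """Issues identifier/key pairs and keeps a record for later authentication."""
    def __init__(self) -> None:
        self._records: dict = {}

    def issue(self) -> Tuple[str, str]:
        uav_id = "UAV-" + secrets.token_hex(4).upper()
        key = "".join(secrets.choice(string.ascii_uppercase + string.digits)
                      for _ in range(16))
        self._records[uav_id] = key          # identifier and key fixed at issuance
        return uav_id, key

    def lookup_key(self, uav_id: str) -> Optional[str]:
        return self._records.get(uav_id)

class AuthenticationCenter:
    def __init__(self, registry: IDRegistrationDatabase) -> None:
        self._registry = registry

    def challenge(self) -> bytes:
        return os.urandom(16)

    def verify(self, uav_id: str, challenge: bytes, response: bytes) -> bool:
        key = self._registry.lookup_key(uav_id)
        if key is None:
            return False
        expected = hmac.new(key.encode(), challenge, hashlib.sha256).digest()
        return hmac.compare_digest(expected, response)

def uav_respond(key: str, challenge: bytes) -> bytes:
    """On-board side: prove possession of the key without transmitting the key itself."""
    return hmac.new(key.encode(), challenge, hashlib.sha256).digest()

if __name__ == "__main__":
    registry = IDRegistrationDatabase()
    center = AuthenticationCenter(registry)
    uav_id, uav_key = registry.issue()   # values written once into the identification module
    c = center.challenge()
    print(center.verify(uav_id, c, uav_respond(uav_key, c)))  # True
```

A challenge-response exchange of this kind would let the UAV prove possession of its key without ever broadcasting the key itself.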
In some cases, the ID registration database may itself issue the identification module, which may then be manufactured into the unmanned aerial vehicle. The ID registration database may issue the identifier prior to or concurrently with the manufacture of the unmanned aerial vehicle. The ID registration database may issue the identifier before the unmanned aerial vehicle is sold or distributed.
The identification module may be implemented as a universal subscriber identity module (USIM). The identification module may be a write-once memory. The identification module may optionally be externally unreadable.
The identification module 410 may not be separable from the flight control unit 400. The identification module may not be removable from the rest of the flight control unit without compromising the functionality of the flight control unit. The identification module may not be removable from the rest of the flight control unit by hand; an individual may not be able to manually remove the identification module from the flight control unit.
The unmanned aerial vehicle may include a flight control unit configured to control operation of the unmanned aerial vehicle; and an identification module integrated into the flight control unit, wherein the identification module uniquely identifies the UAV from other UAVs. A method of identifying an unmanned aerial vehicle may be provided, the method comprising: controlling operation of the unmanned aerial vehicle using a flight control unit; and using an identification module integrated into the flight control unit to uniquely identify the UAV from the other UAVs.
The identification module may be physically coupled or attached to the flight control unit. The identification module may be integrated into the flight control unit. For example, the identification module may be a chip soldered to a circuit board of the flight control unit. Various physical techniques may be employed to prevent the identification module from separating from the rest of the flight control unit.
System In Package (SIP) technology may be employed. For example, multiple functional chips (including processors, communication modules, and/or identification modules) may be integrated into one package to perform a complete set of functions. If the identification module were to be separated, other modules in the package would be damaged, rendering the UAV unusable.
Fig. 5 shows an additional example of a flight control unit 500 according to an embodiment of the invention. A possible configuration using SIP technology is illustrated. The identification module 510 and the processor 520 may be packaged in the same chip. The identification module may not be separable from the processor, and any attempt to remove the identification module may remove or damage the processor, which in turn results in damage to the flight control unit. The identification module may be integrated with one or more other components of the flight control unit within one package of the same chip. In other examples, the identification module may be packaged in the same chip as the communication module 530. In some cases, the identification module, the processor, and the communication module may all be packaged within one chip.
Chip On Board (COB) packaging may be employed. The bare chip may be adhered to the interconnect substrate with a conductive or non-conductive adhesive, and wire bonding may then be performed to make the electrical connections; this is also referred to as soft encapsulation. The identification module may be soldered to a circuit board of the flight control unit. With COB packaging, once the identification module is soldered to the circuit board, it may not be removable intact. Attempts to physically remove the identification module will result in damage to the circuit board or other parts of the flight control unit.
Software may be used to ensure that the identification module is not separable from the rest of the flight control unit of the unmanned aerial vehicle. For example, each UAV may be implemented with a software version corresponding to its identification module. In other words, there may be a one-to-one correspondence between software versions and identification modules. The software version may be unique or substantially unique to the unmanned aerial vehicle. Proper operation of the software may require reading the UAV key stored in the identification module. The software version may not run without the corresponding UAV key. If the identification module is changed or removed, the software of the unmanned aerial vehicle cannot operate normally.
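The one-to-one binding between a software version and its identification module could, for example, be enforced by having the firmware refuse to start unless the key read from the identification module matches a digest baked into that build. The sketch below assumes this particular mechanism and uses hypothetical key values and function names.

```python
# Sketch: firmware that refuses to run without the UAV key from its own identification module.
import hashlib

# Digest of the expected key, baked into this software build (hypothetical value).
EXPECTED_KEY_DIGEST = hashlib.sha256(b"EXAMPLE-UAV-KEY-0001").hexdigest()

def read_key_from_identification_module() -> bytes:
    """Stand-in for reading the UAV key from the hardware identification module."""
    return b"EXAMPLE-UAV-KEY-0001"

def boot_firmware() -> bool:
    key = read_key_from_identification_module()
    if hashlib.sha256(key).hexdigest() != EXPECTED_KEY_DIGEST:
        print("Identification module missing or replaced: refusing normal operation.")
        return False
    print("Identification module verified: normal operation.")
    return True

if __name__ == "__main__":
    boot_firmware()
```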
In some embodiments, the identification module may be issued by a control entity. The control entity may be any entity that exercises a form of authority over the identification or operation of unmanned aerial vehicles. In some cases, the control entity may be a government agency or an operator authorized by the government. The government may be a national government, a state/province government, a municipality, or any form of local government. The control entity may be a governmental agency, such as the United States Federal Aviation Administration (FAA), the United States Federal Trade Commission (FTC), the United States Federal Communications Commission (FCC), the United States National Telecommunications and Information Administration (NTIA), the United States Department of Transportation (DoT), or the United States Department of Defense (DoD). The control entity may be a police force. The control entity may be a national or international organization or company. The control entity may be a manufacturer of the unmanned aerial vehicle or a dealer of the unmanned aerial vehicle.
FIG. 6 illustrates an example of a flight control unit tracking the identity of chips on the flight control unit according to an embodiment of the present invention. The flight control unit 600 may have an identification module 610 and one or more other chips (e.g., chip 1 620, chip 2 630, and so forth). The identification module may have a unique UAV identifier 612, a chip record 614, and one or more processors 616.
The identification module 610 may not be separable from the rest of the flight control unit 600. Alternatively, the identification module may be removable from the flight control unit. The identification module may uniquely identify the UAV from other UAVs via a unique UAV identifier 612.
The identification module may include a chip record 614, which may store a record of one or more peripheral chips 620, 630. Examples of other chips may include one or more processing chips, communication chips, or any other type of chip. The chip record may store any type of data about one or more peripheral chips, such as the type (e.g., model) of the peripheral chip, information about the chip manufacturer, the serial number of the chip, performance characteristics of the chip, or any other data about the chip. The record may be unique to a particular chip, may be unique to a type of chip, and/or may include parameters that are not necessarily unique to a chip or chip type. The chip record may be a memory unit.
When the UAV is started, the identification module may initiate a self-test, which may aggregate information about the peripheral chips and compare the aggregated information to the information stored in the chip record. The one or more processors 616 of the identification module may be used to perform the comparison. The identification module may check whether the peripheral chips are consistent with the chip record in the identification module, and thereby determine whether the identification module has been transplanted. For example, if the information currently collected during the self-test matches the initial chip record, the identification module has likely not been transplanted. If the information currently collected during the self-test procedure does not match the initial chip record, the identification module has likely been transplanted. An indication of whether the identification module has been transplanted, or of the likelihood that a transplant has occurred, may be provided to the user or another device. For example, when the initial chip record does not match the peripheral chip information at the time of the self-test, an alert may be sent to the user device or to the control entity.
In some embodiments, the chip record information may not change. The chip record may be stored in a write-once memory. The chip record may include information about the one or more peripheral chips collected the first time the unmanned aerial vehicle is turned on. The information about the peripheral chips may be hardwired into the chip record. Information about the peripheral chips may be provided by the manufacturer and may be built into the chip record. In some cases, the chip record may be externally unreadable.
In alternative embodiments, the chip record information may change. The chip record information may be updated each time a self-test procedure occurs. For example, information about the peripheral chips may be used to replace or supplement the existing record of the peripheral chips. A comparison may be made between the initial chip record and the chip information gathered during the self-test. If no changes are detected, it is likely that the identification module has not been transplanted. If a change is detected, the identification module has likely been transplanted. Similarly, an indication of whether a transplant has occurred may be provided.
For example, the initial chip record may show two peripheral chips, one of which is model X with serial number ABCD123 and the other of which is model Y with serial number DCBA321. A self-test procedure may occur. During the self-test procedure, information about the peripheral chips may be aggregated, which may show two chips, one of which is model X with serial number 12345FG and the other of which is model S with serial number HIJK987. Because the data does not match, a higher probability may be indicated that the identification module has been transplanted. The initial identification module, which would have recorded model X with serial number 12345FG and model S with serial number HIJK987 in its chip record, may have been removed. The initial identification module may have been replaced by a new identification module taken from a different unmanned aerial vehicle whose flight control unit has chips of model X with serial number ABCD123 and model Y with serial number DCBA321. The initial chip record may include a record of the unmanned aerial vehicle's peripheral chips from an initial manufacturer or initial configuration of the unmanned aerial vehicle, or from a previous operation of the unmanned aerial vehicle. Regardless, the difference may indicate that the identification module of the UAV has been transplanted since initial manufacture, initial configuration, or a previous operation.
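The self-test described above can be summarized in a short sketch in which the identification module compares the chip information it collects at power-on against its initial chip record and raises a warning on any mismatch. The chip models and serial numbers mirror the example in the preceding paragraph; the code itself is a hypothetical illustration.

```python
# Sketch: self-test comparing collected peripheral-chip info with the initial chip record.
from dataclasses import dataclass
from typing import Set

@dataclass(frozen=True)
class ChipInfo:
    model: str
    serial: str

# Initial record written when the identification module was first paired with its board.
INITIAL_CHIP_RECORD: Set[ChipInfo] = {ChipInfo("X", "ABCD123"), ChipInfo("Y", "DCBA321")}

def collect_peripheral_chip_info() -> Set[ChipInfo]:
    """Stand-in for querying the chips actually present on the flight control board."""
    return {ChipInfo("X", "12345FG"), ChipInfo("S", "HIJK987")}

def self_test() -> bool:
    """Return True if the module appears untouched, False if it was likely transplanted."""
    observed = collect_peripheral_chip_info()
    if observed == INITIAL_CHIP_RECORD:
        return True
    missing = INITIAL_CHIP_RECORD - observed
    unexpected = observed - INITIAL_CHIP_RECORD
    # A warning could be forwarded to the user device or the control entity here.
    print(f"WARNING: chip record mismatch; missing={missing}, unexpected={unexpected}")
    return False

if __name__ == "__main__":
    self_test()  # prints a warning: the module was likely moved from a different board
```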
Accordingly, an unmanned aerial vehicle may be provided, comprising: a flight control unit configured to control operation of an unmanned aerial vehicle, wherein the flight control unit includes an identification module and a chip, wherein the identification module is configured to (1) uniquely identify the unmanned aerial vehicle from among other unmanned aerial vehicles, (2) compile an initial record of the chip, and (3) aggregate information about the chip after compiling the initial record of the chip, wherein the identification module is configured to undergo a self-test procedure that compares the aggregated information about the chip with the initial record of the chip, and wherein the identification module is configured to provide a warning when the aggregated information about the chip is inconsistent with the initial record of the chip.
A method of identifying an unmanned aerial vehicle may include: controlling operation of an unmanned aerial vehicle using a flight control unit, wherein the flight control unit comprises an identification module and a chip; using the identification module to uniquely identify the UAV from other UAVs, wherein the identification module compiles an initial record of the chip; aggregating information about the chip after compiling the initial record of the chip; comparing, using the identification module, the aggregated information about the chip to the initial record of the chip, thereby undergoing a self-test procedure; and providing a warning when the aggregated information about the chip is inconsistent with the initial record of the chip.
The chip record may be an integral part of the identification module. The chip record may be inseparable from the rest of the identification module. In some cases, the chip record cannot be removed from the identification module without damaging the rest of the identification module and/or the flight control unit.
The self-test may occur automatically without any user input. The self-test procedure may be initiated automatically when the unmanned aerial vehicle is powered on. For example, once the unmanned aerial vehicle is turned on, a self-test procedure may be performed. The self-test procedure may be initiated automatically when the unmanned aerial vehicle begins flying. The self-test procedure may also be initiated automatically when the unmanned aerial vehicle is powered off. The self-test procedure may be initiated automatically at intervals (e.g., regular or irregular intervals) during operation of the UAV. The self-test procedure may also occur in response to a detected event or in response to user input.
In some embodiments, an authentication system may be involved in issuing the identification module. The authentication system may issue a physical identification module or data that may be provided in an identification module. The ID registration module and/or the certificate authority may be involved in issuing the identification module. A control entity may be involved in implementing the authentication system and in issuing the identification module. The control entity may be a particular government agency or an operator authorized by a government, or any other type of control entity as described elsewhere herein.
To prevent the unmanned aerial vehicle from being illegally retrofitted (e.g., with a new identification module or a new identifier), the authentication system (e.g., the authentication center) may require periodic inspections of the unmanned aerial vehicle. Once the unmanned aerial vehicle passes inspection and no tampering is detected, the authentication process may continue. The authentication process may uniquely identify the unmanned aerial vehicle and confirm that the unmanned aerial vehicle is the actual unmanned aerial vehicle identified by the identifier.
Identification for operation
A user of the unmanned aerial vehicle can be uniquely identified. The user may be uniquely identified by means of a user identifier. The user identifier may uniquely identify the user and may distinguish the user from other users. The user may be an operator of the unmanned aerial vehicle. The user may be an individual controlling the unmanned aerial vehicle. The user may control the flight of the unmanned aerial vehicle, control payload operation and/or placement of the unmanned aerial vehicle, control communications of the unmanned aerial vehicle, control one or more sensors of the unmanned aerial vehicle, control navigation of the unmanned aerial vehicle, control power usage of the unmanned aerial vehicle, or control any other function of the unmanned aerial vehicle.
The unmanned aerial vehicle can be uniquely identified. The unmanned aerial vehicle can be uniquely identified by means of an unmanned aerial vehicle identifier. The UAV identifier may uniquely identify the UAV and may distinguish the UAV from other UAVs.
In some cases, the user may be authorized to operate the unmanned aerial vehicle. One or more individual users may need to be identified before being able to operate the unmanned aerial vehicle. In some cases, all users, when identified, may be authorized to operate the unmanned aerial vehicle. Alternatively, only a selected group of users, when identified, may be authorized to operate the unmanned aerial vehicle. Some users may not be authorized to operate the unmanned aerial vehicle.
FIG. 8 illustrates a process that considers whether a user is authorized to operate the UAV before allowing the user to operate the UAV. The process may include receiving a user identifier 810 and receiving an unmanned aerial vehicle identifier 820. A determination may be made whether the user is authorized to operate unmanned aerial vehicle 830. If the user is not authorized to operate the UAV, the user is not allowed to operate the UAV 840. If the user is authorized to operate the UAV, the user is allowed to operate the UAV 850.
A user identifier 810 may be received. A user identifier may be received from a remote control. The user identifier may be received from a user input. The user identifier may be pulled from memory based on user input. User input may optionally be provided to a remote control or another device. In providing the user identifier, the user may log in or go through any authentication procedure. The user may manually enter the user identifier. The user identifier may be stored on the user device. The user identifier may be stored in memory without requiring the user to manually enter the user identifier.
An unmanned aerial vehicle identifier 820 can be received. The unmanned aerial vehicle identifier may be received from the UAV. The unmanned aerial vehicle identifier may be received from a user input. The UAV identifier may be pulled from memory based on user input. User input may optionally be provided to a remote control or another device. In providing the unmanned aerial vehicle identifier, the user may undergo an authentication process. Alternatively, the unmanned aerial vehicle may automatically undergo a self-identification or self-authentication procedure. The UAV identifier may be stored on the UAV or on a user device. The UAV identifier may be stored in memory without requiring a user to manually enter the UAV identifier. The UAV identifier may be stored on an identification module of the UAV. The unmanned aerial vehicle identifier for the unmanned aerial vehicle may optionally be unalterable.
The UAV may broadcast the UAV identifier during operation. The unmanned aerial vehicle identifier may be continuously broadcast. Alternatively, the UAV identifier may be broadcast upon request. The UAV identifier may be broadcast upon request by an air management system external to the UAV, an authentication system external to the UAV, or any other device. The UAV identifier may be broadcast when communication between the UAV and the air traffic system is to be encrypted or authenticated. In some cases, the UAV identifier may be broadcast in response to an event. For example, the UAV identifier may be automatically broadcast when the UAV is turned on. The UAV identifier may be broadcast during an initialization procedure. The UAV identifier may be broadcast during an authentication procedure. The UAV identifier may be broadcast via a wireless signal (e.g., a radio signal, a light signal, or an acoustic signal). The identifier may be broadcast using direct communication. Alternatively, the identifier may be broadcast using indirect communication.
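A simple broadcast policy covering the continuous, on-request, and event-driven cases described above might look like the following sketch; the function names are hypothetical and the radio transport is abstracted into a callback.

```python
# Sketch: broadcasting the UAV identifier continuously, on request, or on an event.
import time
from typing import Callable

def broadcast(uav_id: str, send: Callable[[str], None], mode: str = "continuous",
              interval_s: float = 1.0, cycles: int = 3) -> None:
    """Send the UAV identifier according to a simple broadcast policy."""
    if mode == "continuous":
        for _ in range(cycles):            # bounded here so the sketch terminates
            send(uav_id)
            time.sleep(interval_s)
    elif mode in ("on_request", "on_power_on", "on_authentication"):
        send(uav_id)                       # single transmission triggered by the event
    else:
        raise ValueError(f"unknown broadcast mode: {mode}")

if __name__ == "__main__":
    broadcast("UAV-1A2B3C4D", send=lambda msg: print("broadcasting", msg),
              mode="continuous", interval_s=0.1)
```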
The user identifier and/or the unmanned aerial vehicle identifier may be received by an authentication system. The user identifier and/or the unmanned aerial vehicle identifier may be received at an authentication center of the authentication system or at an air traffic system. The user identifier and/or the UAV identifier may be received by the UAV and/or a remote control of the UAV. The user identifier and/or the unmanned aerial vehicle identifier may be received at one or more processors, which may determine whether the user is authorized to operate the unmanned aerial vehicle.
A determination may be made, with the aid of one or more processors, as to whether the user is authorized to operate unmanned aerial vehicle 830. The determination may be made on-board the unmanned aerial vehicle or off-board the unmanned aerial vehicle. The determination may be made on or off the user's remote control. The determination may be made at a device separate from the UAV and/or the remote control. In some cases, the determination may be made at a component of the authentication system. The determination may be made at an authentication center of the authentication system (e.g., authentication center 220 as illustrated in fig. 2) or an air management system of the authentication system (e.g., air management system 230 as illustrated in fig. 2).
The determination may be made at a device or system that may generate one or more sets of flight controls. For example, the determination may be made at an air traffic system that may generate one or more sets of flight controls under which the unmanned aerial vehicle is to operate. The one or more sets of flight controls may depend on the location of the UAV or any other factor related to the UAV. The one or more sets of flight controls may be generated based on the user identifier and/or the unmanned aerial vehicle identifier.
The user identifier and the unmanned aerial vehicle identifier may be considered when determining whether the user is authorized to operate the unmanned aerial vehicle. In some cases, only the user identifier and the unmanned aerial vehicle identifier may be considered. Alternatively, additional information may be considered. Information about the user may be associated with the user identifier. For example, information about the user type (e.g., skill level, experience level, certification, license, training) may be associated with the user identifier. The user's flight history (e.g., where the user has flown, the types of UAV the user has flown, whether the user has been involved in any incidents) may be associated with the user identifier. Information about the unmanned aerial vehicle may be associated with the unmanned aerial vehicle identifier. For example, information about the type of unmanned aerial vehicle (e.g., model, manufacturer, characteristics, performance parameters, difficulty of operation level) may be associated with the unmanned aerial vehicle identifier. The flight history of the UAV (e.g., the locations the UAV has flown, the users who have previously operated the UAV) may also be associated with the UAV identifier. In determining whether the user is authorized to operate the unmanned aerial vehicle, information associated with the user identifier and/or the unmanned aerial vehicle identifier may be considered. In some cases, additional factors may be considered, such as geographic factors, timing factors, environmental factors, or any other type of factor.
Optionally, only a single user is authorized to operate the corresponding unmanned aerial vehicle. A one-to-one correspondence may be provided between authorized users and corresponding unmanned aerial vehicles. Alternatively, multiple users may be authorized to operate the unmanned aerial vehicle. A many-to-one correspondence may be provided between authorized users and corresponding unmanned aerial vehicles. The user may only be authorized to operate a single corresponding unmanned aerial vehicle. Alternatively, the user may be authorized to operate multiple unmanned aerial vehicles. A one-to-many correspondence may be provided between authorized users and a plurality of corresponding unmanned aerial vehicles. Multiple users may be authorized to operate multiple corresponding unmanned aerial vehicles. A many-to-many correspondence may be provided between authorized users and a plurality of corresponding unmanned aerial vehicles.
In some cases, the user may be pre-registered to operate the unmanned aerial vehicle. For example, only users that are pre-registered to operate the unmanned aerial vehicle may be authorized to operate the unmanned aerial vehicle. The user may be a registered owner of the unmanned aerial vehicle. When a user purchases or receives an unmanned aerial vehicle, the user may register as the owner and/or operator of the unmanned aerial vehicle. In some cases, multiple users may be able to register as owners and/or operators of unmanned aerial vehicles. Alternatively, only a single user may be able to register as the owner and/or operator of the UAV. The single user may be able to specify one or more other users that are allowed to operate the unmanned aerial vehicle. In some cases, only users having a user identifier that have registered to operate the unmanned aerial vehicle may be authorized to operate the unmanned aerial vehicle. One or more registration databases may store information about registered users that are allowed to operate the unmanned aerial vehicle. The registration database may be onboard the UAV or offboard the UAV. The user identifier may be compared to information in the registration database and the user may be allowed to operate the unmanned aerial vehicle only if the user identifier matches a user identifier associated with the unmanned aerial vehicle in the registration database. The registration database may be specific to the unmanned aerial vehicle. For example, the first user may be pre-registered to operate the unmanned aerial vehicle 1, but may not be pre-registered to operate the unmanned aerial vehicle 2. The user may then be allowed to operate the unmanned aerial vehicle 1, but may not be allowed to operate the unmanned aerial vehicle 2. In some cases, the registration database may be specific to the type of unmanned aerial vehicle (e.g., all unmanned aerial vehicles having a particular model).
In other cases, the registration database may be open, independent of the unmanned aerial vehicle. For example, the user may be pre-registered as an operator of the unmanned aerial vehicle. The user may be allowed to maneuver any unmanned aerial vehicle as long as those particular unmanned aerial vehicles do not have any other requirements for authorization.
Alternatively, the UAV may allow all users to operate the UAV by default. All users may be authorized to operate the UAV. In some cases, all users not on a "blacklist" may be authorized to operate the UAV. Thus, when it is determined whether the user is authorized to operate the unmanned aerial vehicle, the user may be authorized to operate the unmanned aerial vehicle as long as the user is not on the blacklist. One or more blacklist databases may store information about users who are not allowed to operate the unmanned aerial vehicle. The blacklist database may store user identifiers of users who are not allowed to operate the unmanned aerial vehicle. The blacklist database may be on-board the unmanned aerial vehicle or off-board the unmanned aerial vehicle. The user identifier may be compared to information in the blacklist database, and the user may be allowed to operate the unmanned aerial vehicle only if the user identifier does not match a user identifier in the blacklist database. The blacklist registration may be specific to the unmanned aerial vehicle or to the type of unmanned aerial vehicle. For example, a user may be blacklisted from maneuvering a first UAV, but not blacklisted from flying a second UAV. The blacklist registration may be specific to the type of unmanned aerial vehicle. For example, a user may not be allowed to maneuver certain models of unmanned aerial vehicles, but may be allowed to maneuver other models of unmanned aerial vehicles. Alternatively, the blacklist registration need not be specific to the unmanned aerial vehicle or the type of unmanned aerial vehicle. The blacklist registration may be applicable to all unmanned aerial vehicles. For example, if a user is prohibited from operating any unmanned aerial vehicle, the user may not be authorized to operate the unmanned aerial vehicle, regardless of the unmanned aerial vehicle's identity or type, and may not be allowed to operate the unmanned aerial vehicle.
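The pre-registration and blacklist checks described above could be combined into a single authorization routine, as in the sketch below. The identifiers, the in-memory databases, and the wildcard convention for a global blacklist entry are illustrative assumptions.

```python
# Sketch: deciding whether an identified user is authorized to operate an identified UAV.
# Registration and blacklist databases are simple in-memory sets of (user_id, uav_id) pairs;
# "*" stands for "any UAV" in a blacklist entry.
from typing import Set, Tuple

REGISTRATION_DB: Set[Tuple[str, str]] = {("USER-001", "UAV-1A2B3C4D")}
BLACKLIST_DB: Set[Tuple[str, str]] = {("USER-666", "UAV-1A2B3C4D"), ("USER-777", "*")}

def is_authorized(user_id: str, uav_id: str, mode: str = "pre_registration") -> bool:
    """Return True if the identified user may operate the identified UAV."""
    if (user_id, uav_id) in BLACKLIST_DB or (user_id, "*") in BLACKLIST_DB:
        return False
    if mode == "open":
        return True                                   # anyone not blacklisted may fly
    if mode == "pre_registration":
        return (user_id, uav_id) in REGISTRATION_DB   # only registered pairs may fly
    raise ValueError(f"unknown authorization mode: {mode}")

if __name__ == "__main__":
    print(is_authorized("USER-001", "UAV-1A2B3C4D"))           # True (pre-registered)
    print(is_authorized("USER-002", "UAV-1A2B3C4D"))           # False (not registered)
    print(is_authorized("USER-777", "UAV-1A2B3C4D", "open"))   # False (blacklisted for all UAVs)
```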
The pre-registration or blacklist registration may also depend on factors other than the unmanned aerial vehicle or the type of unmanned aerial vehicle. For example, pre-registration or blacklist registration may be applicable to a particular location or jurisdiction. For example, a user may be pre-registered to operate an unmanned aerial vehicle in a first jurisdiction and not pre-registered to operate an unmanned aerial vehicle in a second jurisdiction. This may or may not depend on the identity or type of the unmanned aerial vehicle itself. In another example, pre-registration or blacklist registration may be applicable to particular climate conditions. For example, when the wind speed exceeds 30 mph, the user may be blacklisted from operating the unmanned aerial vehicle. In another example, other environmental conditions may be considered, such as environmental complexity, population density, or air traffic flow.
Additional considerations for whether the user is authorized to operate the unmanned aerial vehicle may depend on the user type. For example, a user's skill or experience level may be considered in determining whether the user is authorized to operate the unmanned aerial vehicle. Information about the user, such as the user type, may be associated with the user identifier. When considering whether the user is authorized to operate the unmanned aerial vehicle, information about the user, such as the user type, may be considered. In one example, a user may be authorized to operate the unmanned aerial vehicle only when the user has reached a threshold skill level. For example, if a user has undergone training in unmanned aerial vehicle flight, the user may be authorized to operate the unmanned aerial vehicle. As another example, a user may be authorized to operate an unmanned aerial vehicle if the user provides certification that the user has certain flight skills. In another example, a user may be authorized to operate the unmanned aerial vehicle only when the user has reached a threshold level of experience. For example, a user may be authorized to operate an unmanned aerial vehicle if the user has logged at least a threshold amount of flight time. In some cases, the threshold may apply to flight time on any UAV, or only to flight time on UAVs of a type matching the UAV to be operated. The information about the user may include demographic information about the user. For example, a user may be authorized to operate an unmanned aerial vehicle only when the user has reached a threshold age (e.g., is an adult). In determining whether the user is authorized to operate the unmanned aerial vehicle, information about the user and/or the unmanned aerial vehicle may be retrieved and considered with the aid of the one or more processors. One or more of these considerations may be made in accordance with instructions stored in a non-transitory computer-readable medium.
As previously described, additional factors, such as geographic factors, temporal factors, or environmental factors, may be considered in determining whether the user is authorized to operate the unmanned aerial vehicle. For example, only some users may be authorized to operate the unmanned aerial vehicle during the night, while other users may be authorized to operate the unmanned aerial vehicle only during the day. In one example, users who have undergone night flight training may be authorized to operate the unmanned aerial vehicle during the day and night, while users who have not undergone night flight training may only be authorized to operate the unmanned aerial vehicle during the day.
In some cases, different modes of unmanned aerial vehicle authentication may be provided. For example, in a pre-registration mode, only pre-registered users may be authorized to operate the unmanned aerial vehicle. In an open mode, all users may be authorized to maneuver the unmanned aerial vehicle. In a skill-based mode, only users who have demonstrated a certain skill or experience level may be allowed to maneuver the unmanned aerial vehicle. In some cases, a single mode may be provided for user authentication. In other cases, the user may switch between modes of operation. For example, the owner of the unmanned aerial vehicle may switch the authentication mode under which the unmanned aerial vehicle is to be operated. In some cases, other factors such as the location of the unmanned aerial vehicle, the time, air traffic levels, environmental conditions, and so forth may determine the authentication mode under which the unmanned aerial vehicle is to be operated. For example, if environmental conditions are windy or otherwise difficult for flight, the UAV may automatically allow only users authorized under the skill-based mode to maneuver the UAV.
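The selection of an authentication mode as a function of conditions, together with a skill-based check, might be expressed as in the following sketch; the thresholds and profile fields are arbitrary assumptions chosen only to illustrate the idea.

```python
# Sketch: picking an authentication mode from current conditions, plus a skill-based check.
from typing import Dict

def select_mode(wind_speed_mph: float, air_traffic_level: float,
                owner_choice: str = "open") -> str:
    """Force the skill-based mode under difficult conditions, otherwise use the owner's choice."""
    if wind_speed_mph > 30 or air_traffic_level > 0.8:
        return "skill_based"
    return owner_choice                     # e.g. "open" or "pre_registration"

def skill_check(user_profile: Dict[str, float], min_hours: float = 10.0,
                night: bool = False) -> bool:
    """Require logged flight time and, for night flight, completed night training (1.0 = yes)."""
    if user_profile.get("logged_hours", 0.0) < min_hours:
        return False
    if night and not user_profile.get("night_training", 0.0):
        return False
    return True

if __name__ == "__main__":
    print(select_mode(wind_speed_mph=35, air_traffic_level=0.2))                  # skill_based
    print(skill_check({"logged_hours": 25.0, "night_training": 0.0}, night=True)) # False
```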
When the user is not authorized to operate the unmanned aerial vehicle, the user is not allowed to operate the unmanned aerial vehicle 840. In some cases, this may result in the unmanned aerial vehicle not responding to commands from the user and/or the user's remote control. The user may not be able to fly or control the flight of the UAV. The user may not be able to control any other component of the unmanned aerial vehicle, such as a payload, a carrier, a sensor, a communication unit, a navigation unit, or a power unit. The user may or may not be able to power on the UAV. In some cases, a user may power on an unmanned aerial vehicle, but the unmanned aerial vehicle may not be responsive to the user. If the user is unauthorized, the UAV may optionally power itself down. In some cases, a warning or message may be provided to the user that the user is not authorized to operate the unmanned aerial vehicle. The reason why the user is not authorized may or may not be provided. Optionally, a warning or message may be provided to a second user that the user is not authorized to operate the UAV or that the user has attempted to operate the UAV. The second user may be an owner or operator of the unmanned aerial vehicle. The second user may be an individual authorized to operate the unmanned aerial vehicle. The second user may be an individual exercising control of the unmanned aerial vehicle.
In some alternative embodiments, when a user is not authorized to operate the UAV, the user may be allowed to operate the UAV only in a restricted manner. This may include geographic limitations, time limitations, speed limitations, or limitations on the use of one or more additional components (e.g., payload, carrier, sensors, communication units, navigation units, power units, etc.). This may also include restrictions on the mode of operation. In one example, when a user is not authorized to operate the unmanned aerial vehicle, the user may not operate the unmanned aerial vehicle at a selected location. In another example, when a user is not authorized to operate the unmanned aerial vehicle, the user may operate the unmanned aerial vehicle only at selected locations.
When a user is authorized to operate the unmanned aerial vehicle, the user may be allowed to operate the unmanned aerial vehicle 850. The unmanned aerial vehicle may respond to commands from the user and/or the user's remote control. The user may be able to control the flight of the UAV or any other component of the UAV. The user may manually control the unmanned aerial vehicle through user input via a remote control. In some cases, the unmanned aerial vehicle may automatically override the user input to comply with a set of flight controls. The set of flight controls may be pre-established or may be received in flight. In some cases, one or more geo-fencing devices may be used in establishing or providing the set of flight controls.
Aspects of the invention may relate to a method of operating an unmanned aerial vehicle. The method may include: receiving an unmanned aerial vehicle identifier that uniquely identifies the unmanned aerial vehicle from among other unmanned aerial vehicles; receiving a user identifier that uniquely identifies the user from among other users; evaluating, with the aid of one or more processors, whether a user identified by the user identifier is authorized to operate an unmanned aerial vehicle identified by the unmanned aerial vehicle identifier; and allowing the user to operate the UAV when the user is authorized to operate the UAV. Similarly, a non-transitory computer-readable medium containing program instructions for operating an unmanned aerial vehicle may be provided, the computer-readable medium comprising: program instructions for receiving an unmanned aerial vehicle identifier that uniquely identifies the unmanned aerial vehicle from among other unmanned aerial vehicles; program instructions for receiving a user identifier that uniquely identifies the user from among other users; program instructions for evaluating whether a user identified by the user identifier is authorized to operate an unmanned aerial vehicle identified by the unmanned aerial vehicle identifier; and program instructions for allowing the user to operate the UAV when the user is authorized to operate the UAV.
In addition, there may be provided an unmanned aerial vehicle authorization system comprising: a first communication module; and one or more processors operatively coupled to the first communication module and individually or collectively configured to: receiving an unmanned aerial vehicle identifier that uniquely identifies the unmanned aerial vehicle from among other unmanned aerial vehicles; receiving a user identifier that uniquely identifies the user from among other users; evaluating whether a user identified by the user identifier is authorized to operate an unmanned aerial vehicle identified by the unmanned aerial vehicle identifier; and transmitting a signal to allow the user to operate the UAV when the user is authorized to operate the UAV. An Unmanned Aerial Vehicle (UAV) authorization module may include: one or more processors individually or collectively configured to: receiving an unmanned aerial vehicle identifier that uniquely identifies the unmanned aerial vehicle from among other unmanned aerial vehicles; receiving a user identifier that uniquely identifies the user from among other users; evaluating whether a user identified by the user identifier is authorized to operate an unmanned aerial vehicle identified by the unmanned aerial vehicle identifier; and transmitting a signal to allow the user to operate the UAV when the user is authorized to operate the UAV.
The second user may be able to take over control of the UAV from the first user. In some cases, both the first user and the second user may be authorized to operate the UAV. Alternatively, only the second user may be authorized to operate the UAV. The first user may be authorized to operate the unmanned aerial vehicle in a more limited manner than the second user. The second user may be authorized to operate the unmanned aerial vehicle in a less restricted manner than the first user. One or more levels of operation may be provided. A higher operation level may indicate a higher priority with which the user may operate the unmanned aerial vehicle. For example, in operating an unmanned aerial vehicle, users at a higher level of operation may be given priority over users at a lower level of operation. A user at a higher level of operation may be able to take over control of the unmanned aerial vehicle from a user at a lower level of operation. In some cases, the second user may be at a higher level of operation than the first user. A user at a higher level of operation may optionally be authorized to operate the unmanned aerial vehicle in a less restricted manner than a user at a lower level of operation. A user at a lower level of operation may optionally be authorized to operate the UAV in a more limited manner than a user at a higher level of operation. When the second user is authorized to operate the unmanned aerial vehicle and has a higher level of operation than the first user, the second user may take over operation of the unmanned aerial vehicle from the first user.
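A minimal sketch of the operation-level priority scheme follows: a take-over request succeeds only when the requesting user is authorized and holds a strictly higher operation level than the current operator. The user names and level values are hypothetical.

```python
# Sketch: a higher-operation-level user taking over control from a lower-level one.
from typing import Dict, Set

OPERATION_LEVELS: Dict[str, int] = {"USER-OWNER": 1, "USER-POLICE": 3}  # higher = more priority

def can_take_over(current_user: str, requesting_user: str, authorized: Set[str]) -> bool:
    """Allow take-over only for an authorized user with a strictly higher operation level."""
    if requesting_user not in authorized:
        return False
    return OPERATION_LEVELS.get(requesting_user, 0) > OPERATION_LEVELS.get(current_user, 0)

if __name__ == "__main__":
    authorized = {"USER-OWNER", "USER-POLICE"}
    print(can_take_over("USER-OWNER", "USER-POLICE", authorized))  # True
    print(can_take_over("USER-POLICE", "USER-OWNER", authorized))  # False
```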
When the unmanned aerial vehicle authenticates the second user's privilege to operate the unmanned aerial vehicle, operation of the unmanned aerial vehicle by the second user may be allowed. The authentication may occur by means of a digital signature and/or digital certificate which verifies the identity of the second user. Authentication of the second user and/or the first user may occur using any authentication process as described elsewhere herein.
In some embodiments, the second user who may take over control may be part of an emergency service. For example, the second user may be part of a law enforcement agency, a fire service, a medical service, or a disaster relief service. The second user may be an electronic police officer. In some cases, the second user may be part of a government agency, such as an agency that regulates air traffic flow or other types of traffic flow. The second user may be an operator of an air traffic control system. The second user may be a member or administrator of the authentication system. The second user may be a member of a national defense force or a quasi-national defense force. For example, the second user may be a member of the United States Air Force, the United States Coast Guard, the United States National Guard, the Chinese Armed Police Force (CAPF), or any other type of defense force or its equivalent in any jurisdiction of the world.
The first user may be notified when the second user takes over control. For example, a warning or message may be provided to the first user. The warning or message may be provided via the first user's remote control. The warning may be visually displayed, or may be audibly or tactilely discernible. In some implementations, the second user may request to take over control of the unmanned aerial vehicle from the first user. The first user may choose to accept or reject the request. Alternatively, the second user may be able to take over control without acceptance or permission from the first user. In some embodiments, there may be some lag time between the time the first user is alerted that the second user is about to take over control and the time the second user takes over control. Alternatively, little or no lag time may be provided, so that the second user may be able to take over immediately. The second user may be able to take over control within less than 1 minute, 30 seconds, 15 seconds, 10 seconds, 5 seconds, 3 seconds, 2 seconds, 1 second, 0.5 seconds, 0.1 seconds, 0.05 seconds, or 0.01 seconds of attempting to take over control.
The second user may take over control from the first user in response to any scenario. In some embodiments, the second user may take over control when the unmanned aerial vehicle enters a restricted area. When the UAV exits the restricted area, control may be returned to the first user. The second user may operate the unmanned aerial vehicle while the unmanned aerial vehicle is within the restricted area. In another case, the second user may be able to take over control of the unmanned aerial vehicle at any time. In some cases, the second user may be able to take over control of the unmanned aerial vehicle when a safety or security threat is determined. For example, if it is detected that the unmanned aerial vehicle is traveling on a course that would result in a collision with an aircraft, the second user may be able to take over control to prevent the unmanned aerial vehicle from colliding with the aircraft.
Authentication
The user of the unmanned aerial vehicle may be authenticated. The user may be uniquely identified by means of a user identifier. The user identifier may be authenticated to verify that the user is indeed the user associated with the user identifier. For example, if a user self-identifies using a user identifier associated with Bob Smith, the user may be authenticated to confirm that the user is indeed Bob Smith.
The unmanned aerial vehicle may be authenticated. The unmanned aerial vehicle can be uniquely identified by means of an unmanned aerial vehicle identifier. The UAV identifier may be authenticated to verify that the UAV is indeed the UAV associated with the UAV identifier. For example, if an unmanned aerial vehicle self-identifies using an unmanned aerial vehicle identifier associated with unmanned aerial vehicle ABCD1234, the unmanned aerial vehicle may be authenticated to confirm that the unmanned aerial vehicle is indeed unmanned aerial vehicle ABCD1234.
In some cases, the user may be authorized to operate the unmanned aerial vehicle. One or more individual users may need to be identified before being able to operate the unmanned aerial vehicle. The identity of the user may need to be authenticated as genuinely belonging to that individual in order to allow the user to operate the unmanned aerial vehicle. Before the user is allowed to operate the unmanned aerial vehicle, the user's identity may first be authenticated and confirmed, and the authenticated identity must be authorized to operate the unmanned aerial vehicle.
Fig. 9 illustrates a process of determining whether to allow a user to operate the unmanned aerial vehicle according to an embodiment of the present invention. The process may include authenticating the user 910 and authenticating the unmanned aerial vehicle 920. If the user does not pass the authentication process, the user may not be allowed to operate the unmanned aerial vehicle 940. If the UAV does not pass the authentication process, the user may not be allowed to operate the UAV 940. A determination may be made as to whether the user is authorized to operate the unmanned aerial vehicle 930. If the user is not authorized to operate the unmanned aerial vehicle, the user may not be allowed to operate the unmanned aerial vehicle 940. If the user passes the authentication process, the user may be allowed to operate the unmanned aerial vehicle 950. If the UAV passes the authentication process, the user may be allowed to operate the UAV 950. If the user is authorized to operate the unmanned aerial vehicle, the user may be allowed to operate the unmanned aerial vehicle 950. In some cases, both the user and the UAV must pass an authentication process before the user is allowed to operate the UAV 950. Optionally, before the user is allowed to operate the unmanned aerial vehicle 950, both the user and the unmanned aerial vehicle must pass an authentication process and the user must be authorized to operate the unmanned aerial vehicle.
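The overall decision of Fig. 9 can be condensed into a short sketch in which operation is permitted only when the UAV is authenticated, the user is authenticated, and the user is authorized for that UAV. The individual check functions are placeholders for the mechanisms described elsewhere herein.

```python
# Sketch of the Fig. 9 decision: authenticate the UAV, authenticate the user,
# then check authorization before allowing operation.
def authenticate_uav(uav_id: str) -> bool:
    return uav_id == "UAV-1A2B3C4D"        # placeholder for identifier/key authentication

def authenticate_user(user_id: str) -> bool:
    return user_id.startswith("USER-")      # placeholder for user identity verification

def is_authorized(user_id: str, uav_id: str) -> bool:
    return (user_id, uav_id) in {("USER-001", "UAV-1A2B3C4D")}  # placeholder database

def may_operate(user_id: str, uav_id: str) -> bool:
    return (authenticate_uav(uav_id)
            and authenticate_user(user_id)
            and is_authorized(user_id, uav_id))

if __name__ == "__main__":
    print(may_operate("USER-001", "UAV-1A2B3C4D"))  # True
    print(may_operate("USER-002", "UAV-1A2B3C4D"))  # False: authenticated but not authorized
```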
In some cases, the permission to operate the unmanned aerial vehicle may apply in any situation, or may only apply within one or more allotted volumes or regions. For example, the user and/or unmanned aerial vehicle may need to be authenticated before the unmanned aerial vehicle can be fully operated. In other cases, the user may generally be able to operate the unmanned aerial vehicle, but within a selected airspace, such as a restricted area, the user may need to be authenticated to operate the unmanned aerial vehicle.
One aspect of the invention may relate to a method of operating an unmanned aerial vehicle, the method comprising: authenticating an identity of an unmanned aerial vehicle, wherein the identity of the unmanned aerial vehicle is uniquely distinguishable from other unmanned aerial vehicles; authenticating an identity of a user, wherein the identity of the user is uniquely distinguishable from other users; evaluating, with the aid of one or more processors, whether the user is authorized to operate the UAV; and allowing the user to operate the UAV when the user is authorized to operate the UAV and both the UAV and the user are authenticated. Similarly, a non-transitory computer-readable medium containing program instructions for operating an unmanned aerial vehicle may be provided, the computer-readable medium comprising: program instructions for authenticating an identity of an unmanned aerial vehicle, wherein the identity of the unmanned aerial vehicle is uniquely distinguishable from other unmanned aerial vehicles; program instructions for authenticating an identity of a user, wherein the identity of the user is uniquely distinguishable from other users; program instructions for evaluating, with the aid of one or more processors, whether the user is authorized to operate the UAV; and program instructions for allowing the user to operate the UAV when the user is authorized to operate the UAV and both the UAV and the user are authenticated.
Moreover, the systems and methods provided herein may include an unmanned aerial vehicle authentication system comprising: a first communication module; and one or more processors operatively coupled to the first communication module and individually or collectively configured to: authenticating an identity of an unmanned aerial vehicle, wherein the identity of the unmanned aerial vehicle is uniquely distinguishable from other unmanned aerial vehicles; authenticating an identity of a user, wherein the identity of the user is uniquely distinguishable from other users; evaluating whether the user is authorized to operate the UAV; and transmitting a signal to allow the user to operate the UAV when the user is authorized to operate the UAV and both the UAV and the user are authenticated. An unmanned aerial vehicle authentication module can include: one or more processors individually or collectively configured to: authenticating an identity of an unmanned aerial vehicle, wherein the identity of the unmanned aerial vehicle is uniquely distinguishable from other unmanned aerial vehicles; authenticating an identity of a user, wherein the identity of the user is uniquely distinguishable from other users; evaluating whether the user is authorized to operate the UAV; and transmitting a signal to allow the user to operate the UAV when the user is authorized to operate the UAV and both the UAV and the user are authenticated.
The user identifier and/or the unmanned aerial vehicle identifier may be collected in any manner described elsewhere herein. For example, during flight, an unmanned aerial vehicle can broadcast its identity information in a continuous manner or whenever necessary. For example, the unmanned aerial vehicle may broadcast the user identifier when a monitoring instruction is received from an air traffic system (e.g., the police) or when communications between the unmanned aerial vehicle and the air traffic system are to be encrypted and authenticated. The broadcast of the identification information can be implemented in various ways (e.g., radio signals, optical signals, acoustic signals, or any other type of direct or indirect communication method as described elsewhere herein).
Any technique known or later developed in the art may be used to authenticate the user and/or the unmanned aerial vehicle. Further details and examples of user and/or unmanned aerial vehicle authentication are provided elsewhere herein.
Unmanned aerial vehicle
The unmanned aerial vehicle can be authenticated by means of a key of the unmanned aerial vehicle. The unmanned aerial vehicle can have a unique unmanned aerial vehicle identifier. The unmanned aerial vehicle may further be authenticated by means of the unmanned aerial vehicle identifier. The UAV identifier and the UAV key information may be used in combination to authenticate the UAV. The unmanned aerial vehicle identifier and/or the unmanned aerial vehicle key may be provided on the unmanned aerial vehicle. The UAV identifier and/or key may be part of an identification module of the UAV. The identification module may be part of a flight control unit of the unmanned aerial vehicle. As described elsewhere herein, the identification module may not be separable from the flight control unit. The UAV identifier and/or UAV key may be non-removable from the UAV. The UAV identifier and UAV key on board the UAV may not be disassociated from the UAV. In a preferred embodiment, the UAV identifier and/or UAV key may not be erased or altered.
Further description of unmanned aerial vehicle certification is provided elsewhere herein. Further details of how unmanned aerial vehicle authentication using an unmanned aerial vehicle identifier and/or key may occur are provided in greater detail elsewhere herein.
According to some embodiments of the invention, the authentication center may not authenticate any UAV that lacks a UAV identifier and UAV key. If the UAV identifier or UAV key is lost, the UAV may not successfully access the air management system and may not perform any activities within a flight-restricted area. In some cases, a user may not be allowed to fully operate the UAV if the UAV identity is not authenticated. Alternatively, a user may not be allowed to operate the unmanned aerial vehicle within a restricted airspace, but may be allowed to operate the unmanned aerial vehicle in other regions. Any violations by such unmanned aerial vehicles can be deterred and penalized.
In certain specific environments, the unmanned aerial vehicle and the user may directly start the flight mission without authentication. For example, when a communication connection cannot be established between the unmanned aerial vehicle, the user, and the authentication center, the user may still be allowed to start the task. In some cases, authentication of the user and/or the unmanned aerial vehicle may occur if a communication connection is established during the mission. Responsive action may be taken if the user and/or the UAV are not authenticated. For example, the unmanned aerial vehicle may land after a predetermined period of time. In another case, the unmanned aerial vehicle may return to the flight origin. If the user and/or the UAV is authenticated, the user may be able to continue operating the UAV in an uninterrupted manner. In some cases, authentication of the user and/or the unmanned aerial vehicle does not occur even if a communication connection is established during the mission.
Authentication may occur later in the task if the task has been initiated without authentication. Depending on whether authentication is passed, a flight response action may or may not be taken. Alternatively, authentication may not occur later in the task even when authentication becomes possible. The determination of whether to proceed with authentication during the task may be made based on any number of factors. For example, one or more environmental conditions may be considered. For example, the environmental climate, topography, population density, air or surface traffic flow, environmental complexity, or any other environmental condition may be considered. For example, the unmanned aerial vehicle may be able to determine (e.g., by means of GPS and maps) whether it is located in a city or a suburban area, and may not need to be authenticated if it is located in a suburban area. Thus, authentication may be required when there is a higher population density, and may not be required when there is a lower population density. Authentication may be required when the population density exceeds a population density threshold. In another case, authentication may be required when there is a higher air traffic flow density and may not be required when there is a lower air traffic flow density. Authentication may be required when the air traffic flow density exceeds a threshold. Similarly, authentication may be required when the environmental complexity (e.g., a higher number or density of surrounding objects) is high, and may not be required when the environmental complexity is low. Authentication may be required when the environmental complexity exceeds a threshold. Other types of factors may be considered in determining whether authentication is required, such as geography, time, and any other factors described elsewhere herein.
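The decision of whether authentication is required during a task, based on factors such as population density, air traffic flow density, and environmental complexity, can be expressed as simple threshold checks, as in the sketch below; the thresholds and units are arbitrary illustrations rather than values taken from this disclosure.

```python
# Sketch: deciding whether authentication is required based on environmental factors.
from dataclasses import dataclass

@dataclass
class Environment:
    population_density: float   # people per km^2
    air_traffic_density: float  # illustrative 0..1 score of air traffic flow
    complexity: float           # illustrative 0..1 score of surrounding-object density

# Arbitrary illustrative thresholds.
POPULATION_THRESHOLD = 1000.0
TRAFFIC_THRESHOLD = 0.5
COMPLEXITY_THRESHOLD = 0.6

def authentication_required(env: Environment) -> bool:
    return (env.population_density > POPULATION_THRESHOLD
            or env.air_traffic_density > TRAFFIC_THRESHOLD
            or env.complexity > COMPLEXITY_THRESHOLD)

if __name__ == "__main__":
    suburb = Environment(population_density=200, air_traffic_density=0.1, complexity=0.2)
    city = Environment(population_density=5000, air_traffic_density=0.7, complexity=0.8)
    print(authentication_required(suburb))  # False
    print(authentication_required(city))    # True
```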
When an unmanned aerial vehicle flies without certification, its flight capability may be limited. The unmanned aerial vehicle may be restricted from flying according to a set of flight controls. The set of flight controls may include one or more rules that may affect the operation of the UAV. In some embodiments, the unmanned aerial vehicle may be restricted according to the set of flight controls only when it is flying without certification; if the unmanned aerial vehicle is certified, no set of flight controls may be imposed. Alternatively, during normal operation, the UAV may be restricted based on a set of flight restrictions, and an additional set of flight restrictions may be imposed if the UAV is not authenticated. In some embodiments, operation of the UAV may be restricted according to a set of flight controls whether or not the UAV is certified, but the set of flight controls may apply different rules based on whether or not the UAV is certified. In some cases, the rules may be more restrictive if the UAV is not authenticated. In general, non-certification of the unmanned aerial vehicle may result in a smaller degree of freedom for an operator of the unmanned aerial vehicle to control the unmanned aerial vehicle with respect to any aspect of the unmanned aerial vehicle (e.g., flight, payload operation or positioning, carrier, sensors, communications, navigation, power usage, or any other aspect).
Examples of the types of unmanned aerial vehicle flight capabilities that may be limited may include one or more of the following, or may include other types of limitations for unmanned aerial vehicles as described elsewhere herein. For example, the distance that the UAV flies may be limited, e.g., it must be within the user's visual range. The altitude and/or speed of flight may be limited. Optionally, a device carried by the unmanned aerial vehicle (such as a camera or other type of payload) may be required to temporarily suspend operation.
Different restrictions may be imposed according to the user's level. For example, for a more experienced user, fewer restrictions may be imposed. For example, a user with a higher level of experience or skill may be allowed to perform functions that a novice user may not be allowed to perform. A user with a higher level of experience or skill may be allowed to fly in areas or locations where novice users may not be allowed to fly. As described elsewhere herein, a set of flight controls for an unmanned aerial vehicle may be customized for a user type and/or an unmanned aerial vehicle type.
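As a hedged illustration of the preceding paragraphs, the sketch below selects a set of flight controls from the certification status and the user's level. The concrete limit values and level names are hypothetical assumptions, not values from the disclosure.

```python
# Illustrative selection of flight controls based on authentication status
# and user level. Limit values and level names are hypothetical.
from dataclasses import dataclass
from typing import Optional

@dataclass
class FlightControls:
    max_distance_m: Optional[float]   # None means no distance limit
    max_altitude_m: Optional[float]
    max_speed_mps: Optional[float]
    payload_suspended: bool           # e.g., camera temporarily disabled

def select_flight_controls(authenticated: bool, user_level: str) -> FlightControls:
    if not authenticated:
        # Unauthenticated flight: restrictive rules, e.g., keep within visual range.
        return FlightControls(max_distance_m=500, max_altitude_m=50,
                              max_speed_mps=5, payload_suspended=True)
    if user_level == "novice":
        return FlightControls(max_distance_m=2000, max_altitude_m=120,
                              max_speed_mps=10, payload_suspended=False)
    # Experienced or professional users: fewer restrictions.
    return FlightControls(max_distance_m=None, max_altitude_m=120,
                          max_speed_mps=20, payload_suspended=False)

print(select_flight_controls(authenticated=False, user_level="expert"))
```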
The unmanned aerial vehicle may be in communication with an authentication system. In some examples, the authentication system may have one or more characteristics as described elsewhere herein (e.g., fig. 2). The unmanned aerial vehicle may be in communication with an air traffic system of the authentication system. Any description herein of communication between the unmanned aerial vehicle and the air management system may apply to any communication between the unmanned aerial vehicle and any other portion of the authentication system. Any description herein of communication between the unmanned aerial vehicle and the air traffic control system may apply to communication between the unmanned aerial vehicle and any other external device or system that may contribute to unmanned aerial vehicle flight safety, security, or regulatory compliance.
The unmanned aerial vehicle can communicate with the air traffic control system in any manner. For example, the unmanned aerial vehicle may form a direct communication channel with the air traffic control system. Examples of direct communication channels may include radio connections, WiFi, WiMax, infrared, Bluetooth, or any other type of direct communication. The unmanned aerial vehicle may form an indirect communication channel with the air traffic control system. The communication may be relayed via one or more intermediary devices. In one example, the communication may be relayed via a user and/or a user device (such as a remote control). Alternatively or additionally, the communication may be relayed via one or more other unmanned aerial vehicles. The communication may be relayed via a ground station, router, tower, or satellite. The unmanned aerial vehicle can communicate using a single mode or multiple modes as described herein. Any communication means may be combined. In some cases, different communication modes may be used simultaneously. Alternatively or additionally, the unmanned aerial vehicle may switch between different communication modes.
After the unmanned aerial vehicle (possibly in conjunction with the user) has mutually authenticated with the authentication center (or any part of the authentication system), a secure communication connection with the air traffic control system may be obtained. A secure communication connection between the unmanned aerial vehicle and the user may also be obtained. The unmanned aerial vehicle can communicate directly with an air condition monitoring server of the air management system and/or one or more geo-fencing devices. The unmanned aerial vehicle may also communicate with the user, and communications may be relayed via the user to reach the authentication center or the air traffic system. In some embodiments, the direct communication with the air condition monitoring server and/or the one or more geo-fencing devices may occur only after authentication of the unmanned aerial vehicle and/or the user occurs. Alternatively, direct communication may occur even if authentication has not occurred. In some implementations, the communication between the UAV and the user may occur only after authentication of the UAV and/or the user. Alternatively, communication between the UAV and the user may occur even though authentication of the UAV and/or the user has not occurred.
In some embodiments, the flight plan of the unmanned aerial vehicle may be pre-registered with the air traffic control system. For example, the user may need to specify a planned location and/or timing of the flight. The air traffic control system may be capable of determining whether to allow the unmanned aerial vehicle to fly according to the flight plan. The flight plan may be accurate or may be an approximate estimate. During flight, the unmanned aerial vehicle may be autonomously controlled or semi-autonomously controlled to fly according to the flight plan. Alternatively, the user may have discretion to manually control the unmanned aerial vehicle, but should remain within the estimates of the flight plan. In some cases, the flight of the UAV may be monitored, and if the manual control deviates too much from the proposed flight plan, the UAV may be forced to take flight response measures. Flight response measures may include forcing the UAV back onto the planned route (e.g., having a computer or another individual take over the flight), forcing the UAV to land, forcing the UAV to hover, or forcing the UAV to fly back to its origin. In some cases, the air management system may determine whether to allow the UAV to fly according to the flight plan based on the flight plans of other UAVs, the currently monitored air conditions, environmental conditions, any flight restrictions for the region and/or time, or any other factors. In an alternative embodiment, pre-registration of the flight plan may not be required.
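One possible way to monitor manual flight against a pre-registered flight plan and trigger a flight response measure is sketched below. The plan representation, deviation tolerance, and choice of response are assumptions for illustration, not the disclosure's actual monitoring logic.

```python
# Sketch of deviation monitoring against a pre-registered flight plan.
import math
from typing import List, Tuple

Waypoint = Tuple[float, float, float]  # (x, y, z) in meters, local frame

def distance_to_plan(position: Waypoint, plan: List[Waypoint]) -> float:
    """Distance from the current position to the nearest planned waypoint."""
    return min(math.dist(position, wp) for wp in plan)

def check_deviation(position: Waypoint, plan: List[Waypoint],
                    tolerance_m: float = 100.0) -> str:
    """Return a flight response measure, or 'continue' if within tolerance."""
    if distance_to_plan(position, plan) <= tolerance_m:
        return "continue"
    # Possible responses named above: force back onto the route, force landing,
    # force hover, or fly back to the origin. Here one is picked as an example.
    return "force_back_onto_route"

plan = [(0, 0, 30), (100, 0, 30), (200, 50, 30)]
print(check_deviation((150, 400, 30), plan))  # large deviation -> response measure
```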
After establishing the secure link, the unmanned aerial vehicle may apply for resources (e.g., air routes and airtimes, or any other resources described elsewhere herein) from the traffic management module of the air management system. The traffic management module may manage traffic rights. The UAV may accept a set of flight controls (e.g., distance, altitude, speed, or any other type of flight control described elsewhere herein) for flight. The unmanned aerial vehicle can take off only when the flight permission is acquired. Flight plans may be documented in the traffic management module.
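A minimal sketch of this resource application step follows, assuming a simple traffic management interface that grants a requested route and time window only when it does not conflict with an already granted reservation. The interface and the conflict rule are assumptions, not the actual module.

```python
# Sketch of a pre-takeoff resource application to a traffic management module.
from dataclasses import dataclass, field
from datetime import datetime, timedelta
from typing import List, Tuple

@dataclass
class ResourceRequest:
    uav_id: str
    route: List[Tuple[float, float]]      # planned (lat, lon) waypoints
    window: Tuple[datetime, datetime]     # requested flight time window

@dataclass
class TrafficManagementModule:
    granted: List[ResourceRequest] = field(default_factory=list)

    def apply(self, request: ResourceRequest) -> bool:
        """Grant the request unless its time window overlaps a granted one."""
        for other in self.granted:
            if request.window[0] < other.window[1] and other.window[0] < request.window[1]:
                return False  # conflict: deny, UAV may not take off
        self.granted.append(request)
        return True

tm = TrafficManagementModule()
start = datetime(2020, 10, 13, 15, 0)
req = ResourceRequest("UAV-001", [(22.54, 114.05), (22.55, 114.06)],
                      (start, start + timedelta(hours=1)))
print(tm.apply(req))  # True -> takeoff permitted
```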
During flight, the unmanned aerial vehicle may periodically report its status to the air condition monitoring subsystem of the air management system. Any technique may be used to communicate the status of the UAV to the air condition monitoring subsystem. Direct communication or indirect communication, such as those described elsewhere herein, may be used. External sensor data may or may not be used in determining the unmanned aerial vehicle status and communicating the status information to the air condition monitoring subsystem. In some examples, the unmanned aerial vehicle status information may be broadcast or may be relayed to the traffic management subsystem by a ground station or other intermediate device. The unmanned aerial vehicle may be supervised by the traffic management subsystem. The traffic management subsystem may communicate with the unmanned aerial vehicle using direct or indirect communication methods, such as those described elsewhere herein. If the scheduled flight is to be modified, the unmanned aerial vehicle may submit an application to the traffic management subsystem. The application may be filed prior to the start of flight of the unmanned aerial vehicle, or may occur after the unmanned aerial vehicle has started flight. The application can be made while the unmanned aerial vehicle is in flight. The traffic management subsystem may have the ability to monitor the flight of the unmanned aerial vehicle. The traffic management subsystem may monitor the flight of the unmanned aerial vehicle based on information from the unmanned aerial vehicle and/or information from one or more sensors external to the unmanned aerial vehicle.
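A simple sketch of periodic status reporting is shown below; the report fields, interval, and transport abstraction are assumptions, and in practice the report could be sent directly, broadcast, or relayed through a ground station as described above.

```python
# Sketch of periodic in-flight status reporting. Fields and interval are
# illustrative; `send` abstracts whatever communication link is used.
import json
import time

def build_status_report(uav_id: str, position, velocity, battery_pct: float) -> str:
    return json.dumps({
        "uav_id": uav_id,
        "timestamp": time.time(),
        "position": position,        # e.g., (lat, lon, alt)
        "velocity": velocity,        # e.g., (vx, vy, vz) in m/s
        "battery_pct": battery_pct,
    })

def report_loop(send, get_state, interval_s: float = 1.0, max_reports: int = 3):
    """Periodically build and send a status report over the provided link."""
    for _ in range(max_reports):
        send(build_status_report(*get_state()))
        time.sleep(interval_s)

# Example with a stubbed link and state source.
report_loop(send=print,
            get_state=lambda: ("UAV-001", (22.54, 114.05, 80.0), (3.0, 0.0, 0.0), 76.5),
            interval_s=0.01)
```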
During flight, the UAV may communicate with other devices, including but not limited to other UAVs or geofencing devices. During flight, the unmanned aerial vehicle may also be capable of authentication (including, but not limited to, digital signature + digital certificate) and/or response (e.g., in response to an authenticated geo-fence device).
During flight, as further detailed elsewhere herein, the unmanned aerial vehicle may accept takeover control from a higher-level user (such as an air traffic system or electronic police). If the higher-level user's privilege is authenticated by the UAV, the higher-level user may take over control.
After the flight, the unmanned aerial vehicle may release the requested resource. The requested resource may also be released if the response to the traffic management subsystem times out. For example, the resource may be the location and/or timing of the planned unmanned aerial vehicle flight. When the flight is over, the unmanned aerial vehicle may send a signal to the traffic management subsystem to release the resource. Alternatively, the traffic management subsystem may self-initiate the release of the resource. The traffic management subsystem may self-initiate the release of the resource if the unmanned aerial vehicle ceases to communicate with the traffic management subsystem for a predetermined period of time. In some cases, if the application period ends (e.g., if the resource is reserved for the unmanned aerial vehicle for a 3:00-4:00 PM mission and 4:00 PM has passed), the traffic management subsystem may self-initiate the release of the resource.
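The release conditions described above (explicit release, response timeout, or expiry of the application period) could be combined as in the following sketch; the timeout value and data fields are illustrative assumptions.

```python
# Sketch of resource release: explicit release by the UAV, or self-initiated
# release by the traffic management subsystem on timeout or period end.
from dataclasses import dataclass

@dataclass
class Reservation:
    uav_id: str
    end_time: float          # reserved period end (epoch seconds)
    last_heartbeat: float    # last time the UAV communicated (epoch seconds)
    released: bool = False

HEARTBEAT_TIMEOUT_S = 120.0  # hypothetical silence threshold

def maybe_release(res: Reservation, now: float, uav_signaled_release: bool) -> bool:
    """Release the reserved resource when any release condition holds."""
    if res.released:
        return True
    if (uav_signaled_release                                   # UAV reports flight is over
            or now > res.end_time                              # reserved window has passed
            or now - res.last_heartbeat > HEARTBEAT_TIMEOUT_S):  # response timeout
        res.released = True
    return res.released

res = Reservation("UAV-001", end_time=16 * 3600, last_heartbeat=15.5 * 3600)
print(maybe_release(res, now=16.1 * 3600, uav_signaled_release=False))  # True
```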
The unmanned aerial vehicle may respond to authentication requests and/or identity check requests. The request may come from the authentication system. In some cases, the request may come from the air condition monitoring server. The request may occur when the UAV is powered on. The request may occur when the unmanned aerial vehicle makes a request for a resource. The request may come prior to flight of the unmanned aerial vehicle. Alternatively, the request may come during the flight of the unmanned aerial vehicle. In some embodiments, a security-enabled UAV will respond to authentication requests and/or identity check requests from the air condition monitoring server. In some implementations, the response may be made in any case (including the case of an authentication failure).
During flight, if communication between the unmanned aerial vehicle and the air management system is interrupted and/or lost, the unmanned aerial vehicle may quickly revert to a flight state with relatively limited rights and may promptly return. Thus, if communication between the unmanned aerial vehicle and the air traffic system is interrupted, flight response measures may be taken. In some cases, the flight response measure may be to cause the unmanned aerial vehicle to automatically fly back to the origin. The flight response measure may be to cause the unmanned aerial vehicle to automatically fly to a location of a user of the unmanned aerial vehicle. The flight response measure may be to automatically return the UAV to a homing position, which may or may not be the starting point of the UAV flight. The flight response measure may be an automatic descent. The flight response measure may be to automatically enter an autonomous flight mode in which the unmanned aerial vehicle flies according to a pre-registered flight plan.
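As a hedged illustration, the sketch below shows an onboard check that detects an interrupted link to the air traffic control system and selects one of the flight response measures listed above; the timeout value and the default policy are assumptions.

```python
# Sketch of an onboard link-loss check that triggers a flight response measure.
LINK_TIMEOUT_S = 10.0  # hypothetical: link considered lost after this silence

RESPONSES = ("return_to_origin", "fly_to_user", "return_to_home_point",
             "descend", "fly_preregistered_plan")

def on_link_check(seconds_since_last_message: float,
                  configured_response: str = "return_to_origin") -> str:
    """Return the action to take; 'continue' while the link is considered alive."""
    if seconds_since_last_message <= LINK_TIMEOUT_S:
        return "continue"
    if configured_response not in RESPONSES:
        configured_response = "descend"   # conservative fallback
    return configured_response

print(on_link_check(3.2))    # continue
print(on_link_check(42.0))   # return_to_origin
```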
User
The user may be an operator of the unmanned aerial vehicle. The users may be classified according to user types. In one example, users may be classified according to their skill and/or experience level. The authentication system may issue identification information for the user. For example, the authentication system may be responsible for issuing a certificate to a user and assigning a corresponding user identifier and/or user key. In some cases, the ID registration database may perform one or more functions. For example, the ID registration database may provide a user identifier and/or a user key.
The user may be authenticated. User authentication may occur using any technique known in the art or later developed. The user authentication technique may be similar to or different from the unmanned aerial vehicle authentication technique.
In one example, a user may be authenticated based on information provided by the user. The user may be authenticated based on knowledge that the user may have. In some cases, the knowledge may be known only to the user and not to other users. For example, the user may be authenticated by providing the correct username and password. The user may be authenticated by submitting a password, a typing or swiping pattern, a signature, or any other type of information. The user may be authenticated by responding correctly to one or more challenges of the system. In some embodiments, the user may apply for a login name and/or password from the authentication center. The user may be able to log in using the login name and password.
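The disclosure only states that a correct username and password are provided; one conventional way an authentication center might verify this (salted password hashing) is sketched below as an assumption, not as the mechanism of the disclosure.

```python
# One conventional username/password verification scheme: store a salted hash
# at registration and compare hashes at login. Standard practice, shown here
# only as an illustration.
import hashlib
import hmac
import os

_users = {}  # username -> (salt, password_hash)

def register(username: str, password: str) -> None:
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    _users[username] = (salt, digest)

def login(username: str, password: str) -> bool:
    if username not in _users:
        return False
    salt, stored = _users[username]
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return hmac.compare_digest(candidate, stored)

register("operator01", "correct horse battery staple")
print(login("operator01", "correct horse battery staple"))  # True
print(login("operator01", "wrong password"))                # False
```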
The user may be authenticated based on physical characteristics of the user. Biometric information about the user may be used to authenticate the user. For example, a user may be authenticated by submitting biometric information. For example, the user may undergo a fingerprint scan, palm scan, iris scan, retina scan, or scan of any other portion of the user's body. The user may provide a body sample, such as saliva, blood, clipped nails, or clipped hair, that may be analyzed to identify the user. In some cases, DNA analysis of a sample from the user may occur. The user may be authenticated by undergoing face recognition or gait recognition. The user may be authenticated by submitting a voiceprint. The user may submit the user's height and/or weight for analysis.
The user may be authenticated based on devices that may be owned by the user. The user may be authenticated based on a memory unit that may be owned by the user and/or information on the memory unit. For example, a user may have a memory device issued by a certification authority, other parts of a certification system, or any other source. The memory device may be an external memory device such as a U-disk (e.g., USB drive), an external hard drive, or any other type of memory device. In some embodiments, the external device may be coupled to a user remote control. For example, an external device such as a U-disk may be physically connected to the remote control (e.g., embedded in or plugged into it), or may communicate with the remote control (e.g., transmit signals that may be received by the remote control). The device may be a physical memory storage device.
The user may be authenticated based on information that may be stored in memory that may be owned by the user. A separate physical memory device may or may not be used. For example, a token (such as a digitized token) may be owned by a user. The digitized token may be stored on a U-disk, hard drive, or other form of memory. The digitized token may be stored on a memory of the remote control. For example, the digitized token may be received from an authentication center, an ID registration database, or any other source via a remote control. In some embodiments, the digitized token may be externally unreadable from the memory of the remote control. The digitized token may or may not be alterable on the memory of the remote control.
The user may be authenticated by means of an identification module that may be provided on the remote control. The identification module may be associated with a user. The identification module may include a user identifier. In some implementations, the identification module can include user key information. The data stored in the identification module may or may not be externally readable. The data stored in the identification module may optionally not be modifiable. The identification module may optionally not be detachable from the remote control. The identification module optionally cannot be removed from the remote control without damaging the remote control. The identification module may optionally be integrated in the remote control. The information in the identification module may be recorded with the authentication center. For example, the authentication center may keep a record of information from the identification module of the remote control. In one example, the user identifier and/or user key may be recorded by the authentication center.
The user may be authenticated by undergoing a mutual authentication process. In some cases, the mutual authentication procedure may be similar to an Authentication and Key Agreement (AKA) procedure. The user may be authenticated by means of a key on a user terminal used by the user to communicate with the unmanned aerial vehicle. The terminal may optionally be a remote control that may send one or more command signals to the UAV. The terminal may be a display device that may show information based on data received from the unmanned aerial vehicle. The key may be part of the identification module of the user terminal and may be integrated into the user terminal. The key may be part of the identification module of the remote control and may be integrated into the remote control. The key may be provided by the authentication system (e.g., an ID registration database of the authentication system). Further examples and details of mutual authentication of users may be provided in more detail elsewhere herein.
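A greatly simplified sketch of a mutual challenge-response exchange in the spirit of an AKA-style procedure is given below. A real AKA procedure additionally derives session keys and uses sequence numbers; the HMAC-based exchange here is an assumption used only to illustrate mutual proof of key possession between the user terminal's identification module and the authentication center.

```python
# Simplified mutual challenge-response using a shared key (illustrative only;
# not the full AKA protocol).
import hashlib
import hmac
import os

def respond(shared_key: bytes, challenge: bytes) -> bytes:
    return hmac.new(shared_key, challenge, hashlib.sha256).digest()

def mutual_authenticate(terminal_key: bytes, center_key: bytes) -> bool:
    # The authentication center challenges the terminal.
    center_challenge = os.urandom(16)
    terminal_response = respond(terminal_key, center_challenge)
    terminal_ok = hmac.compare_digest(
        terminal_response, respond(center_key, center_challenge))
    # The terminal challenges the authentication center.
    terminal_challenge = os.urandom(16)
    center_response = respond(center_key, terminal_challenge)
    center_ok = hmac.compare_digest(
        center_response, respond(terminal_key, terminal_challenge))
    return terminal_ok and center_ok

key = os.urandom(32)                             # provisioned in the identification module
print(mutual_authenticate(key, key))             # True: both sides hold the same key
print(mutual_authenticate(key, os.urandom(32)))  # False: keys do not match
```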
In some implementations, the user may need to have software or an application to operate the unmanned aerial vehicle. The software and applications themselves may be authorized as part of the user authentication process. In one example, a user may have a smartphone app that may be used to operate an unmanned aerial vehicle. The smart phone app itself may be directly authorized. When the smartphone app used by the user is authenticated, further user authentication may or may not be used. In some cases, the smartphone authorization may be coupled with additional user authentication steps detailed elsewhere herein. In some cases, smartphone app authorization may be sufficient to authenticate the user.
The user may be authenticated by means of an authentication system. In some cases, the user may be authenticated by an authentication center of the authentication system (e.g., authentication center 220 as illustrated in fig. 2) or any other component of the authentication system.
User authentication may occur at any point in time. In some embodiments, user authentication may occur automatically when the UAV is turned on. User authentication may occur automatically when the remote control is turned on. User authentication may occur when the remote control forms a communication channel with the UAV. User authentication may occur when the remote control and/or the UAV forms a communication channel with an authentication system. User authentication may occur in response to input from a user. User authentication may occur, for example, when a user attempts to log in or provides information about the user (e.g., username, password, biometric information). In another example, user authentication may occur when information from a memory device (e.g., a U-disk) is provided to an authentication system, or when information (e.g., a digitized token or key) is provided to an authentication system. The authentication process can be initiated by the user or the user device. In another example, user authentication may occur when authentication is requested from an authentication system or another external source. The authentication center of the authentication system or the air traffic system may request authentication of the user. The authentication center or the air traffic system may request authentication from the user one or more times. Authentication may occur prior to and/or during flight of the unmanned aerial vehicle. In some cases, the user may be authenticated prior to performing the flight mission using the unmanned aerial vehicle. The user may be authenticated before the flight plan may be approved. The user may be authenticated before the user is able to exercise control over the UAV. The user may be authenticated prior to the UAV being allowed to take off. The user may be authenticated after losing and/or reestablishing connection with the authentication system. The user may be authenticated when one or more events or conditions (e.g., unusual UAV flight patterns) are detected. The user may be authenticated when a suspected unauthorized takeover of the unmanned aerial vehicle occurs. The user may be authenticated when suspected interference with the communications of the UAV occurs. The user may be authenticated when the UAV deviates from the desired flight plan.
Similarly, unmanned aerial vehicle authentication may occur at any time, such as the times mentioned above for user authentication. The user and unmanned aerial vehicle authentication can occur substantially simultaneously (e.g., within 5 minutes or less, 4 minutes or less, 3 minutes or less, 2 minutes or less, 1 minute or less, 30 seconds or less, 15 seconds or less, 10 seconds or less, 5 seconds or less, 3 seconds or less, 1 second or less, 0.5 seconds or less, or 0.1 seconds or less of each other). User and unmanned aerial vehicle authentication may occur in similar conditions or scenarios. Alternatively, they may occur at different times and/or in response to different conditions or scenarios.
Information about the user may be collected after the user is authenticated. The information about the user may include any of the information described elsewhere herein. For example, the information may include a user type. The user type may include a skill and/or experience level of the user. The information may include past flight data of the user.
Authentication center
An authentication system may be provided according to an embodiment of the present invention. The authentication system may include an authentication center. Any description of the authentication center herein may apply to any component of the authentication system. Any description herein of the authentication system may apply to an external device or entity, or one or more functions of the authentication system may be performed on the UAV and/or on a remote control.
The certification center may be responsible for maintaining data about one or more users and/or unmanned aerial vehicles. The data may include an associated user identifier, an associated user key, an associated UAV identifier, and/or an associated UAV key. The authentication center may receive an identity of the user and/or an identity of the UAV. In some implementations, the authentication system may be responsible for maintaining all data about one or more users and the unmanned aerial vehicle. Alternatively, the authentication system may be responsible for maintaining a subset of all data about one or more users and the unmanned aerial vehicle.
The controller of the unmanned aerial vehicle (e.g., the user's remote control) and the unmanned aerial vehicle may issue a login request to the air management system. The controller and/or the UAV may issue a login request prior to flight of the UAV. The controller and/or the UAV may issue a login request prior to allowing flight of the UAV. The controller and/or the unmanned aerial vehicle may issue a login request when the controller and/or the unmanned aerial vehicle is turned on. The controller and/or the unmanned aerial vehicle may issue a login request when a connection is established between the controller and the unmanned aerial vehicle, or when a connection is established between the controller and an external device, or when a connection is established between the unmanned aerial vehicle and an external device. The controller and/or the UAV may issue a login request in response to a detected event or condition. The controller and/or the unmanned aerial vehicle may issue a login request when providing instructions for authentication. The controller and/or the unmanned aerial vehicle may issue a login request when instructions for authentication are provided by an external source (e.g., an authentication center). The controller and/or the unmanned aerial vehicle may initiate the login request, or the login request may be provided in response to an initiation from outside the controller and/or the unmanned aerial vehicle. The controller and/or the UAV may make a login request at a single point in time during the UAV session. Alternatively, the controller and/or the UAV may make login requests at multiple points in time during the UAV session.
The controller and the unmanned aerial vehicle can issue the login request substantially simultaneously (e.g., within less than 5 minutes, less than 3 minutes, less than 2 minutes, less than 1 minute, less than 30 seconds, less than 15 seconds, less than 10 seconds, less than 5 seconds, less than 3 seconds, less than 1 second, less than 0.5 seconds, or less than 0.1 seconds of each other). Alternatively, the controller and the unmanned aerial vehicle may make login requests at different times. The controller and the UAV may make a login request based on detection of the same event or condition. For example, when a connection is established between the controller and the unmanned aerial vehicle, both the controller and the unmanned aerial vehicle may make a login request. Alternatively, the controller and the UAV may make login requests based on different events or conditions. Therefore, the controller and the unmanned aerial vehicle can make a login request independently of each other. For example, the controller may make a login request when the controller is powered on, and the UAV may make a login request when the UAV is powered on. These events may occur at times independent of each other.
Any description of a login request may be applicable to any type of authentication as described elsewhere herein. For example, any description of a login request may be applicable to providing a username and password. In another example, any description of the login request may apply to the initiation of the AKA protocol. In another example, any description of the login request may include a provision of a physical characteristic of the user. The login request may be to initiate an authentication procedure or to request authentication.
After receiving a login request from a user of the unmanned aerial vehicle and/or from the unmanned aerial vehicle, the authentication system may initiate an authentication process. In some cases, the air traffic control system may receive the login request and may initiate the authentication process using the authentication center. Alternatively, the authentication center may receive the login request and initiate the authentication process itself. The login request information may be transmitted to the authentication center, and the authentication center may authenticate the identity information. In some cases, the login request information may include a username and/or password. In some implementations, the login information may include a user identifier, a user key, an unmanned aerial vehicle identifier, and/or an unmanned aerial vehicle key.
The communication connection between the air traffic control system and the authentication center may be secure and reliable. Optionally, the air traffic control system and the authentication center may utilize one or more sets of the same processors and/or memory storage units. Alternatively, they may not utilize the same processor and/or memory storage unit or units. The air traffic control system and the authentication center may or may not utilize the same set of hardware. The air traffic control system and the authentication center may or may not be provided at the same location. In some cases, a hardwired connection may be provided between the air traffic control system and the authentication center. Alternatively, wireless communication may be provided between the air traffic control system and the authentication center. Direct communication may be provided between the air traffic control system and the authentication center. Alternatively, indirect communication may be provided between the air traffic control system and the authentication center. The communication between the air traffic control system and the authentication center may or may not traverse a network. The communication connection between the air traffic control system and the authentication center may be encrypted.
After authentication of the user and/or the unmanned aerial vehicle at the authentication center, a communication connection is established between the unmanned aerial vehicle and the air traffic control system. In some implementations, authentication of both the user and the unmanned aerial vehicle may be required. Alternatively, authentication of the user or authentication of the unmanned aerial vehicle may be sufficient. A communication connection may optionally be established between the remote control and the air traffic control system. Alternatively or additionally, a communication connection may be established between the remote control and the unmanned aerial vehicle. Further authentication may or may not occur after a communication connection, such as the one described herein, has been established.
The unmanned aerial vehicle may communicate with the air traffic system via a direct communication channel. Alternatively, the unmanned aerial vehicle may communicate with the air traffic system via an indirect communication channel. The unmanned aerial vehicle may communicate with the air traffic system by relaying through a user or a remote control operated by the user. The unmanned aerial vehicle may communicate with the air traffic system by relaying through one or more other unmanned aerial vehicles. Any other type of communication may be provided, such as those described elsewhere herein.
The user's remote control may communicate with the air traffic system via a direct communication channel. Alternatively, the remote control may communicate with the air traffic system via an indirect communication channel. The remote control may then communicate with the air traffic system by relaying via an unmanned aerial vehicle operated by the user. The remote control may communicate with the air traffic system by relaying through one or more other unmanned aerial vehicles. Any other type of communication may be provided, such as those described elsewhere herein. In some cases, there is no need to provide a communication connection between the remote control and the air traffic system. In some cases, a communication connection between the remote control and the unmanned aerial vehicle may be sufficient. Any type of communication may be provided between the remote control and the UAV, such as those described elsewhere herein.
After authenticating the user and/or the unmanned aerial vehicle, the unmanned aerial vehicle may be allowed to apply for resources from a traffic management module of the air management system. In some implementations, authentication of both the user and the unmanned aerial vehicle may be required. Alternatively, authentication of the user or authentication of the unmanned aerial vehicle may be sufficient.
The resources may include air routes and/or airtimes. The resources may be used according to a flight plan. The resources may include one or more of the following: sensing and avoidance assistance, access to one or more geo-fencing devices, access to battery stations, access to fuel stations, or access to base stations and/or docks. Any other resources as described elsewhere herein may be provided.
The traffic management module of the air management system may record one or more flight plans of the unmanned aerial vehicle. The unmanned aerial vehicle may be allowed to apply for a modification to the scheduled flight of the unmanned aerial vehicle. The unmanned aerial vehicle may be allowed to modify a flight plan of the unmanned aerial vehicle prior to initiating the flight plan. The unmanned aerial vehicle may be allowed to modify the flight plan while the unmanned aerial vehicle is executing the flight plan. The traffic management module may make a determination whether to allow the unmanned aerial vehicle to make the requested modification. If the unmanned aerial vehicle is allowed to make the requested modification, the flight plan may be updated to include the requested modification. If the unmanned aerial vehicle is not allowed to make the requested modification, the flight plan may not be changed. The unmanned aerial vehicle may be required to comply with the original flight plan. If the UAV deviates significantly from the flight plan (whether initially or updated), flight response measures may be applied to the UAV.
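The modification workflow described above can be sketched as follows; the acceptance check (a hypothetical altitude cap) stands in for whatever checks the traffic management module actually applies, and the data layout is an assumption.

```python
# Sketch of handling a requested flight plan modification: accept and record
# the new plan, or reject it and keep the original plan binding.
flight_plans = {"UAV-001": {"max_altitude_m": 80, "route": ["A", "B", "C"]}}

def request_modification(uav_id: str, proposed_plan: dict) -> bool:
    if proposed_plan.get("max_altitude_m", 0) > 120:   # hypothetical acceptance rule
        return False                                    # rejected: original plan stands
    flight_plans[uav_id] = proposed_plan                # accepted: record updated plan
    return True

print(request_modification("UAV-001",
                           {"max_altitude_m": 100, "route": ["A", "B", "D"]}))  # True
print(flight_plans["UAV-001"]["route"])  # ['A', 'B', 'D']
```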
In some implementations, after authenticating the user and/or the unmanned aerial vehicle at the authentication center, a communication connection may be established between the unmanned aerial vehicle and the one or more geo-fence devices. Further details regarding the geo-fencing device are provided elsewhere herein.
After authenticating the user and/or the UAV at the authentication center, a communication connection may be established between the UAV and one or more authenticated intermediaries. The certified intermediary may be another certified unmanned aerial vehicle or a certified geo-fencing device. The authenticated intermediary may be a base station or a station or device that may relay communications. The authenticated intermediary may undergo any type of authentication process, such as those described elsewhere herein. For example, the authenticated intermediary may pass authentication using an AKA procedure.
A determination may be made whether the user is authorized to operate the unmanned aerial vehicle. The determination may be made prior to authenticating the user and/or the unmanned aerial vehicle, while authenticating the user and/or the unmanned aerial vehicle, or after authenticating the user and/or the unmanned aerial vehicle. If the user is not authorized to operate the UAV, the user may not be allowed to operate the UAV. If the user is not authorized to operate the unmanned aerial vehicle, the user may only be able to operate the unmanned aerial vehicle in a limited manner. When the user is not authorized to operate the unmanned aerial vehicle, the user may only be allowed to operate the unmanned aerial vehicle at a selected location. One or more flight controls, such as those described elsewhere herein, may be imposed on a user that is not authorized to operate the unmanned aerial vehicle. In some implementations, the set of flight restrictions imposed on the user when the user is not authorized to operate the UAV may be more restrictive or stricter than the restrictions that may be imposed on the user when the user is authorized to operate the UAV. A set of flight restrictions may or may not be imposed on a user when the user is authorized to operate the unmanned aerial vehicle. When a user is authorized to operate the UAV, the set of flight restrictions imposed on the user may be an empty set (i.e., no restrictions). When the user is authorized to operate the unmanned aerial vehicle, the user may be able to operate the unmanned aerial vehicle in an unrestricted manner. Alternatively, some limitations may be imposed, but these may be less stringent or may be different from the limitations that may be imposed on the user when the user is not authorized to operate the UAV.
A set of flight controls may depend on the identity of the unmanned aerial vehicle and/or the identity of the user. In some cases, the set of flight controls may be changed based on the identity of the UAV and/or the identity of the user. The limits on the flight of the unmanned aerial vehicle may be adjusted or maintained based on the identity of the unmanned aerial vehicle. The limits on the flight of the unmanned aerial vehicle may be adjusted or maintained based on the identity of the user. In some embodiments, a set of default limits on the flight of the UAV may be provided. The default value may be in place prior to authenticating and/or identifying the UAV. The default value may be in place prior to authenticating and/or identifying the user. The default value may be maintained or adjusted based on an authenticated identity of the user and/or the UAV. In some cases, the default value may be adjusted to a less restrictive set of flight restrictions. In other cases, the default values may be adjusted to a more restrictive set of flight restrictions.
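A sketch of maintaining default flight restrictions and adjusting them once an identity has been authenticated is given below; the identity categories and numeric limits are illustrative assumptions.

```python
# Sketch: default restrictions apply until an identity is authenticated, after
# which they may be relaxed, tightened, or maintained.
from typing import Optional

DEFAULT_RESTRICTIONS = {"max_altitude_m": 50, "max_distance_m": 500, "max_speed_mps": 5}

def adjust_restrictions(default: dict, authenticated_identity: Optional[str]) -> dict:
    if authenticated_identity is None:
        return dict(default)                       # keep defaults until identified
    if authenticated_identity == "licensed_professional":
        return {"max_altitude_m": 120, "max_distance_m": 5000, "max_speed_mps": 20}
    if authenticated_identity == "flagged_operator":
        return {"max_altitude_m": 30, "max_distance_m": 200, "max_speed_mps": 3}
    return dict(default)                           # other identities: keep defaults

print(adjust_restrictions(DEFAULT_RESTRICTIONS, None))
print(adjust_restrictions(DEFAULT_RESTRICTIONS, "licensed_professional"))
```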
In some implementations, if a user and/or an unmanned aerial vehicle is identified and authenticated, the user may be authorized to operate the unmanned aerial vehicle. In some cases, even if a user and/or an unmanned aerial vehicle is identified and authenticated, the user may not be authorized to operate the unmanned aerial vehicle. Whether the user is authorized to operate the unmanned aerial vehicle may be independent of whether the user and/or the unmanned aerial vehicle is authenticated. In some cases, identification and/or authentication of the user and/or the unmanned aerial vehicle may occur before making a determination whether the user is authorized to operate the unmanned aerial vehicle.
In some cases, only a single user is authorized to operate the unmanned aerial vehicle. Alternatively, multiple users may be authorized to operate the unmanned aerial vehicle.
The unmanned aerial vehicle may be authenticated prior to allowing the unmanned aerial vehicle to take off. The user may be authenticated prior to allowing the UAV to take off. The user may be authenticated prior to allowing the user to exercise control of the UAV. The user may be authenticated prior to allowing the user to send one or more operating commands to the UAV via the user remote control.
Degree of authentication
Different degrees of authentication may occur. In some cases, different authentication processes may occur, such as those described elsewhere herein. In some cases, a higher degree of authentication may occur, while in other cases, a lower degree of authentication may occur. In some implementations, the determination may be made as to the extent or type of authentication process to be experienced.
One aspect of the invention provides a method of determining a level of certification for operation of an Unmanned Aerial Vehicle (UAV), the method comprising: receiving context information regarding the UAV; evaluating, using one or more processors, a degree of authentication for the UAV or a user of the UAV based on the contextual information; enabling authentication of the UAV or the user according to the degree of authentication; and allowing the user to operate the UAV when the degree of authentication is complete. Similarly, a non-transitory computer-readable medium containing program instructions for determining a level of authentication for operating an Unmanned Aerial Vehicle (UAV) may be provided, the computer-readable medium comprising: program instructions for receiving contextual information about the UAV; program instructions for evaluating a degree of authentication of the UAV or a user of the UAV based on the contextual information; program instructions for enabling authentication of the UAV or the user based on the degree of authentication; and program instructions for providing a signal that allows the user to operate the UAV when the degree of authentication is complete.
An Unmanned Aerial Vehicle (UAV) authentication system may include: a communication module; and one or more processors operatively coupled to the communication module and individually or collectively configured to: receiving context information regarding the UAV; evaluating a degree of authentication of the unmanned aerial vehicle or a user of the unmanned aerial vehicle based on the contextual information; and enabling authentication of the unmanned aerial vehicle or the user according to the degree of authentication. An Unmanned Aerial Vehicle (UAV) authentication module may be provided, comprising: one or more processors individually or collectively configured to: receiving context information regarding the UAV; evaluating a degree of authentication of the unmanned aerial vehicle or a user of the unmanned aerial vehicle based on the contextual information; and enabling authentication of the unmanned aerial vehicle or the user according to the degree of authentication.
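A minimal sketch mirroring the claimed structure (receive contextual information, evaluate a degree of authentication, enable authentication according to that degree, then allow operation) is shown below. The degree levels, scoring rule, and authentication steps are assumptions introduced for illustration.

```python
# Sketch of the claimed flow: context -> degree of authentication -> enable
# authentication -> allow operation. All values and mappings are hypothetical.
from dataclasses import dataclass

@dataclass
class Context:
    population_density: float
    location_sensitivity: int      # e.g., 0 = open field .. 3 = secure facility
    user_experience_level: int     # e.g., 0 = novice .. 3 = professional

def evaluate_degree(ctx: Context) -> str:
    score = 0
    score += 1 if ctx.population_density > 1000 else 0
    score += ctx.location_sensitivity
    score += 1 if ctx.user_experience_level == 0 else 0
    return "high" if score >= 3 else "medium" if score >= 1 else "low"

def enable_authentication(degree: str) -> bool:
    required = {"low": [], "medium": ["password"],
                "high": ["password", "aka", "biometric"]}[degree]
    # Placeholder: a real system would execute each named procedure; here
    # every required step is simply assumed to pass.
    return all(step is not None for step in required)

ctx = Context(population_density=2500, location_sensitivity=1, user_experience_level=2)
degree = evaluate_degree(ctx)
if enable_authentication(degree):
    print(f"degree={degree}: user allowed to operate the UAV")
```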
A degree of authentication may be provided for the user and/or the UAV. In some cases, the degree of authentication for the user may be variable. Alternatively, the degree of authentication for the user may be fixed. The degree of certification for the unmanned aerial vehicle may be variable. Alternatively, the degree of certification for the unmanned aerial vehicle may be fixed. In some embodiments, the degree of authentication for both the user and the UAV may be variable. Alternatively, the degree of authentication for both the user and the unmanned aerial vehicle may be fixed. Alternatively, the degree of authentication for the user may be variable and the degree of authentication for the unmanned aerial vehicle may be fixed, or the degree of authentication for the user may be fixed and the degree of authentication for the unmanned aerial vehicle may be variable.
The degree of authentication may include requiring no authentication of the user and/or the UAV. For example, the degree of authentication may be zero. Thus, the degree of authentication may include not authenticating the UAV or the user. The degree of authentication may include authenticating the UAV and the user. The degree of authentication may include authenticating the unmanned aerial vehicle without authenticating the user, or may include authenticating the user without authenticating the unmanned aerial vehicle.
The degree of authentication may be selected from a plurality of options for the degree of authentication of the UAV and/or the user. For example, three options for the degree of authentication of the UAV and/or the user may be provided (e.g., high, medium, or low degrees of authentication). Any number of degree-of-authentication options may be provided (e.g., 2 or more, 3 or more, 4 or more, 5 or more, 6 or more, 7 or more, 8 or more, 9 or more, 10 or more, 12 or more, 15 or more, 20 or more, 25 or more options). In some cases, the degree of authentication may be generated and/or determined without selecting from one or more predetermined options. The degree of authentication may be generated on the fly (e.g., during flight).
A higher degree of authentication may provide a higher level of determination that the user is the identified user or that the unmanned aerial vehicle is the identified unmanned aerial vehicle than a lower degree of authentication. A higher degree of authentication may provide a higher level of determination that the user identifier matches the actual user and/or that the unmanned aerial vehicle identifier matches the actual unmanned aerial vehicle than a lower degree of authentication. A higher degree of authentication may be a more stringent authentication process than a lower degree of authentication. A higher degree of authentication may include an authentication process with a lower degree of authentication plus an additional authentication process. For example, a lower degree of authentication may include only a username/password combination, while a higher degree of authentication may include the username/password combination plus the AKA authentication process. Alternatively, a higher degree of authentication may occupy more resources or computational power. Alternatively, a higher degree of authentication may take a greater amount of time.
Any description herein of the degree of authentication may apply to the type of authentication. For example, the type of authentication to use may be selected from a number of different options. The authentication type may or may not indicate a high degree of authentication. Different types of authentication procedures may be selected according to the context information. For example, based on the contextual information, a username/password combination may be used for authentication, or biometric data may be used for authentication. Depending on the context information, the AKA authentication process plus biometric data authentication may occur, or username/password plus biometric sample data authentication may occur. Any description herein of selecting a degree of authentication may also apply to selecting a type of authentication.
Context information may be used to evaluate the degree of authentication. Contextual information may include information about users, unmanned aerial vehicles, remote controls, geo-fencing devices, environmental conditions, geographic conditions, timing conditions, communication or network conditions, task risks (e.g., risks of attempted takeover or interference), or any other type of information that may be relevant to a task. Contextual information may include information provided by a user, a remote control, an unmanned aerial vehicle, a geo-fencing device, an authentication system, an external device (e.g., an external sensor, an external data source), or any other device.
In one example, the context information may include environmental conditions. For example, the contextual information may include an environment within which the unmanned aerial vehicle is to be operated. The environment may be of an environmental type such as a rural, suburban or urban area. A higher degree of certification may be required when the unmanned aerial vehicle is located in an urban area than when the unmanned aerial vehicle is located in a rural area. A higher degree of authentication may be required when the unmanned aerial vehicle is located in a metropolitan area than when the unmanned aerial vehicle is located in a suburban area. A higher degree of certification may be required when the unmanned aerial vehicle is located in a suburban area than when the unmanned aerial vehicle is located in a rural area.
The environmental conditions may include population density of the environment. A higher degree of certification may be required when the unmanned aerial vehicle is located in an environment having a higher population density than when the unmanned aerial vehicle is located in an environment having a lower population density. A higher degree of certification may be required when the unmanned aerial vehicle is located in an environment where population density meets or exceeds a population threshold, and a lower degree of certification may be required when the unmanned aerial vehicle is located in an environment where population does not exceed or fall below the population threshold. Any number of population thresholds may be provided, which may be used to determine the degree of authentication. For example, three population thresholds may be provided, wherein an increased degree of authentication may be required when each threshold is reached and/or exceeded.
The environmental conditions may include a degree of traffic flow within the environment. Traffic flow may include air traffic flow and/or surface-based traffic flow. Surface-based traffic flow may include land vehicles and/or water vehicles in the environment. A higher degree of certification may be required when the unmanned aerial vehicle is located in an environment having a higher degree of traffic flow than when the unmanned aerial vehicle is located in an environment having a lower degree of traffic flow. A higher degree of certification may be required when the unmanned aerial vehicle is located in an environment where the degree of traffic flow meets or exceeds the traffic flow threshold, while a lower degree of certification may be required when the unmanned aerial vehicle is located in an environment where the degree of traffic flow does not exceed or fall below the traffic flow threshold. Any number of traffic flow thresholds may be provided, which may be used to determine the degree of authentication. For example, five traffic flow thresholds may be provided, wherein an increased degree of authentication may be required when each threshold is reached and/or exceeded.
The environmental condition may include an environmental complexity of the environment. The environmental complexity may indicate obstacles and/or potential safety hazards within the environment. The environmental complexity factor may be used to represent the degree to which an obstacle occupies the environment. The environmental complexity factor may be a quantitative measure or a qualitative measure. In some implementations, the environmental complexity factor can be determined based on one or more of: a number of obstacles, a volume or percentage of space occupied by an obstacle within a distance of the UAV, a volume or percentage of space unobstructed by an obstacle within a distance of the UAV, a distance of an obstacle from the UAV, an obstacle density (e.g., number of obstacles per unit of space), a type of obstacle (e.g., stationary or moving), a spatial layout (e.g., position, orientation) of obstacles, a motion (e.g., velocity, acceleration) of an obstacle, etc. For example, an environment with a relatively high density of obstacles would be associated with a high environmental complexity factor (e.g., indoor environment, urban environment), while an environment with a relatively low density of obstacles would be associated with a low environmental complexity factor (e.g., high altitude environment). As another example, an environment in which a large percentage of the space is occupied by an obstacle will have a higher complexity, while an environment with a large percentage of unobstructed space will have a lower complexity. An environment complexity factor may then be computed based on the generated environment representation. The environment complexity factor may be determined based on a three-dimensional digital representation of the environment generated using the sensor data. The three-dimensional digital representation may comprise a three-dimensional point cloud or an occupancy grid. A higher degree of certification may be required when the unmanned aerial vehicle is located in an environment with higher environmental complexity than when the unmanned aerial vehicle is located in an environment with lower environmental complexity. A higher degree of certification may be required when the unmanned aerial vehicle is located in an environment having an environmental complexity level that meets or exceeds an environmental complexity threshold, and a lower degree of certification may be required when the unmanned aerial vehicle is located in an environment having an environmental complexity level that does not exceed or fall below the environmental complexity threshold. Any number of environmental complexity thresholds may be provided, which may be used to determine the degree of authentication. For example, two environmental complexity thresholds may be provided, wherein an increased degree of authentication may be required when each threshold is reached and/or exceeded.
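As one hedged example of an environmental complexity factor, the sketch below computes the fraction of occupied cells in an occupancy grid around the UAV; this particular formula is an assumption, and the disclosure lists several other possible inputs (obstacle count, distances, motion, spatial layout).

```python
# Sketch of deriving an environmental complexity factor from an occupancy grid.
import numpy as np

def complexity_factor(occupancy_grid: np.ndarray, uav_cell: tuple, radius: int) -> float:
    """Fraction of occupied cells in a square window around the UAV (0.0 .. 1.0)."""
    r, c = uav_cell
    window = occupancy_grid[max(0, r - radius): r + radius + 1,
                            max(0, c - radius): c + radius + 1]
    return float(window.mean())

grid = np.zeros((100, 100), dtype=float)
grid[40:60, 40:60] = 1.0                              # a cluster of obstacles
print(complexity_factor(grid, (50, 50), radius=10))   # high: UAV inside the cluster
print(complexity_factor(grid, (5, 5), radius=10))     # low: open area
```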
The environmental conditions may include ambient climate conditions. Examples of climate conditions may include, but are not limited to, temperature, precipitation, wind speed or direction, or any other climate condition. A higher degree of certification may be required when the unmanned aerial vehicle is located in an environment having more extreme or potentially harmful weather conditions than when the unmanned aerial vehicle is located in an environment having less extreme or harmful weather conditions. A higher degree of certification may be required when the unmanned aerial vehicle is located in an environment that meets or exceeds a climate threshold, and a lower degree of certification may be required when the unmanned aerial vehicle is located in an environment that does not exceed or fall below the climate threshold. Any number of climate thresholds may be provided, which may be used to determine the degree of authentication. For example, multiple climate thresholds may be provided, wherein an increased degree of authentication may be required when each threshold is reached and/or exceeded.
The context information may include geographical information. For example, the contextual information may include a location of the UAV. The contextual information may include geographic flight limits for the location of the UAV. Some locations may be classified as sensitive locations. In some examples, the location may include an airport, school, campus, hospital, military area, security area, research institution, jurisdictional landmark, power plant, private residence, shopping mall, or any other type of location where people may gather. In some cases, the location may be categorized into one or more categories that may indicate a level of "sensitivity" for the location. A higher degree of certification may be required when the unmanned aerial vehicle is located at a location with higher sensitivity than when the unmanned aerial vehicle is located at a location with lower sensitivity. For example, a higher degree of authentication may be required when the unmanned aerial vehicle is located at a secure military facility than when the unmanned aerial vehicle is located at a shopping mall. A higher degree of certification may be required when the unmanned aerial vehicle is located at a location where the sensitivity meets or exceeds a location sensitivity threshold, and a lower degree of certification may be required when the unmanned aerial vehicle is located at a location where the sensitivity falls below the location sensitivity threshold. Any number of location sensitivity thresholds may be provided, which may be used to determine the degree of authentication. For example, multiple location sensitivity thresholds may be provided, wherein an increased degree of authentication may be required when each threshold is reached and/or exceeded.
The context information may include time-based information. The time-based information may include a time of day, a day of the week, a date, a month, a quarter, a season, a year, or any other time-based information. A higher degree of authentication may be required for certain time periods than for other time periods. For example, a higher degree of authentication may be required during a day of the week with higher historical traffic flow. A higher degree of certification may be required for a time of day with a higher historical traffic flow or incident rate. For seasons with more extreme environmental climates, a higher degree of certification may be required. A higher degree of authentication may be required when the time is within one or more specified time ranges. Any number of specified time ranges may be provided, which may be used to determine the degree of authentication. For example, ten time ranges may be provided, and a different degree or type of authentication may be required for each time range. In some cases, multiple types of time ranges may be weighted simultaneously in determining the degree of authentication. For example, a certain time of day and a certain day of the week may be considered and weighted to determine the degree of authentication.
The contextual information may include information about the user. The contextual information may include the identity of the user. The identity of the user may indicate the user type. The contextual information may include a user type. Examples of user types may include skill levels and/or experience of the user. Any other user information as described elsewhere herein may be used as contextual information. A higher degree of authentication may be required when the user has less skill or experience than when the user has more skill or experience. A lower degree of authentication may be required when the user has a skill or experience level that meets or exceeds a skill or experience threshold, and a higher degree of authentication may be required when the user has a skill or experience level that falls below the skill or experience threshold. Any number of skill or experience thresholds may be provided, which may be used to determine the degree of authentication. For example, three skill or experience thresholds may be provided, wherein a reduced degree of authentication may be required when each threshold is reached and/or exceeded.
The contextual information may include information about the unmanned aerial vehicle. The contextual information may include an identity of the UAV. The identity of the UAV may indicate the UAV type. The contextual information may include a type of unmanned aerial vehicle. Examples of types of unmanned aerial vehicles may include models of unmanned aerial vehicles. Any other unmanned aerial vehicle information as described elsewhere herein may be used as contextual information. A higher degree of authentication may be required when the model of the unmanned aerial vehicle is a more complex or more difficult to operate model than when the model of the unmanned aerial vehicle is a simpler or more easily operated model. A higher degree of authentication may be required when the complexity or difficulty of the model of the unmanned aerial vehicle meets or exceeds a complexity or difficulty threshold, and a lower degree of authentication may be required when the complexity or difficulty of the model of the unmanned aerial vehicle falls below the complexity or difficulty threshold. Any number of complexity or difficulty thresholds may be provided, which may be used to determine the degree of authentication. For example, four complexity or difficulty thresholds may be provided, wherein an increased degree of authentication may be required when each threshold is reached and/or exceeded.
The contextual information may include the complexity of the task to be performed by the UAV. The unmanned aerial vehicle may perform one or more tasks during a mission. A task may include flying along a flight path. A task may include collecting information about the environment of the unmanned aerial vehicle. A task may include transmitting data from the unmanned aerial vehicle. A task may include picking up, carrying, and/or placing a payload. A task may include managing power on the UAV. A task may include surveillance or photography. In some cases, task complexity may be higher when more computing or processing resources on the UAV are used in completing the task. In one example, the task of detecting a moving target and following the moving target with the unmanned aerial vehicle may be more complex than the task of playing pre-recorded music from the speakers of the unmanned aerial vehicle. A higher degree of authentication may be required when the unmanned aerial vehicle task is more complex than when the task is simpler. A higher degree of authentication may be required when the task complexity meets or exceeds a task complexity threshold, and a lower degree of authentication may be required when the task complexity falls below the task complexity threshold. Any number of task complexity thresholds may be provided, which may be used to determine the degree of authentication. For example, multiple task complexity thresholds may be provided, wherein an increased degree of authentication may be required when each threshold is reached and/or exceeded.
The contextual information may include information about surrounding communication systems. For example, the presence or absence of wireless signals in the environment may be an example of contextual information. In some cases, the likelihood of affecting one or more surrounding wireless signals may be provided as contextual information. The number of wireless signals in the environment may or may not affect the likelihood of affecting one or more surrounding wireless signals. If a larger number of signals is present, there may be a higher probability that at least one of them may be affected. The security level of the wireless signals in the environment may or may not affect the likelihood of affecting one or more surrounding wireless signals. For example, the more wireless signals that have some degree of protection, the less likely they are to be affected. A higher degree of authentication may be required when the likelihood of affecting one or more surrounding wireless signals is higher than when that likelihood is lower. A higher degree of authentication may be required when the likelihood of affecting the one or more surrounding wireless signals meets or exceeds a communication threshold, while a lower degree of authentication may be required when that likelihood falls below the communication threshold. Any number of communication thresholds may be provided, which may be used to determine the degree of authentication. For example, multiple communication thresholds may be provided, wherein an increased degree of authentication may be required when each threshold is reached and/or exceeded.
The risk of interference with the operation of the unmanned aerial vehicle may be an example of contextual information. The contextual information may include information regarding the risk of the unmanned aerial vehicle being hacked or hijacked. Another user may attempt to take over control of the unmanned aerial vehicle in an unauthorized manner. A higher degree of authentication may be required when there is a higher risk of hacking/hijacking than when the risk of hacking/hijacking is lower. A higher degree of authentication may be required when the risk of hacking/hijacking meets or exceeds a risk threshold, and a lower degree of authentication may be required when the risk of hacking/hijacking falls below the risk threshold. Any number of risk thresholds may be provided, which may be used to determine the degree of authentication. For example, multiple risk thresholds may be provided, wherein an increased degree of authentication may be required when each threshold is reached and/or exceeded.
The contextual information may include information about the risk of interference with communications of the unmanned aerial vehicle. For example, an unauthorized user may interfere with an authorized user's communication with the unmanned aerial vehicle. An unauthorized user may interfere with an authorized user's commands to the unmanned aerial vehicle, which may affect control of the unmanned aerial vehicle. An unauthorized user may interfere with data sent from the UAV to the authorized user's device. A higher degree of authentication may be required when there is a higher risk of interference with the unmanned aerial vehicle's communications than when that risk is lower. A higher degree of authentication may be required when the risk of interference with the unmanned aerial vehicle's communications meets or exceeds a risk threshold, and a lower degree of authentication may be required when that risk falls below the risk threshold. Any number of risk thresholds may be provided, which may be used to determine the degree of authentication. For example, multiple risk thresholds may be provided, wherein an increased degree of authentication may be required when each threshold is reached and/or exceeded.
The contextual information may include information regarding one or more sets of flight controls. The contextual information may include information about the degree of flight restriction in the area. This may be based on current flight restrictions or historical flight restrictions. The flight restrictions may be imposed by a control entity. A higher degree of authentication may be required when a higher degree of flight restriction exists in the area than when a lower degree of flight restriction exists in the area. A higher degree of authentication may be required when the degree of flight restriction in the area meets or exceeds a restriction threshold, while a lower degree of authentication may be required when the degree of flight restriction in the area falls below the restriction threshold. Any number of restriction thresholds may be provided, which may be used to determine the degree of authentication. For example, multiple restriction thresholds may be provided, wherein an increased degree of authentication may be required when each threshold is reached and/or exceeded.
Any type of contextual information may be used, alone or in combination, in determining the degree of authentication for the user and/or the UAV. The types of contextual information used may remain constant or may change over time. When multiple types of contextual information are evaluated, they may be evaluated substantially simultaneously to determine the degree of authentication. The various types of contextual information may be treated as equal factors. Alternatively, the types of contextual information may each be given a weight, and need not be equal factors. A type of contextual information having a greater weight may have a greater influence on the determined degree of authentication.
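As a purely illustrative example of how several weighted types of contextual information might be combined into a single required degree of authentication, a minimal Python sketch is given below. The factor names, weights, and three-level output are hypothetical and are not values prescribed by this disclosure.

```python
# Minimal sketch: combining weighted contextual factors into a required
# degree of authentication. Factor names, weights, and thresholds are
# hypothetical examples, not values prescribed by this disclosure.

def required_authentication_degree(context, weights):
    """Return 'low', 'medium', or 'high' from normalized context scores.

    context: dict mapping factor name -> score in [0.0, 1.0], where a
             higher score argues for more authentication (e.g. a more
             sensitive location, harsher climate, riskier task).
    weights: dict mapping factor name -> relative weight.
    """
    total_weight = sum(weights.get(name, 1.0) for name in context)
    if total_weight == 0:
        return "low"
    combined = sum(score * weights.get(name, 1.0)
                   for name, score in context.items()) / total_weight

    # Two illustrative thresholds split the combined score into three
    # degrees; any number of thresholds could be used instead.
    if combined >= 0.7:
        return "high"
    if combined >= 0.4:
        return "medium"
    return "low"


if __name__ == "__main__":
    context = {
        "location_sensitivity": 0.9,   # e.g. near an airport
        "climate_severity": 0.3,       # mild weather
        "user_inexperience": 0.6,      # novice operator
        "task_complexity": 0.5,
    }
    weights = {
        "location_sensitivity": 2.0,   # weighted more heavily
        "climate_severity": 1.0,
        "user_inexperience": 1.0,
        "task_complexity": 1.0,
    }
    print(required_authentication_degree(context, weights))  # prints "medium"
```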
The determination of the degree of authentication may be made on the unmanned aerial vehicle. The unmanned aerial vehicle may receive and/or generate the contextual information used. The one or more processors of the unmanned aerial vehicle may receive the contextual information from an external data source (e.g., the authentication system) or a data source on board the unmanned aerial vehicle (e.g., a sensor, a clock). In some implementations, the one or more processors may receive information from an air traffic control system off-board the unmanned aerial vehicle. Information from the air traffic control system may be evaluated to determine the degree of authentication. The information from the air traffic control system may be contextual information or may be in addition to the types of contextual information described elsewhere herein. The one or more processors may use the received contextual information to make the determination.
The determination of the degree of authentication may be made off-board the UAV. The determination may be made, for example, by the authentication system. In some cases, an air traffic control system or an authentication center external to the unmanned aerial vehicle may make the determination regarding the degree of authentication. The one or more processors of the authentication system may receive contextual information from an external data source (e.g., the unmanned aerial vehicle, external sensors, a remote control) or a data source on the authentication system (e.g., a clock, information about other unmanned aerial vehicles). In some embodiments, the one or more processors may receive information from the UAV, a remote control, a remote sensor, or another external device other than the authentication system. The one or more processors may use the received contextual information to make the determination.
In another case, the determination may be made on the user's remote control. The remote control may receive and/or generate the contextual information used. The one or more processors of the remote control may receive the contextual information from an external data source (e.g., the authentication system) or a data source on the remote control (e.g., memory, a clock). In some implementations, the one or more processors can receive information from an air traffic control system off-board the remote control. Information from the air traffic control system may be evaluated to determine the degree of authentication. The information from the air traffic control system may be contextual information or may be in addition to the types of contextual information described elsewhere herein. The one or more processors may use the received contextual information to make the determination.
In making the determination of the degree of authentication based on the context information, any other external device may be used. A single external device may be used or a plurality of external devices may be used in common. Other external devices may receive contextual information from an off-board source or an on-board source. Other external devices may include one or more processors that may use the received context information to make the determination.
FIG. 10 illustrates a graph of flight control levels that may be affected by the degree of authentication, according to an embodiment of the present invention. A set of flight controls may be generated that may affect the operation of the unmanned aerial vehicle. The set of flight controls may be generated based on the degree of authentication that has been completed, and whether authentication was successfully passed may be considered. The degree of authentication may be applicable to any portion of the system, such as unmanned aerial vehicle authentication, user authentication, remote control authentication, geo-fence device authentication, and/or any other type of authentication.
In some embodiments, as the authentication level 1010 increases, the flight control level 1020 may decrease. If a higher degree of authentication has been completed, flight restrictions may be less of a concern and less necessary. The degree of authentication and the level of flight control may be inversely related. The relationship may be linear (e.g., inversely linear) or exponential (e.g., inversely exponential). Any other inverse relationship may be provided between the degree of authentication and the level of flight control. In alternative embodiments, the relationship may be direct rather than inverse, for example directly linear, directly exponential, or any other direct relationship. The level of flight control may depend on the degree of authentication performed. In alternative embodiments, the level of flight control may be independent of the degree of authentication, and may or may not be selected with respect to the degree of authentication. When the degree of authentication is low, a more restrictive set of flight controls may be generated. When the degree of authentication is high, a less restrictive set of flight controls may be generated. A set of flight controls may or may not be generated based on the degree of authentication.
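The inverse relationships described above can be illustrated with a minimal sketch that maps a degree of authentication onto a flight control level. The 0–10 scale and the decay constant are hypothetical values chosen only for illustration.

```python
# Minimal sketch of an inverse relationship between the degree of
# authentication and the level of flight control. Both the linear and the
# exponential forms below are illustrative; the 0..10 scale is hypothetical.

import math

MAX_LEVEL = 10.0

def flight_control_level_linear(auth_degree):
    """Inverse linear mapping: more authentication -> fewer restrictions."""
    auth_degree = min(max(auth_degree, 0.0), MAX_LEVEL)
    return MAX_LEVEL - auth_degree

def flight_control_level_exponential(auth_degree, k=0.5):
    """Inverse exponential mapping: restrictions decay as authentication grows."""
    auth_degree = max(auth_degree, 0.0)
    return MAX_LEVEL * math.exp(-k * auth_degree)

if __name__ == "__main__":
    for degree in (0, 2, 5, 10):
        print(degree,
              flight_control_level_linear(degree),
              round(flight_control_level_exponential(degree), 2))
```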
One aspect of the invention relates to a method of determining a level of flight control for operation of an unmanned aerial vehicle, the method comprising: evaluating, using one or more processors, a degree of authentication of the UAV or a user of the UAV; enabling authentication of the UAV or the user according to the authentication degree; generating a set of flight controls based on the degree of authentication; and enabling operation of the UAV in accordance with the set of flight controls. Similarly, embodiments of the invention may relate to a non-transitory computer-readable medium containing program instructions for determining a flight control level of an unmanned aerial vehicle, the computer-readable medium comprising: program instructions for evaluating a degree of authentication for the unmanned aerial vehicle or a user of the unmanned aerial vehicle; program instructions for enabling authentication of the UAV or the user according to the degree of authentication; program instructions for generating a set of flight controls based on the degree of authentication; and program instructions for providing signals that allow the unmanned aerial vehicle to operate in accordance with the set of flight controls.
An unmanned aerial vehicle authentication system may be provided, comprising: a communication module; and one or more processors operatively coupled to the communication module and individually or collectively configured to: evaluating a degree of authentication of the UAV or a user of the UAV; enabling authentication of the UAV or the user according to the authentication degree; and generating a set of flight controls based on the degree of certification. An unmanned aerial vehicle authentication module can include: one or more processors individually or collectively configured to: evaluating a degree of authentication of the UAV or a user of the UAV; enabling authentication of the UAV or the user according to the authentication degree; and generating a set of flight controls based on the degree of certification.
As described elsewhere herein, the degree of authentication may include a lack of any authentication of the user and/or the UAV. For example, the degree of authentication may be zero. Thus, the degree of authentication may include not authenticating the UAV or the user. The degree of authentication may include authenticating both the unmanned aerial vehicle and the user. The degree of authentication may include authenticating the unmanned aerial vehicle without authenticating the user, or may include authenticating the user without authenticating the unmanned aerial vehicle.
The degree of authentication may be selected from a plurality of options for the degree of authentication of the UAV and/or the user. For example, three options for the degree of authentication of the UAV and/or the user may be provided (e.g., high, medium, or low degrees of authentication). Any number of degree of authentication options may be provided (e.g., 2 or more, 3 or more, 4 or more, 5 or more, 6 or more, 7 or more, 8 or more, 9 or more, 10 or more, 12 or more, 15 or more, 20 or more, or 25 or more options). In some cases, the degree of authentication may be generated and/or determined without selecting from one or more predetermined options. The degree of authentication may be generated in flight. The unmanned aerial vehicle and/or the user may be authenticated according to the degree of authentication. A UAV and/or a user may be considered authenticated if/when it passes an authentication process. An unmanned aerial vehicle and/or user may be considered unauthenticated if it undergoes but fails the authentication process. An identifier/key mismatch is one example of failed authentication. Provided biometric data that does not match the biometric data on file is another example. Providing an incorrect login username/password combination is a further example of a failed authentication process.
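The pass/fail examples above (an identifier/key mismatch, an incorrect username/password combination) can be illustrated with a minimal sketch of such checks. The registered keys, credential store, and hashing scheme are hypothetical and are not the authentication mechanism of this disclosure.

```python
# Minimal sketch of pass/fail authentication checks of the kinds mentioned
# above (identifier/key match, login credentials). The stored records and
# hashing scheme are hypothetical; a real system would use a vetted
# credential store.

import hashlib
import hmac

REGISTERED_KEYS = {"UAV-001": "secret-key-abc"}          # uav_id -> key
USER_CREDENTIALS = {"alice": hashlib.sha256(b"correct-horse").hexdigest()}

def authenticate_uav(uav_id, presented_key):
    """Pass only when the presented key matches the registered key."""
    expected = REGISTERED_KEYS.get(uav_id)
    return expected is not None and hmac.compare_digest(expected, presented_key)

def authenticate_user(username, password):
    """Pass only when the hashed password matches the stored hash."""
    expected_hash = USER_CREDENTIALS.get(username)
    if expected_hash is None:
        return False
    return hmac.compare_digest(expected_hash,
                               hashlib.sha256(password.encode()).hexdigest())

if __name__ == "__main__":
    print(authenticate_uav("UAV-001", "secret-key-abc"))   # True  (match)
    print(authenticate_uav("UAV-001", "wrong-key"))        # False (key mismatch)
    print(authenticate_user("alice", "correct-horse"))     # True
    print(authenticate_user("alice", "bad-password"))      # False
```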
The information about the degree of authentication may include the level or kind of authentication that has occurred. The level or kind may be qualitative and/or quantitative. The information about the degree of authentication may include one or more types of authentication that have occurred. The information about the degree of authentication may include data collected during authentication (e.g., if authentication includes processing biometric data, the biometric data itself may be provided).
A set of flight controls may be generated based on the degree of authentication. Any description herein of the degree of authentication may also apply to the type of authentication. A set of flight controls may be generated according to any of the techniques described elsewhere herein. For example, the set of flight controls may be generated by selecting a set of flight controls from a plurality of sets of flight controls. In another example, a set of flight controls may be generated from scratch. A set of flight controls may be generated via input from a user.
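As one hedged illustration of selecting a set of flight controls from a plurality of predefined sets according to the degree of authentication, the following sketch may be considered. The three sets and their parameters are hypothetical examples only.

```python
# Minimal sketch: generating a set of flight controls by selecting from a
# plurality of predefined sets according to the degree of authentication.
# The sets and their parameter values are hypothetical examples only.

FLIGHT_CONTROL_SETS = {
    "high":   {"max_altitude_m": 500, "max_range_m": 5000, "no_fly_buffer_m": 100},
    "medium": {"max_altitude_m": 120, "max_range_m": 1000, "no_fly_buffer_m": 500},
    "low":    {"max_altitude_m": 30,  "max_range_m": 200,  "no_fly_buffer_m": 2000},
}

def generate_flight_controls(auth_degree):
    """Return the predefined set matching the degree, defaulting to the
    most restrictive set when the degree is unknown or unauthenticated."""
    return FLIGHT_CONTROL_SETS.get(auth_degree, FLIGHT_CONTROL_SETS["low"])

if __name__ == "__main__":
    print(generate_flight_controls("high"))
    print(generate_flight_controls(None))   # falls back to the restrictive set
```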
A set of flight controls may be generated with the aid of one or more processors. The generation of the set of flight controls may occur on the unmanned aerial vehicle. The degree of authentication used may be received and/or generated by the UAV. The one or more processors of the unmanned aerial vehicle may receive information regarding the degree of authentication from an external data source or a data source on board the unmanned aerial vehicle. In some implementations, the one or more processors may receive information from an air traffic control system off-board the unmanned aerial vehicle. Information from the air traffic control system may be evaluated to generate the set of flight controls. The one or more processors may use the received information regarding the degree of authentication to make the determination.
The generation of the set of flight controls may occur off-board the UAV. For example, the generation of the set of flight controls may be implemented by the authentication system. In some cases, an air traffic control system or authentication center external to the UAV may generate the set of flight controls. The one or more processors of the authentication system may receive information regarding the degree of authentication from an external data source or a data source on the authentication system. In some embodiments, the one or more processors may receive information from the UAV, a remote control, a remote sensor, or another external device other than the authentication system. The one or more processors may use the received information regarding the degree of authentication to make the determination.
In another case, the generation of the set of flight controls may occur on the user's remote control. The remote control may receive and/or generate the degree of authentication used. The one or more processors of the remote control may receive information regarding the degree of authentication from an external data source or a data source on the remote control. In some implementations, the one or more processors can receive information from an air traffic control system off-board the remote control. Information from the air traffic control system may be evaluated to generate the set of flight controls. The one or more processors may use the information regarding the degree of authentication to make the determination.
Any other external device may be used in generating the set of flight controls based on the degree of authentication. A single external device may be used or a plurality of external devices may be used in common. Other external devices may receive information regarding the degree of authentication from an external or onboard source. Other external devices may include one or more processors that may use the received information regarding the degree of authentication to make the determination.
The unmanned aerial vehicle may be operated according to the set of flight controls. A user of the unmanned aerial vehicle may issue one or more commands to effect operation of the unmanned aerial vehicle. The commands may be issued by means of a remote control. The set of flight controls may be adhered to in order to enable operation of the UAV. If one or more commands do not comply with the set of flight controls, the commands may be overridden so that the UAV remains in compliance with the set of flight controls. When a command complies with the set of flight controls, the command need not be overridden and control of the UAV may be enabled without interference.
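A minimal sketch of overriding a command so that the UAV remains in compliance with a set of flight controls is given below. The command fields and limits are hypothetical; a real system could enforce many other kinds of restrictions.

```python
# Minimal sketch of overriding a user command so the aircraft stays within
# a set of flight controls. The command and limit fields are hypothetical.

def apply_flight_controls(command, controls):
    """Clamp a requested command to the active flight controls.

    command:  dict with requested 'altitude_m' and 'range_m'.
    controls: dict with 'max_altitude_m' and 'max_range_m'.
    Returns the (possibly overridden) command and whether it was modified.
    """
    adjusted = dict(command)
    adjusted["altitude_m"] = min(command["altitude_m"], controls["max_altitude_m"])
    adjusted["range_m"] = min(command["range_m"], controls["max_range_m"])
    overridden = adjusted != command
    return adjusted, overridden

if __name__ == "__main__":
    controls = {"max_altitude_m": 120, "max_range_m": 1000}
    ok_cmd, changed = apply_flight_controls(
        {"altitude_m": 80, "range_m": 400}, controls)
    print(ok_cmd, changed)    # unchanged -> compliant command passes through
    bad_cmd, changed = apply_flight_controls(
        {"altitude_m": 300, "range_m": 400}, controls)
    print(bad_cmd, changed)   # altitude clamped to 120 -> command overridden
```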
Device identification storage
Fig. 11 illustrates an example of device information that may be stored in a memory according to an embodiment of the present invention. A memory storage system 1110 may be provided. Information from one or more users 1115a, 1115b, one or more user terminals 1120a, 1120b, and/or one or more unmanned aerial vehicles 1130a, 1130b may be provided. The information may include one or more commands, an associated user identifier, an associated UAV identifier, associated timing information, and any other associated information. One or more sets of information 1140 may be stored.
The memory storage system 1110 may include one or more memory storage units. The memory storage system may include one or more databases that may store the information described herein. The memory storage system may include a computer-readable medium. One or more electronic storage units may be provided, such as a memory (e.g., read-only memory, random access memory, flash memory) or a hard disk. A "storage"-type medium may include any or all of the tangible memory of a computer, processor, or the like, or its associated modules, such as various semiconductor memories, tape drives, disk drives, and the like, that may provide non-transitory storage for software programming at any time. Non-volatile storage media include, for example, optical or magnetic disks, such as the storage devices in any computer or the like that may be used to implement the databases and the like described herein. Volatile storage media include dynamic memory, such as the main memory of such a computer platform. Tangible transmission media include coaxial cables, copper wire, and fiber optics, including the wires that comprise a bus within a computer system. Carrier-wave transmission media may take the form of electrical or electromagnetic signals, or acoustic or light waves, such as those generated during radio frequency (RF) and infrared (IR) data communications. Common forms of computer-readable media therefore include, for example: a floppy disk, a flexible disk, a hard disk, magnetic tape, any other magnetic medium, a CD-ROM, DVD, or DVD-ROM, any other optical medium, punch cards, paper tape, any other physical storage medium with patterns of holes, a RAM, a ROM, a PROM, an EPROM, a FLASH-EPROM, any other memory chip or cartridge, a carrier wave transporting data or instructions, cables or links transporting such a carrier wave, or any other medium from which a computer can read programming code and/or data. Many of these forms of computer-readable media may be involved in carrying one or more sequences of one or more instructions to a processor for execution.
The memory storage system may be provided at a single location, or may be distributed over multiple locations. In some embodiments, a memory storage system may include a single memory storage unit or multiple memory storage units. A cloud computing infrastructure may be provided. In some cases, a peer-to-peer (P2P) memory storage system may be provided.
The memory storage system may be provided off-board the unmanned aerial vehicle. The memory storage system may be provided on a device external to the UAV. The memory storage system may be provided off-board the remote control. The memory storage system may be provided on a device external to the remote control. The memory storage system may be provided separately from the UAV and the remote control. The memory storage system may be part of an authentication system. The memory storage system may be part of an air traffic control system. The memory storage system may include one or more memory units, which may be one or more memory units of an authentication system, such as an air traffic control system. Alternatively, the memory storage system may be separate from the authentication system. The memory storage system may be owned and/or operated by the same entity as the authentication system. Alternatively, the memory storage system may be owned and/or operated by a different entity than the authentication system.
The communication system may include one or more recorders. The one or more recorders may receive data from any device of the communication system. For example, one or more recorders may receive data from one or more unmanned aerial vehicles. One or more recorders may receive data from one or more users and/or remote controls. One or more memory storage units may be provided by one or more recorders. For example, one or more memory storage units may be provided by one or more recorders that receive one or more messages from the UAV, the user, and/or a remote control. One or more recorders may or may not have a limited range to receive information. For example, the recorder may be configured to receive data from a device located within the same physical area as the recorder. For example, a first recorder may receive information from an unmanned aerial vehicle when the unmanned aerial vehicle is located in a first zone, and a second recorder may receive information from the unmanned aerial vehicle when the unmanned aerial vehicle is located in a second zone. Alternatively, the recorder does not have a limited range and can receive information from a device (e.g., unmanned aerial vehicle, remote control) regardless of where the device is located. The recorder may be a memory storage unit and/or may communicate aggregated information to the memory storage unit.
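As an illustration of routing data to a recorder based on the zone in which the unmanned aerial vehicle is located, the following sketch may be considered. The circular zones and recorder names are hypothetical.

```python
# Minimal sketch: routing a report from an unmanned aerial vehicle to the
# recorder whose zone contains the aircraft. The circular zones and the
# recorder names are hypothetical.

import math

RECORDERS = {
    # name: (center_x, center_y, radius) of the zone the recorder covers
    "recorder_A": (0.0, 0.0, 1000.0),
    "recorder_B": (5000.0, 0.0, 1000.0),
}

def recorder_for_position(x, y):
    """Return the first recorder whose zone contains (x, y), else None
    (e.g. when a wide-area recorder without a range limit would be used)."""
    for name, (cx, cy, radius) in RECORDERS.items():
        if math.hypot(x - cx, y - cy) <= radius:
            return name
    return None

if __name__ == "__main__":
    print(recorder_for_position(200.0, 300.0))    # recorder_A (first zone)
    print(recorder_for_position(4800.0, 100.0))   # recorder_B (second zone)
    print(recorder_for_position(9999.0, 9999.0))  # None (outside both zones)
```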
Information from one or more users 1115a, 1115b may be stored in a memory storage system. The information may include user identification information. Examples of user identification information may include user identifiers (e.g., user ID1, user ID2, user ID3, …). The user identifier may be unique to the user. In some cases, the information from the user may include information that facilitates identification and/or authentication of the user. The information from the one or more users may include information about the users. The information from the one or more users may include one or more commands from the users (e.g., command 1, command 2, command 3, command 4, command 5, command 6, …). The one or more commands may include commands to effect operation of the UAV. The one or more commands may be used to control flight of the unmanned aerial vehicle, takeoff of the unmanned aerial vehicle, landing of the unmanned aerial vehicle, operation of a payload of the unmanned aerial vehicle, operation of a carrier of the unmanned aerial vehicle, operation of one or more sensors on the unmanned aerial vehicle, one or more communication units of the unmanned aerial vehicle, one or more power units of the unmanned aerial vehicle, one or more navigation units of the unmanned aerial vehicle, and/or any feature of the unmanned aerial vehicle. Any other type of information may be provided from one or more users and may be stored in a memory storage system.
In some embodiments, all user inputs may be stored in a memory storage system. Alternatively, only selected user inputs may be stored in the memory storage system. In some cases, only certain types of user input are stored in the memory storage system. For example, in some embodiments, only user identification input and/or command information is stored in the memory storage system.
A user may optionally provide information to the memory storage system by way of one or more user terminals 1120a, 1120b. The user terminal may be a device capable of interacting with a user. The user terminal may be a device capable of interacting with the unmanned aerial vehicle. The user terminal may be a remote control configured to transmit one or more operation commands to the unmanned aerial vehicle. The user terminal may be a display device configured to show data based on information received from the unmanned aerial vehicle. The user terminal may be capable of simultaneously transmitting information to and receiving information from the UAV.
A user may provide information to the memory storage system by means of any other type of device. For example, one or more computers or other devices may be provided, which may be capable of receiving user input. The device may be capable of communicating user input to the memory storage device. The device does not need to interact with an unmanned aerial vehicle.
The user terminals 1120a, 1120b may provide information to a memory storage system. The user terminal may provide information about the user, user commands, or any other type of information. The user terminal may provide information about the user terminal itself. For example, a user terminal identification may be provided. In some cases, a user identifier and/or a user terminal identifier may be provided. Optionally a user key and/or a user terminal key may be provided. In some examples, the user does not provide any input regarding the user key, but the user key information may be stored on the user terminal or may be accessible by the user terminal. In some cases, the user key information may be stored on physical memory of the user terminal. Alternatively, the user key information may be stored off-board (e.g., on the cloud) and may be accessible by the user terminal. In some embodiments, the user terminal may transmit a user identifier and/or an associated command.
The unmanned aerial vehicles 1130a, 1130b may provide information to a memory storage system. An unmanned aerial vehicle may provide information about the unmanned aerial vehicle. For example, unmanned aerial vehicle identification information may be provided. Examples of the unmanned aerial vehicle identification information may include an unmanned aerial vehicle identifier (e.g., unmanned aerial vehicle ID1, unmanned aerial vehicle ID2, unmanned aerial vehicle ID3, …). The UAV identifier may be unique to the UAV. In some cases, the information from the unmanned aerial vehicle may include information that facilitates identification and/or authentication of the unmanned aerial vehicle. The information from one or more unmanned aerial vehicles may include information about the unmanned aerial vehicle. The information from one or more unmanned aerial vehicles may include one or more commands (e.g., command 1, command 2, command 3, command 4, command 5, command 6, …) received by the unmanned aerial vehicle. The one or more commands may include commands to effect operation of the UAV. The one or more commands may be used to control flight of the unmanned aerial vehicle, takeoff of the unmanned aerial vehicle, landing of the unmanned aerial vehicle, operation of a payload of the unmanned aerial vehicle, operation of a carrier of the unmanned aerial vehicle, operation of one or more sensors on the unmanned aerial vehicle, one or more communication units of the unmanned aerial vehicle, one or more power units of the unmanned aerial vehicle, one or more navigation units of the unmanned aerial vehicle, and/or any feature of the unmanned aerial vehicle. Any other type of information may be provided from one or more unmanned aerial vehicles and may be stored in a memory storage system.
In some embodiments, the user may be authenticated prior to storing the user-related information in the memory storage system. For example, the user may be authenticated prior to obtaining the user identifier and/or storing the user identifier by the memory storage system. Thus, in some implementations, only authenticated user identifiers are stored in the memory storage system. Alternatively, the user need not be authenticated and the purported user identifier may be stored in the memory storage system prior to authentication. If the authentication is passed, an indication may be made that the user identifier has been verified. If not, an indication may be made that the user identifier has been marked as suspicious activity or that a failed authentication attempt has been made using the user identifier.
Similarly, the unmanned aerial vehicle may be authenticated prior to storing the information relating to the unmanned aerial vehicle in the memory storage system. For example, the UAV may be authenticated prior to obtaining the UAV identifier and/or storing the UAV identifier by the memory storage system. Thus, in some implementations, only authenticated unmanned aerial vehicle identifiers are stored in the memory storage system. Alternatively, the UAV need not be authenticated, and the purported UAV identifier may be stored in the memory storage system prior to authentication. If the authentication is passed, an indication may be made that the UAV identifier has been verified. If not, an indication may be made that the UAV identifier has been flagged for suspicious activity or that a failed authentication attempt has been made using the UAV identifier.
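The storage policy described above, in which an identifier is recorded as verified only after authentication passes and is otherwise flagged, can be illustrated with the following minimal sketch. The record layout and the authenticate() callable are hypothetical.

```python
# Minimal sketch of the storage policy described above: an identifier is
# recorded as verified only after authentication passes; otherwise the
# purported identifier is stored with a flag for the failed attempt.
# The record layout and the authenticate() stub are hypothetical.

import time

storage = []   # stand-in for the memory storage system

def record_identifier(identifier, kind, authenticate):
    """Store an identifier together with the outcome of authentication.

    identifier:   purported user or UAV identifier.
    kind:         "user" or "uav".
    authenticate: callable returning True when authentication passes.
    """
    verified = bool(authenticate(identifier))
    storage.append({
        "kind": kind,
        "identifier": identifier,
        "verified": verified,
        "flagged_failed_attempt": not verified,
        "timestamp": time.time(),
    })
    return verified

if __name__ == "__main__":
    known = {"user-42", "uav-7"}
    record_identifier("user-42", "user", lambda i: i in known)   # verified
    record_identifier("uav-99", "uav", lambda i: i in known)     # flagged
    for entry in storage:
        print(entry)
```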
In some embodiments, the one or more flight commands are only allowed when the user is authorized to operate the unmanned aerial vehicle. The user and/or the UAV may or may not be authenticated prior to determining whether the user is authorized to operate the UAV. The user and/or the UAV may be authenticated prior to allowing the user to operate the UAV. In some cases, the commands in the memory storage system may be stored only when the user is authorized to operate the UAV. The commands in the memory storage system may be stored only when the user and/or the UAV are authenticated.
The memory storage unit may store one or more sets of information 1140. The sets of information may include information from a user, a user terminal, and/or an unmanned aerial vehicle. The sets of information may include one or more commands, a user identifier, an unmanned aerial vehicle identifier, and/or an associated time. The user identifier may be associated with the user issuing the command. The unmanned aerial vehicle identifier may be associated with the unmanned aerial vehicle that receives and/or executes the command. The time may be the time at which the command was issued and/or received. The time may be the time the command is stored in memory. The time may be the time at which the unmanned aerial vehicle executes the command. In some cases, a single command may be provided for a single set of information. Alternatively, multiple commands may be provided for a single set of information. The multiple commands may include a command issued by a user and the corresponding command received by the UAV. Alternatively, a single command may be provided, which may be recorded as it is issued from the user, or may be recorded as it is received by and/or executed by the UAV.
Thus, sets of information may be provided with associated commands. For example, the first set of information may be stored when a user issues a command. The time for the first set of information may reflect when a command was issued by a user or when the set of information was stored in the memory storage system. Alternatively, data from the remote control may be used to provide the first set of information. The second set of information may be stored when the UAV receives the command. The time for the second set of information may reflect when the unmanned aerial vehicle receives the command or when the set of information is stored in the memory storage system. Alternatively, data from the UAV may be used to provide a second set of information based on the associated command. A third set of information may be stored when the unmanned aerial vehicle executes the command. The time for the third set of information may reflect when the unmanned aerial vehicle executed the command or when the set of information was stored in the memory storage system. Alternatively, data from the UAV may be used to provide a third set of information based on the associated command.
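One possible representation of a stored set of information, covering the issued, received, and executed stages described above, is sketched below. The field names and stage labels are hypothetical and are only one of many possible layouts.

```python
# Minimal sketch of one stored "set of information": a command together with
# the associated user identifier, UAV identifier, and time. The field names
# and stage labels are hypothetical; fields such as location or environment
# could be added.

from dataclasses import dataclass, field
import time

@dataclass
class InformationSet:
    command: str                 # e.g. "ascend_to:50m"
    user_id: str                 # identifier of the user issuing the command
    uav_id: str                  # identifier of the UAV receiving/executing it
    timestamp: float = field(default_factory=time.time)
    stage: str = "issued"        # "issued", "received", or "executed"

if __name__ == "__main__":
    log = [
        InformationSet("ascend_to:50m", "user-42", "uav-7", stage="issued"),
        InformationSet("ascend_to:50m", "user-42", "uav-7", stage="received"),
        InformationSet("ascend_to:50m", "user-42", "uav-7", stage="executed"),
    ]
    # Information sets can be searched by any parameter, e.g. by user:
    print([entry for entry in log if entry.user_id == "user-42"])
```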
The memory storage system may store sets of information regarding a particular interaction between a first user and a first UAV. For example, a plurality of commands may be issued during an interaction between a first user and a first UAV. An interaction may be the execution of a task. In some cases, the memory storage unit may only store information about a particular interaction. Alternatively, the memory storage system may store information regarding a plurality of interactions (e.g., a plurality of tasks) between the first user and the first UAV. The memory storage system may optionally store information based on the user identifier. Data associated with the first user may be stored together. Alternatively, the memory storage unit may store information based on the UAV identifier. Data associated with the first UAV may be stored together. The memory storage unit may store information based on user-UAV interaction. For example, data commonly associated with the first unmanned aerial vehicle and the first user may be stored together. In some cases, only information about the user, the UAV, or the user-UAV combination may be stored in the memory storage unit.
Alternatively, the memory storage system may store multiple sets of information regarding interactions between multiple users and/or unmanned aerial vehicles. The memory storage system may be a data repository that collects information from multiple users and/or unmanned aerial vehicles. The memory storage system may store information from a plurality of tasks, which may include individual users, individual unmanned aerial vehicles, and/or individual user-unmanned aerial vehicle combinations. In some cases, the set of information in the memory storage system may be searchable or indexable. The set of information may be looked up or indexed according to any parameter, such as user identity, unmanned aerial vehicle identity, time, user-unmanned aerial vehicle combination, command type, location, or any other information. The information sets may be stored according to any parameter.
In some cases, the information in a memory storage system may be analyzed. The information sets may be analyzed to detect one or more behavioral patterns. The information sets may be analyzed to detect one or more characteristics that may be related to an accident or adverse condition. For example, if a particular user frequently crashes a particular model of unmanned aerial vehicle, that data may be extracted. In another example, such information may be extracted if another user tends to attempt to maneuver the unmanned aerial vehicle into an area where flight is not permitted according to a set of flight controls. A statistical analysis may be performed on the sets of information in the memory storage unit. Such statistical analysis may help identify trends or related factors. For example, it may be noted that certain UAV models generally have a higher accident rate than other UAV models. The information sets may be analyzed to determine, for instance, that there is typically a higher rate of unmanned aerial vehicle failure when the ambient temperature falls below 5 degrees Celsius. Thus, the information in the memory storage system may be analyzed comprehensively to aggregate information about the operation of unmanned aerial vehicles. Such aggregate analysis need not be in response to a particular event or scenario.
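The kind of aggregate analysis described above can be illustrated with a minimal sketch that computes an incident rate per UAV model and a failure rate below the 5-degree-Celsius threshold mentioned in the text. The records and field names are hypothetical toy data.

```python
# Minimal sketch of aggregate analysis over stored information sets:
# incident rate per UAV model, and failure rate below a temperature
# threshold. The records and field names are hypothetical toy data.

from collections import defaultdict

records = [
    {"uav_model": "model_X", "temperature_c": 2.0,  "incident": True},
    {"uav_model": "model_X", "temperature_c": 20.0, "incident": False},
    {"uav_model": "model_Y", "temperature_c": 4.0,  "incident": True},
    {"uav_model": "model_Y", "temperature_c": 25.0, "incident": False},
    {"uav_model": "model_Y", "temperature_c": 30.0, "incident": False},
]

def incident_rate_by_model(rows):
    """Fraction of recorded flights per model that ended in an incident."""
    totals, incidents = defaultdict(int), defaultdict(int)
    for row in rows:
        totals[row["uav_model"]] += 1
        incidents[row["uav_model"]] += int(row["incident"])
    return {model: incidents[model] / totals[model] for model in totals}

def failure_rate_below(rows, threshold_c=5.0):
    """Incident rate for flights recorded below the temperature threshold."""
    cold = [row for row in rows if row["temperature_c"] < threshold_c]
    if not cold:
        return 0.0
    return sum(row["incident"] for row in cold) / len(cold)

if __name__ == "__main__":
    print(incident_rate_by_model(records))   # {'model_X': 0.5, 'model_Y': 0.33...}
    print(failure_rate_below(records))       # 1.0 for the cold flights in this toy data
```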
Information from the memory storage system may also be analyzed in response to particular events or scenarios. For example, if an unmanned aerial vehicle crash occurs, the information associated with the unmanned aerial vehicle may be analyzed to provide further forensic information regarding the crash. If an unmanned aerial vehicle crash occurs during a mission, the sets of information collected during the mission can be pulled together and analyzed. For example, a mismatch between an issued command and the received command may be identified. Environmental conditions at the time of the crash can be analyzed. The region may be analyzed for the presence of other unmanned aerial vehicles or obstacles. In some embodiments, sets of information for the unmanned aerial vehicle from other missions may also be pulled. For example, the occurrence of several prior collisions or faults during other missions may be detected. Such information may be useful in determining the cause of the crash and/or any action that needs to be taken after the crash.
Individualized unmanned aerial vehicle activity may be tracked using the information in the information set. For example, individualized UAV activity may be tracked using one or more commands, the associated one or more user identifiers, and the associated one or more UAV identifiers.
The information sets may store commands, user information, unmanned aerial vehicle information, timing information, location information, environmental condition information, flight control information, or any detected condition. Any information may correspond to a command. For example, the geographic information may include the location of the UAV and/or the remote control when the command is issued. The geographic information may also indicate whether the unmanned aerial vehicle falls into a zone for purposes of considering flight controls. The environmental condition may include one or more environmental conditions of the area. For example, the environmental complexity of the area around the unmanned aerial vehicle may be taken into account when issuing or receiving commands. The climate that the unmanned aerial vehicle is experiencing when issuing or receiving commands may be considered. The command may occur at one point in time.
The memory storage system may be updated in real time. For example, when commands are issued, received, and/or executed, they may be recorded in the memory storage system along with any other information from the information sets. This may occur in real time. The commands and any related information in the information set may be stored within 10 minutes, 5 minutes, 3 minutes, 2 minutes, 1 minute, 30 seconds, 15 seconds, 10 seconds, 5 seconds, 3 seconds, 1 second, 0.5 seconds, or 0.1 seconds of the commands being issued, received, and/or executed. The information sets may be stored or recorded in any manner in the memory storage system. As information sets come in, they may be recorded without regard to other parameters, such as user identity, unmanned aerial vehicle identity, or a user-unmanned aerial vehicle combination. Alternatively, they may be recorded with such parameters taken into account. For example, all sets of information for the same user may be stored together. Even if the information sets are not all stored together, they may be searchable and/or indexable to find associated information. For example, if information sets for a particular user enter at different points in time and are recorded alongside information sets from other users, the information sets may be searchable to find all information sets associated with that user.
In alternative embodiments, the memory storage system need not be updated in real time. The memory storage system may be updated periodically, at regular or irregular intervals. For example, the memory storage system may be updated weekly, daily, hourly, every half hour, every 15 minutes, every 10 minutes, every 5 minutes, every 3 minutes, every 30 seconds, every 15 seconds, every 10 seconds, every 5 seconds, or every second. In some cases, an update schedule may be provided, which may include regular or irregular update times. The update schedule may be fixed or may be modifiable. In some cases, the update schedule may be altered by an operator or manager of the memory storage system. The update schedule may be altered by an operator or manager of the authentication system. The user of the UAV may or may not be able to alter the update schedule. A user of an unmanned aerial vehicle may be able to alter the update schedule of an unmanned aerial vehicle associated with the user. The user may be able to alter the update schedule of an unmanned aerial vehicle that the user is authorized to operate.
The memory storage system may be updated in response to a detected event or condition. For example, when an operator of the memory storage system requests information, the memory storage system may request or pull a set of information from one or more external sources (e.g., a remote control, an unmanned aerial vehicle, a user). In another example, the memory storage system may request or pull a set of information when a detected condition occurs, such as a detected crash. In some examples, one or more external sources (e.g., a remote control, an unmanned aerial vehicle, a user) may push a set of information to the memory storage system. For example, if an unmanned aerial vehicle detects that it is approaching a restricted flight zone, the unmanned aerial vehicle may push a set of information to the memory storage system. In another example, if the remote control or the unmanned aerial vehicle detects that there may be some interfering wireless signals, it may push a set of information to the memory storage unit.
One aspect of the invention may relate to a method of recording Unmanned Aerial Vehicle (UAV) behavior, the method comprising: receiving an unmanned aerial vehicle identifier that uniquely identifies the unmanned aerial vehicle from among other unmanned aerial vehicles; receiving a user identifier that uniquely identifies the user from among other users, wherein the user provides one or more commands via a remote control to effect operation of the UAV; and recording the one or more commands, the user identifier associated with the one or more commands, and the UAV identifier associated with the one or more commands in one or more memory storage units. Similarly, according to an embodiment of the invention, there may be provided a non-transitory computer readable medium containing program instructions for recording Unmanned Aerial Vehicle (UAV) behavior, the computer readable medium comprising: program instructions for associating a user identifier with one or more commands from a user, wherein the user identifier uniquely identifies the user from among other users, and wherein the user provides one or more commands via a remote control to effect operation of the UAV; program instructions for associating an unmanned aerial vehicle identifier with the one or more commands, wherein the unmanned aerial vehicle identifier uniquely identifies the unmanned aerial vehicle from among other unmanned aerial vehicles; and program instructions for recording the one or more commands, the user identifier associated with the one or more commands, and the UAV identifier associated with the one or more commands in one or more memory storage units.
According to an embodiment of the present invention, an Unmanned Aerial Vehicle (UAV) behavior recording system may be provided. The system may include: one or more memory storage units; and one or more processors operatively coupled to the one or more memory storage units and individually or collectively configured to: receiving an unmanned aerial vehicle identifier that uniquely identifies the unmanned aerial vehicle from among other unmanned aerial vehicles; receiving a user identifier that uniquely identifies the user from among other users, wherein the user provides one or more commands via a remote control to effect operation of the UAV; and recording the one or more commands, the user identifier associated with the one or more commands, and the UAV identifier associated with the one or more commands in one or more memory storage units.
The memory storage system may store the sets of information for any period of time. In some cases, information sets may be stored indefinitely until they are deleted. Deletion of the information sets may or may not be allowed. In some cases, only the operator or administrator of the memory storage system may be allowed to interact with the data stored in the memory storage system. In some cases, only the operator of the authentication system (e.g., an air traffic control system, an authentication center) may be allowed to interact with the data stored in the memory storage system.
Alternatively, the set of information may be deleted automatically after a period of time. The time period may be established in advance. For example, the set of information may be automatically deleted after more than a predetermined period of time. Examples of the predetermined period of time may include, but are not limited to, 20 years, 15 years, 12 years, 10 years, 7 years, 5 years, 4 years, 3 years, 2 years, 1 year, 9 months, 6 months, 3 months, 2 months, 1 month, 4 weeks, 3 weeks, 2 weeks, 1 week, 4 days, 3 days, 2 days, 1 day, 18 hours, 12 hours, 6 hours, 3 hours, 1 hour, 30 minutes, or 10 minutes. In some cases, the information sets may be manually deleted only after a predetermined period of time has elapsed.
Hijack control
In some embodiments, the operation of the unmanned aerial vehicle may be compromised. In one example, a user may operate an unmanned aerial vehicle. Another user (e.g., a hijacker) may attempt to take over control of the unmanned aerial vehicle in an unauthorized manner. The systems and methods described herein may allow for detection of such attempts. The systems and methods described herein may also provide a response to such a hijacking attempt. In some implementations, information collected in a memory storage system may be analyzed to detect hijacking.
Fig. 12 shows a diagram of a scenario in which a hijacker attempts to take over control of an unmanned aerial vehicle, according to an embodiment of the invention. User 1210 may use user remote control 1215 to issue user commands to unmanned aerial vehicle 1220. Hijacker 1230 may use hijacker remote control 1235 to issue hijacker commands to unmanned aerial vehicle 1220. The hijacker command may interfere with the user command.
In some implementations, the user 1210 can be an authorized user of the unmanned aerial vehicle 1220. The user may have an initial relationship with the UAV. The user may optionally pre-register the unmanned aerial vehicle. The user may operate the UAV before the hijacker attempts to take over control of the UAV. In some cases, the user may operate the unmanned aerial vehicle if the user is an authorized user of the unmanned aerial vehicle. If the user is not an authorized user of the UAV, the user may not be allowed to operate the UAV or may be able to operate the UAV in a more limited manner.
The user identity may be authenticated. In some cases, the identity of the user may be authenticated prior to the user operating the unmanned aerial vehicle. The user may be authenticated before, while, or after determining whether the user is an authorized user of the unmanned aerial vehicle. If the user is authenticated, the user may operate the unmanned aerial vehicle. If the user is not authenticated, the user may not be allowed to operate the UAV or may be able to operate the UAV in a more limited manner.
User 1210 may control the operation of unmanned aerial vehicle 1220 using user remote control 1215. The remote control may obtain user input. The remote control may transmit user commands to the UAV. The user command may be generated based on user input. The user commands may control the operation of the UAV. For example, the user commands may control the flight (e.g., flight path, takeoff, landing) of the unmanned aerial vehicle. The user commands may control the operation of one or more payloads, the position of one or more payloads, the operation of one or more carriers, the operation of one or more sensors, the operation of one or more communication units, the operation of one or more navigation units, and/or the operation of one or more power units.
In some cases, the user command may be continuously sent to the UAV. Commands may be sent to the UAV to keep the command current, or based on the last input provided by the user, even if the user does not actively provide input at a given time. For example, if the user input includes movement of a joystick and the user holds the joystick at a particular angle, a flight command may be sent to the unmanned aerial vehicle based on the joystick angle known from the previous motion. In some cases, user input may be continuously provided to the remote control even if the user is not actively moving or physically changing anything. For example, if the user input includes tilting the remote control to a particular pose and the user does not adjust the already established pose, the user input may be continuously provided based on the continuously measured pose of the remote control. For example, if the remote control is held at angle A for an extended period of time, during that period of time the user input may be interpreted as "the pose of the remote control is angle A", and a flight command responsive to that input may be transmitted to the UAV. The user commands for the unmanned aerial vehicle can be updated in real time. The user command may reflect a user input within 1 minute, 30 seconds, 15 seconds, 10 seconds, 5 seconds, 3 seconds, 2 seconds, 1 second, 0.5 seconds, 0.1 seconds, 0.05 seconds, or 0.01 seconds of that input.
Alternatively, the user command need not be continuously sent to the UAV. The user commands may be sent at regular or irregular intervals. For example, the user command may be sent at intervals of less than or equal to one hour, 30 minutes, 15 minutes, 10 minutes, 5 minutes, 3 minutes, 1 minute, 30 seconds, 15 seconds, 10 seconds, 5 seconds, 3 seconds, 2 seconds, 1 second, 0.5 seconds, or 0.1 seconds. The user command may be sent according to a schedule. The schedule may or may not be modifiable. The user command may be sent in response to one or more detected events or conditions.
The user command may be received by the unmanned aerial vehicle 1220. When the user command received at the UAV matches the user command sent from user remote control 1215, the UAV may be capable of operating in accordance with the command from the user. The communication link between the unmanned aerial vehicle and the remote control may be considered operational when the issued command matches the received command. If the issued command and the received command match, the command has likely not been dropped. In some implementations, if the issued command and the received command do not match, the command may have been dropped (e.g., the communication link between the remote control and the UAV may have been lost), or a jamming command may have been issued.
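The following sketch, under the same assumptions, compares the commands issued by the remote control with the commands received at the UAV and returns a rough diagnosis of the link; the command labels and the `diagnose_link` helper are hypothetical.

```python
from typing import List, Optional

def diagnose_link(issued: List[str], received: List[Optional[str]]) -> str:
    """Compare commands issued by the remote control with commands received
    at the UAV and return a rough diagnosis of the communication link."""
    dropped = sum(1 for r in received if r is None)
    altered = sum(1 for i, r in zip(issued, received) if r is not None and r != i)
    if dropped == 0 and altered == 0:
        return "link operational: issued and received commands match"
    if altered > 0:
        return "possible jamming or injected commands: received commands differ from issued"
    return "possible dropped link: some issued commands never arrived"

# Example usage with hypothetical command labels.
print(diagnose_link(["turn_right", "ascend"], ["turn_right", "ascend"]))
print(diagnose_link(["turn_right", "ascend"], ["turn_right", None]))
```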
Hijacker 1230 may use hijacker remote control 1235 to control the operation of unmanned aerial vehicle 1220. The remote control may obtain the hijacker input. The remote control may transmit a hijacker command to the UAV. The hijacker command may be generated based on hijacker input. The hijacker command may control the operation of the unmanned aerial vehicle. For example, the hijacker command may control the flight (e.g., flight path, takeoff, landing) of the unmanned aerial vehicle. The hijacker command may control the operation of one or more payloads, the position of one or more payloads, the operation of one or more carriers, the operation of one or more sensors, the operation of one or more communication units, the operation of one or more navigation units, and/or the operation of one or more power units.
In some cases, the hijacker command may be continuously sent to the unmanned aerial vehicle. This may occur in a manner similar to how user commands are continually sent to the UAV. Alternatively, the hijacker command need not be continuously sent to the UAV. This may occur in a manner similar to how user commands need not be continuously sent to the UAV. The hijacker command may be sent at regular or irregular time periods. The hijacker command may be sent according to a schedule. The hijacker command may be sent in response to one or more detected events or conditions.
The hijacker command may be received by the UAV 1220. When the hijacker command received at the unmanned aerial vehicle matches the hijacker command sent from the hijacker remote control 1235, the unmanned aerial vehicle may be able to operate according to the command from the hijacker. The communication link between the unmanned aerial vehicle and the hijacker remote control may be operational when the issued command and the received command match. The hijacker may have successfully taken over control of the unmanned aerial vehicle when the command received by the unmanned aerial vehicle matches the command issued from the hijacker remote control. In some cases, the hijacker may have successfully taken over control of the unmanned aerial vehicle when the unmanned aerial vehicle performs one or more operations in accordance with the hijacker's commands. The hijacker may also have successfully taken over when the unmanned aerial vehicle does not perform one or more operations according to the user command.
When a hijacker command is received at the UAV, the UAV may or may not also receive a user command. In one type of hijacking, the communication link between the unmanned aerial vehicle and the hijacker remote control may interfere with the communication link between the unmanned aerial vehicle and the user remote control. This may prevent the user command from reaching the UAV, or may cause the user command to be received by the UAV only unreliably. The hijacker command may or may not be received by the UAV. In some embodiments, the hijacker connection may interfere with the user connection when the hijacker issues a command to take over control of the unmanned aerial vehicle. In this scenario, the UAV may receive only the hijacker command. The unmanned aerial vehicle may then operate according to the hijacker command. In another embodiment, the hijacker connection may interfere with the user connection even when the hijacker does not send commands to the unmanned aerial vehicle. The interference with the signal may be sufficient to constitute a hijacking of the unmanned aerial vehicle or a hacking intrusion on the user's operation of the unmanned aerial vehicle. The UAV may optionally not receive any commands (e.g., may stop receiving previously entered user commands). The unmanned aerial vehicle may have one or more default actions that may occur when communication with the user is lost. For example, an unmanned aerial vehicle may hover in place. In another example, the unmanned aerial vehicle may return to the mission starting point.
In another type of hijacking, the unmanned aerial vehicle may receive user commands and hijacker commands. The communication link between the unmanned aerial vehicle and the hijacker remote control does not necessarily need to interfere with the communication link between the unmanned aerial vehicle and the user remote control. The unmanned aerial vehicle can operate according to the hijacker command. The unmanned aerial vehicle may choose to operate on the hijacker's command, ignoring the user's command. Alternatively, when the UAV receives multiple sets of commands, the UAV may take one or more default actions. For example, an unmanned aerial vehicle may hover in place. In another example, the unmanned aerial vehicle may return to the mission starting point.
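A minimal sketch of one possible arbitration rule is shown below: if commands arrive from more than one source (or from no source at all), the UAV falls back to a default action such as hovering in place or returning to the mission starting point. The `DefaultAction` enumeration and `resolve_commands` helper are illustrative only.

```python
from enum import Enum, auto

class DefaultAction(Enum):
    HOVER_IN_PLACE = auto()
    RETURN_TO_START = auto()

def resolve_commands(user_cmd, hijacker_cmd, default=DefaultAction.HOVER_IN_PLACE):
    """Decide what the UAV should do when commands from more than one source arrive.

    This sketch follows only one of the behaviors described in the text: when
    conflicting command sets are received, fall back to a default action rather
    than obeying either source.
    """
    if user_cmd is not None and hijacker_cmd is None:
        return user_cmd                # normal case: only the authorized user is commanding
    if user_cmd is None and hijacker_cmd is None:
        return default                 # no commands at all (e.g., link lost): default action
    return default                     # multiple/conflicting sources: take the default action

print(resolve_commands("turn_right", None))          # -> 'turn_right'
print(resolve_commands("turn_right", "proceed"))     # -> DefaultAction.HOVER_IN_PLACE
print(resolve_commands(None, None, DefaultAction.RETURN_TO_START))
```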
A hijacker may be an individual that is not authorized to operate the unmanned aerial vehicle. The hijacker may be an individual who has not previously registered the unmanned aerial vehicle. A hijacker may be an individual who is not authorized to take over control from a user to operate the unmanned aerial vehicle. The hijacker may be authorized in other ways to operate the unmanned aerial vehicle. However, when the user has operated the unmanned aerial vehicle, the hijacker may not be authorized to interfere with the user's operation.
Systems and methods described herein may include detecting interference to one or more commands from a user. This may include hijacking of the unmanned aerial vehicle. The user command may be disturbed when the user command does not arrive at the UAV. The user command may be disturbed by the communication connection between the unmanned aerial vehicle and the hijacker. In some cases, the hijacker may attempt to interfere with the signal between the user and the unmanned aerial vehicle. Signal interference may occur in response to a hijacker communicating with the unmanned aerial vehicle. Alternatively, the hijacker need not communicate with the UAV to interfere with signals between the user and the UAV. For example, even if the hijacker device does not issue any hijacker commands, the hijacker device may broadcast a signal that may interfere with the user's communication with the UAV. Even if the user command arrives at the unmanned aerial vehicle, the user command may be disturbed. If the UAV does not perform operations in accordance with user commands, the user commands may be disturbed. For example, the unmanned aerial vehicle may choose to perform operations according to hijacker commands, rather than user commands. Or the UAV may choose to take default action or no action in place of the user command.
The hijacker command may or may not contradict the user command. Unauthorized communications may interfere with one or more commands from the user by providing contradictory commands to the UAV. In one example, the user command may effect flight of the unmanned aerial vehicle while the hijacker command may effect flight in a different manner. For example, a user command may command the UAV to turn right while a hijacker command may command the UAV to proceed straight ahead. The user command may command the unmanned aerial vehicle to rotate about a pitch axis while the hijacker command may command the unmanned aerial vehicle to rotate about a yaw axis.
In addition to detecting interference, the systems and methods herein may also allow actions to be taken in response to detected interference to one or more commands from a user. The action may include alerting the user about the disturbance. The action may include alerting one or more other parties (e.g., an operator or manager of the authentication system) regarding the disturbance. The actions may include one or more default actions of the UAV (e.g., landing, hovering in place, returning to a starting point).
One aspect of the invention relates to a method of warning a user when operation of an unmanned aerial vehicle is compromised, the method comprising: authenticating a user to enable operation of the unmanned aerial vehicle; receiving one or more commands from a remote control that receives user input to effect operation of the UAV; detecting unauthorized communication interfering with one or more commands from the user; and alerting, via the remote control, the user regarding the unauthorized communication. In a similar aspect, a non-transitory computer readable medium containing program instructions for alerting a user when operation of an unmanned aerial vehicle is compromised may be provided. The computer-readable medium may include: program instructions for authenticating a user to enable operation of the unmanned aerial vehicle; program instructions for receiving one or more commands from a remote control that receives user input to effect operation of the UAV; and program instructions for generating an alert to be provided to the user via the remote control, the alert regarding the detected unauthorized communication interfering with one or more commands from the user.
According to an embodiment of the present invention, there may be provided an unmanned aerial vehicle warning system including: a communication module; and one or more processors operatively coupled to the communication module and individually or collectively configured to: authenticating a user to enable operation of the unmanned aerial vehicle; receiving one or more commands from a remote control that receives user input to effect operation of the UAV; detecting unauthorized communication interfering with one or more commands from the user; and generating a signal to alert the user via the remote control regarding the unauthorized communication. The unmanned aerial vehicle warning module may include: one or more processors individually or collectively configured to: authenticating a user to enable operation of the unmanned aerial vehicle; receiving one or more commands from a remote control that receives user input to effect operation of the UAV; detecting unauthorized communication interfering with one or more commands from the user; and generating a signal to alert the user via the remote control regarding the unauthorized communication.
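Purely as an illustration of the flow described in the preceding aspects (authenticate, receive commands, detect unauthorized communication, alert via the remote control), a hedged Python sketch might look as follows; the `RemoteControl` class and `warn_if_compromised` function are hypothetical names, not part of the claimed system.

```python
class RemoteControl:
    """Hypothetical stand-in for the user remote control."""
    def alert(self, message: str) -> None:
        # A real remote control might show text, play a sound, or vibrate.
        print(f"[REMOTE CONTROL ALERT] {message}")

def warn_if_compromised(user_authenticated: bool,
                        user_commands_received: bool,
                        unauthorized_communication_detected: bool,
                        remote: RemoteControl) -> None:
    """Minimal sketch of the warning flow: only after the user is authenticated
    and commanding the UAV, a detected unauthorized communication triggers an
    alert through the user's remote control."""
    if not user_authenticated:
        return  # operation not enabled; nothing to warn about yet
    if user_commands_received and unauthorized_communication_detected:
        remote.alert("Unauthorized communication is interfering with your commands.")

warn_if_compromised(True, True, True, RemoteControl())
```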
The unauthorized communication may include a hijacker command. The unauthorized communication may indicate a hijacking attempt by an unauthorized user. The hijacker command may optionally include one or more commands for controlling the operation of the unmanned aerial vehicle. The unauthorized communication may also indicate a signal jamming attempt by an unauthorized user. The unauthorized communication causing the signal interference need not include a hijacker command for controlling the operation of the unmanned aerial vehicle.
An alert may be generated regarding unauthorized communication. The alert may be provided to a user attempting to operate the unmanned aerial vehicle, to another individual (e.g., an operator and/or manager of an authentication system, an individual in a law enforcement authority, an individual of emergency services), and/or to a control entity.
The warning may be provided visually, audibly, and/or tactilely. For example, a warning may be provided on a display screen of the user's remote control. For example, text or images may be provided indicating unauthorized communication. Text or images may be provided indicating that a disturbance to the user command has occurred. In another example, the alert may be provided audibly via the user remote control. The user remote control may have a speaker that produces sound. The sound may indicate unauthorized communication. The sound may indicate a disturbance to the user command. The alert may be provided tactilely via the remote control. The user remote control may vibrate or shake. Alternatively, the user remote control may vibrate, heat or cool, deliver a mild shock, or provide any other tactile indication. The haptic effect may indicate unauthorized communication. The haptic effect may indicate a disturbance to the user command.
The alert may indicate the type of unauthorized communication. The type of unauthorized communication may be selected from one or more categories of unauthorized communications. For example, the unauthorized communication may be a contradictory flight command, a signal-jamming communication, a contradictory payload operation command, or a contradictory communication (e.g., data transfer) command. The alert may visually distinguish between different types of unauthorized communications. For example, different text and/or images may be provided. The alert may audibly distinguish between different types of unauthorized communication. For example, different sounds may be provided. The different sounds may be different words or different tones. The alert may tactilely distinguish between different types of unauthorized communication. For example, different vibrations or shakes may be used.
Various ways of detecting unauthorized communication may be implemented. For example, an unauthorized communication may be detected when a user identifier associated with the unauthorized communication is not authenticated or does not indicate a user that is allowed to interact with the unmanned aerial vehicle. For example, the hijacker may not be authenticated as a user. In some cases, a separate hijacker identifier may be extracted. It may be determined that the hijacker identifier does not belong to an individual authorized to operate the unmanned aerial vehicle or to an individual authorized to take over control of operations from the user. In some cases, key information may also be used to identify the hijacker. For example, information about the hijacker key may be extracted during the identification and/or authentication process. The hijacker key may not match the user key. The hijacker may not have access to the user's key. The hijacker may thus be identified as a non-user. The hijacker communication may be identified as a non-user communication. In some cases, the hijacker may not be able to generate the user identifier and/or user key.
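One possible realization of key-based sender identification is sketched below using an HMAC tag: a sender who does not hold the registered user key cannot produce a tag that the UAV will accept. This is only an assumed scheme for illustration; the embodiments above do not prescribe a particular cryptographic construction.

```python
import hashlib
import hmac
import os

def sign_command(command: bytes, key: bytes) -> bytes:
    """Attach an HMAC tag computed with the sender's key."""
    return hmac.new(key, command, hashlib.sha256).digest()

def is_from_authorized_user(command: bytes, tag: bytes, user_key: bytes) -> bool:
    """The UAV recomputes the tag with the registered user key; a sender
    without that key (e.g., a hijacker) cannot produce a matching tag."""
    expected = hmac.new(user_key, command, hashlib.sha256).digest()
    return hmac.compare_digest(expected, tag)

user_key = os.urandom(32)       # registered user key (hypothetical provisioning)
hijacker_key = os.urandom(32)   # a different key the hijacker happens to hold

cmd = b"turn_right"
print(is_from_authorized_user(cmd, sign_command(cmd, user_key), user_key))      # True
print(is_from_authorized_user(cmd, sign_command(cmd, hijacker_key), user_key))  # False
```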
Unauthorized communication may be detected when a comparison is made between a user command issued from the remote control and a command received at the UAV. If no user command is received at the UAV, one or more interfering unauthorized communications may have occurred. Unauthorized communications may be detected when they reduce the effectiveness of the communication channel between the user remote control and the UAV. One or more contradictory commands may or may not be provided to the UAV based on the unauthorized communication. For example, the one or more contradictory commands may be hijacker flight commands, which may contradict user commands. Detection of contradictory commands received by the UAV may be performed. In other cases, contradictory commands are not received at the UAV, and the UAV may not detect the contradictory commands. Failure to receive a command from the user may be sufficient to indicate that an unauthorized communication has interfered with the user command.
Unauthorized communication may also be detected when a comparison is made between a user command issued from the remote control and an operation performed by the unmanned aerial vehicle. If the UAV does not act upon a user command, one or more interfering unauthorized communications may have occurred. Interference may occur in a first phase, during communication between the user remote control and the UAV, or in a second phase, between the time the unmanned aerial vehicle receives the command and the time the unmanned aerial vehicle executes the command. Examples of unauthorized communications in the first phase are provided above. For unauthorized communication in the second phase, the user command may be received by the UAV. However, alternative contradictory commands may also be received by the unmanned aerial vehicle. The unmanned aerial vehicle may operate according to the alternative contradictory commands. A mismatch between the command issued by the user and the action of the UAV may indicate an unauthorized communication. In another example, the UAV may receive user commands and alternative contradictory commands, but may take no action or may take a default action due to the conflicting nature of the commands. The lack of action or the default action may produce a mismatch between the user command and the operation of the UAV.
In some cases, data from a memory storage system (e.g., as illustrated in fig. 11) may be analyzed to detect unauthorized communications. Data from one or more sets of information may be analyzed. In some embodiments, commands stored in the information sets may be compared. For example, multiple sets of information may be stored, the sets of information being associated with particular interactions between the user and the UAV. The sets of information may include commands issued by the remote control, commands received by the unmanned aerial vehicle, and/or commands executed by the unmanned aerial vehicle. If the command issued by the remote control matches the command received by the unmanned aerial vehicle, there is little or no risk of interference with the communication between the user's remote control and the unmanned aerial vehicle. If the command issued by the remote control does not match the command received by the unmanned aerial vehicle, there is a high risk of interference with the communication between the remote control and the unmanned aerial vehicle. The commands may not match if the unmanned aerial vehicle receives a different command or if the unmanned aerial vehicle does not receive a command. If the command issued by the remote control matches the command executed by the UAV, there is little or no risk that unauthorized communication will interfere with the user's command. If the commands issued by the remote control do not match the operation of the unmanned aerial vehicle, the risk of unauthorized interference with the user commands is high. In some cases, errors in unmanned aerial vehicle operation may occur without hijacker intervention. For example, an unmanned aerial vehicle may receive user commands, but may not be able to execute in accordance with the commands. Commands or other data from the memory storage system may be compared to detect interference with user commands.
In some cases, data such as command data may be pulled from a separate device without residing in a memory storage system. For example, user command data may be pulled from a remote control and/or command data from an unmanned aerial vehicle may be pulled and a comparison may be made.
A hijacker may be an individual who is not authorized to take over control of the unmanned aerial vehicle from a user. However, in other cases, as described elsewhere herein, a takeover of control may be permitted. For example, a user with a higher priority level or a higher level of operation may be able to take over control. The user may be an administrative user. The user may be a member of law enforcement or emergency services. The user may have any of the characteristics described elsewhere herein. The UAV may react differently when the user is authorized to take over control from the initial user. Any description herein of an authorized user may also apply to autonomous or semi-autonomous systems that may take over control of the unmanned aerial vehicle from the user. For example, a computer may control the unmanned aerial vehicle in some cases, and may operate the unmanned aerial vehicle according to one or more sets of code, logic, or instructions. The computer may operate the unmanned aerial vehicle according to a set of flight controls.
The system may be able to distinguish between unauthorized takeover and authorized takeover. If the takeover is authorized, the authorized user may be allowed to continue to take over control of the UAV. When the takeover is not authorized, then communications between the unmanned aerial vehicle and the initial user may be re-established, communications between the unmanned aerial vehicle and the unauthorized user may be blocked, a warning may be provided to one or more individuals, the unmanned aerial vehicle may assume one or more default flight responses, and/or an authorized entity may take over control of the unmanned aerial vehicle.
Examples of scenarios in which authorized takeover may occur are presented herein. The UAV may transmit a message containing a signature. The message may include various types of information regarding flight control commands, the GPS location of the unmanned aerial vehicle, and/or time information. Any other information about the unmanned aerial vehicle or its operation may be transmitted (e.g., information about the unmanned aerial vehicle payload, information about the location of the unmanned aerial vehicle payload, information about data collected using the payload, information about one or more sensors of the unmanned aerial vehicle, information about data collected using the one or more sensors, information about communications of the unmanned aerial vehicle, information about navigation of the unmanned aerial vehicle, or information about power usage of the unmanned aerial vehicle). In some cases, the message may be received by an air traffic control system.
If the air traffic control system determines, from the information sent (e.g., broadcast) by the unmanned aerial vehicle, that the unmanned aerial vehicle has entered a restricted area, the air traffic control system may alert the unmanned aerial vehicle or warn it against continuing to operate in the area. The restricted area may or may not be determined by means of a geo-fencing device as described elsewhere herein. The restricted area may optionally be an allocated volume or a space above an allocated region. The user may not be authorized to maneuver the unmanned aerial vehicle into the restricted area. The unmanned aerial vehicle may not be authorized to enter the restricted area. In some embodiments, no unmanned aerial vehicle is authorized to enter the restricted area. Alternatively, some unmanned aerial vehicles may be authorized to enter the restricted area, while the area remains off-limits to the unmanned aerial vehicle controlled by this particular user.
An alert may be sent to the user via a communication connection between the air traffic control system and the user. For example, an alert may be sent to a user remote control that is interacting with the user. The user may be the individual operating the UAV when the UAV enters the restricted area. Optionally, an alert may also be sent first to the UAV, and then forwarded to the user via a communication connection between the UAV and the user. The alert may be relayed to the user using any intermediary device or network. If the user does not then discontinue the unauthorized flight of the unmanned aerial vehicle, the air traffic control system can take over the unmanned aerial vehicle. The unmanned aerial vehicle may be given a period of time to allow the unmanned aerial vehicle to exit the restricted area. If the UAV does not leave the restricted area within the time period, the air traffic control system may take over control of the operation of the UAV. If the UAV continues further into the restricted area and does not initiate a course change, the air traffic control system may take over control of the operation of the UAV. Any description herein of an unmanned aerial vehicle entering a restricted area may apply to any other activity of the unmanned aerial vehicle that is not authorized under a set of flight controls for the unmanned aerial vehicle. This may include, for example, the unmanned aerial vehicle collecting images with a camera when the unmanned aerial vehicle is located in an area where photography is not permitted. The unmanned aerial vehicle may be similarly alerted. The UAV may or may not be given some time to comply before the air traffic control system takes over.
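A simplified supervision loop consistent with this description might look as follows; the callables `in_restricted_area`, `alert_user`, and `take_over`, as well as the 30-second grace period, are assumptions made for the sketch.

```python
import time

def supervise_restricted_area(in_restricted_area, alert_user, take_over,
                              grace_period_s: float = 30.0,
                              poll_interval_s: float = 1.0) -> None:
    """Sketch of the supervision loop described above: alert the user when the
    UAV is inside a restricted area, then take over control if the UAV has not
    left the area within a grace period."""
    if not in_restricted_area():
        return
    alert_user("UAV has entered a restricted area; please leave immediately.")
    deadline = time.monotonic() + grace_period_s
    while time.monotonic() < deadline:
        if not in_restricted_area():
            return                      # the user complied; no takeover needed
        time.sleep(poll_interval_s)
    take_over()                         # grace period expired: assume control

# Example usage with hypothetical stand-ins.
supervise_restricted_area(
    in_restricted_area=lambda: False,   # here the UAV is not in a restricted area
    alert_user=print,
    take_over=lambda: print("taking over control"),
)
```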
After initiating the process of taking over control, the air traffic control system may use a digital signature when sending remote control commands to the UAV. Such remote control commands may carry a trusted digital signature and digital certificate of the air traffic control system, which may prevent forged control commands. The digital signature and digital certificate of the air traffic control system cannot be forged.
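As an assumed concrete example (the embodiments do not mandate a specific signature scheme), the sketch below uses Ed25519 signatures from the third-party `cryptography` package: the air traffic control system signs each takeover command, and the UAV accepts only commands whose signature verifies against the system's public key.

```python
# Requires the third-party "cryptography" package.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# Hypothetical ATC key pair; in practice the public key would be distributed
# to the UAV together with the ATC digital certificate.
atc_private_key = Ed25519PrivateKey.generate()
atc_public_key = atc_private_key.public_key()

def sign_atc_command(command: bytes) -> bytes:
    """ATC side: sign a takeover command."""
    return atc_private_key.sign(command)

def uav_accepts(command: bytes, signature: bytes) -> bool:
    """UAV side: accept the command only if the ATC signature verifies."""
    try:
        atc_public_key.verify(signature, command)
        return True
    except InvalidSignature:
        return False

cmd = b"exit_restricted_area_heading_090"
print(uav_accepts(cmd, sign_atc_command(cmd)))   # True: genuine ATC command
print(uav_accepts(cmd, b"\x00" * 64))            # False: forged signature rejected
```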
The flight control unit of the unmanned aerial vehicle may recognize the remote control command from the air traffic control system. The priority of the remote control command from the air traffic control system may be set higher than the priority of the remote control command from the user. Thus, the air traffic control system may be at a higher level of operation than the user. These commands from the air traffic control system may be recorded by the authentication center. The original user of the unmanned aerial vehicle may also be informed of commands from the air traffic control system. The original user may be informed that the air traffic control system is taking over. The original user may or may not know the details of how the air traffic control system controls the unmanned aerial vehicle. In some embodiments, the user remote control may show information about how the air traffic control system controls the unmanned aerial vehicle. For example, when the air traffic control system controls the unmanned aerial vehicle, data such as the positioning of the unmanned aerial vehicle may be shown in real time on the remote control. The commands from the air traffic control system may be provided by an operator of the air traffic control system. For example, the air traffic control system may utilize one or more management users having the ability to take over control of the unmanned aerial vehicle. In other cases, commands from the air traffic control system may be provided automatically, by means of one or more processors, without human intervention. The commands may be generated by means of one or more processors according to one or more parameters. For example, if an unmanned aerial vehicle enters a restricted area into which the unmanned aerial vehicle is not authorized to enter, the one or more processors of the air traffic control system may generate a return path for the unmanned aerial vehicle to exit the restricted area. In another example, the air traffic control system may generate a path for the unmanned aerial vehicle to initiate a descent.
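A minimal sketch of priority-based command arbitration is shown below; the numeric priority levels and the `Command` structure are illustrative assumptions.

```python
from dataclasses import dataclass

# Hypothetical priority levels; a higher value wins.
PRIORITY = {"government": 3, "atc": 2, "user": 1, "unknown": 0}

@dataclass
class Command:
    source: str
    action: str

def select_command(commands):
    """Pick the command whose source has the highest operation level, so that
    an ATC command overrides a private user command, as described above."""
    if not commands:
        return None
    return max(commands, key=lambda c: PRIORITY.get(c.source, 0))

pending = [Command("user", "continue_straight"), Command("atc", "return_and_exit_area")]
print(select_command(pending))   # the ATC command is executed
```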
The air traffic control system may guide the unmanned aerial vehicle away from the restricted area. Control may then be returned to the original user. Alternatively, the air traffic control system may allow the unmanned aerial vehicle to land properly. Thus, authorized takeover may be allowed in various scenarios. In contrast, an unauthorized takeover can be detected. One or more responses to an unauthorized takeover may be undertaken as described elsewhere herein. For example, an authorized entity may take over control of the unmanned aerial vehicle from an unauthorized hijacker. The air traffic control system may be at a higher level of operation than an unauthorized hijacker. The air traffic control system may be able to take over control of the unmanned aerial vehicle from an unauthorized hijacker. In some embodiments, the air traffic control system may be granted a higher level of operation. Alternatively, the air traffic control system may be granted a high level of operation, while one or more government entities may have an even higher level of operation. The air traffic control system may have a higher level of operation than all private users.
Deviations in the behavior of the unmanned aerial vehicle can be detected. Deviations in the behavior of the unmanned aerial vehicle may occur due to the activity of one or more hijackers. Deviations in the behavior of the unmanned aerial vehicle may also occur due to a malfunction of the unmanned aerial vehicle and/or the user remote control. In one example, the unmanned aerial vehicle behavior may include flight. Any description herein of deviation of unmanned aerial vehicle flight may be applicable to any other type of deviation of unmanned aerial vehicle behavior, such as deviation of payload behavior, payload positioning, carrier operation, sensor operation, communication, navigation, and/or power usage.
FIG. 13 illustrates an example of unmanned aerial vehicle flight deviation, according to an embodiment of the present invention. Unmanned aerial vehicle 1300 may have a predicted path of travel 1310. However, the actual path 1320 of the unmanned aerial vehicle may be different from the predicted path. At some point in time, a predicted position 1330 of the unmanned aerial vehicle may be determined. However, the actual position 1340 of the UAV may differ. In some embodiments, the distance d between the predicted position and the actual position may be determined.
A predicted path 1310 for the unmanned aerial vehicle to travel may be determined. Data from the user's remote control may be used to determine information about the predicted path. For example, a user may provide input to a remote control, and the remote control may provide one or more flight commands to the UAV based on the user input. The flight command from the remote control may be received by an unmanned aerial vehicle flight control unit, which may send one or more control signals to an unmanned aerial vehicle propulsion unit to implement the flight command. In some embodiments, one or more flight commands sent by the remote control may be used to determine a predicted path for the UAV. For example, if the flight command commands the unmanned aerial vehicle to continue straight ahead, it may be expected that the predicted path will continue straight ahead. In calculating the predicted path, the attitude/orientation of the unmanned aerial vehicle may be considered. The flight command may result in the maintenance or adjustment of the orientation of the UAV, which may be used to affect the flight path.
At any given point in time, a predicted position 1330 of the unmanned aerial vehicle may be determined based on the predicted path. In calculating the predicted position, the predicted movement of the unmanned aerial vehicle, such as the predicted velocity and/or the predicted acceleration, may be considered. For example, if the predicted path is determined to be straight ahead and the speed of the unmanned aerial vehicle remains stable, the predicted position may be calculated.
An actual path 1320 of travel of the UAV may be determined. Data from one or more sensors may be used to determine information about the actual path. For example, the unmanned aerial vehicle may have one or more onboard GPS sensors that may be used to determine the coordinates of the unmanned aerial vehicle in real time. In another example, the unmanned aerial vehicle may use one or more inertial sensors, visual sensors, and/or ultrasonic sensors to provide navigation of the unmanned aerial vehicle. Multiple sensor fusion may be implemented to determine the position of the unmanned aerial vehicle. At any given point in time, the actual position 1340 of the UAV may be determined based on the sensor data. The unmanned aerial vehicle may carry one or more sensors, such as those described elsewhere herein. One or more of the sensors may be any of the sensors described elsewhere herein. In some cases, data from onboard sensors may be used to determine the actual path and/or position of the unmanned aerial vehicle. In some cases, one or more off-board sensors may be used to determine the actual path and/or position of the unmanned aerial vehicle. For example, multiple cameras may be provided at known locations and may capture images of the unmanned aerial vehicle. The image may be analyzed to detect a position of the unmanned aerial vehicle. In some cases, a combination of sensors on-board the unmanned aerial vehicle and sensors off-board the unmanned aerial vehicle may be used to determine the actual path and/or position of the unmanned aerial vehicle. The one or more sensors may operate independently of the UAV and may optionally be uncontrollable by a user. The one or more sensors may be located on-board or off-board the UAV while still operating independently of the UAV. For example, an unmanned aerial vehicle may have a GPS tracking device that a user may not be able to control.
A distance d between the predicted position of the unmanned aerial vehicle and the actual position of the unmanned aerial vehicle may be determined. In some cases, the distance d may be calculated by means of one or more processors. The coordinate difference between the predicted position of the unmanned aerial vehicle and the actual position of the unmanned aerial vehicle can be calculated. The coordinates may be provided as global coordinates. Alternatively, the coordinates may be provided as local coordinates, and one or more transformations may be performed into a global coordinate system or into a common local coordinate system.
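For illustration, the distance d could be approximated as follows when the predicted and actual positions are given as GPS coordinates plus altitude; the equirectangular approximation and the sample coordinates are assumptions of the sketch, not values from the disclosure.

```python
import math

EARTH_RADIUS_M = 6371000.0

def deviation_m(pred_lat, pred_lon, pred_alt, act_lat, act_lon, act_alt) -> float:
    """Approximate 3-D distance d (meters) between the predicted and actual
    positions, using a local equirectangular projection for the horizontal
    part. A production system might instead use a proper geodesic."""
    lat0 = math.radians((pred_lat + act_lat) / 2.0)
    dx = math.radians(act_lon - pred_lon) * math.cos(lat0) * EARTH_RADIUS_M
    dy = math.radians(act_lat - pred_lat) * EARTH_RADIUS_M
    dz = act_alt - pred_alt
    return math.sqrt(dx * dx + dy * dy + dz * dz)

# Example with hypothetical coordinates: predicted vs. GPS-reported position.
print(round(deviation_m(22.5431, 113.9450, 50.0, 22.5432, 113.9452, 48.0), 1))
```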
The one or more processors for determining the distance may be located on the UAV, on the remote control, or may be provided external to the UAV and the remote control. In some cases, the one or more processors may be part of an air traffic control system or any other part of an authentication system. In some embodiments, the one or more processors may be part of a flight supervision module, a flight control module, or a traffic management module.
In one example, commands from a remote control may be transmitted to the unmanned aerial vehicle and may also be detected by the air traffic control system. The air traffic control system may receive commands directly from the remote control or via one or more intermediate devices or networks. A memory storage system (e.g., the memory storage system in fig. 11) may receive the commands. The memory storage system may be part of, or accessible by, the air traffic control system. Data from sensors (located onboard and/or off-board the UAV) may be received by the air traffic control system. The air traffic control system may receive sensor data directly from the sensors, via the unmanned aerial vehicle, or via any other intermediate device or network. The memory storage system may or may not receive the sensor data.
In some embodiments, there may be some natural deviation between the predicted path and the actual path of the unmanned aerial vehicle. However, when the deviation is large, there is an increased likelihood that the deviation is due to hijacking, malfunction, or any other hazard to the unmanned aerial vehicle. This may be applicable to any type of unmanned aerial vehicle behavior deviation and need not be limited to flight. In some cases, the distance d may be evaluated to determine the risk of hijacking, malfunction, or any other hazard to the unmanned aerial vehicle. A risk indication may be determined based on the distance. In some cases, the risk indication may be a binary indication of risk (e.g., presence or absence of risk). For example, if the distance remains below a predetermined value, no risk indication may be provided. If the distance exceeds the predetermined value, an indication of the risk of hijacking or failure may be provided. In other cases, the risk indication may be provided as one or more categories or levels. For example, if the distance meets or exceeds a first threshold, a high risk level may be indicated. An intermediate risk level may be indicated if the distance falls between the first threshold and a lower second threshold. If the distance falls below the second threshold, a low risk level may be indicated. In some cases, the risk level may be substantially continuous, or may have many gradations. For example, the risk indication may be quantitative. A risk percentage may be provided based on the distance. For example, based on the deviation, a 74% risk of hijacking or failure may be provided.
In some cases, the deviation may be the only factor by which an indication of risk is provided. In other embodiments, other factors in combination with the deviation may be used to determine the risk indication to be presented. For example, under a first set of environmental conditions, a deviation of a particular distance may indicate a high risk of the unmanned aerial vehicle being compromised (e.g., hijacked, failed), while under a second set of environmental conditions, the same particular distance may indicate a low risk of the unmanned aerial vehicle being compromised. For example, on a calm day, a deviation of 10 meters from the predicted location may indicate that some form of hijacking or malfunction has occurred. However, on a windy day, a deviation of 10 meters may present a lower risk of hijacking or failure, as the wind will make deviations in the flight path more likely.
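A hedged sketch of such a threshold-based risk indication, including a simple environmental adjustment for wind, is given below; all threshold values and the wind scaling factor are illustrative assumptions.

```python
def risk_indication(deviation_m: float,
                    high_threshold_m: float = 15.0,
                    low_threshold_m: float = 5.0,
                    wind_speed_mps: float = 0.0) -> str:
    """Map a flight deviation to a coarse risk level, loosening the thresholds
    in windy conditions since wind makes benign deviations more likely.
    Threshold values here are illustrative, not from the source."""
    scale = 1.0 + 0.1 * wind_speed_mps     # hypothetical environmental adjustment
    high = high_threshold_m * scale
    low = low_threshold_m * scale
    if deviation_m >= high:
        return "high risk"
    if deviation_m >= low:
        return "intermediate risk"
    return "low risk"

print(risk_indication(10.0, wind_speed_mps=0.0))   # calm day: intermediate risk
print(risk_indication(10.0, wind_speed_mps=12.0))  # windy day: low risk
```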
Factors that may be considered in determining a risk indication may include environmental conditions (e.g., ambient climate (wind, precipitation, temperature), traffic flow, environmental complexity, obstacles), movement of the unmanned aerial vehicle (e.g., speed, acceleration), communication conditions (e.g., signal strength, likelihood of a signal dropping out or a jamming signal being present), responsiveness of the unmanned aerial vehicle type (e.g., turning, stability), or any other factor.
A risk indication may be provided that the UAV is not operating in accordance with one or more flight commands. This may include the degree of risk that the unmanned aerial vehicle has been compromised. The operations at issue may include flight operations, payload operations, carrier operations, sensor operations, communications, navigation, power usage, or any other type of unmanned aerial vehicle operation described herein. A greater deviation in unmanned aerial vehicle behavior may correspond to a higher degree of risk that the unmanned aerial vehicle is not operating according to the one or more flight commands. In some embodiments, the type of deviation in unmanned aerial vehicle behavior is evaluated to determine the degree of risk. For example, a flight deviation of the unmanned aerial vehicle may be treated differently than a deviation in payload activity. In some embodiments, a positional deviation of the unmanned aerial vehicle may be treated differently from a velocity deviation of the unmanned aerial vehicle.
Aspects of the present invention may relate to a method of detecting flight deviations of an unmanned aerial vehicle, the method comprising: receiving one or more flight commands provided by a user from a remote control; calculating, with the aid of one or more processors, a predicted position of the UAV based on the one or more flight commands; detecting an actual position of the UAV by means of one or more sensors; comparing the predicted location to the actual location to determine a deviation in the behavior of the UAV; and providing an indication of risk that the UAV is not operating in accordance with the one or more flight commands based on the deviation in UAV behavior. Additionally, a non-transitory computer readable medium containing program instructions for detecting flight deviations of an unmanned aerial vehicle may be provided, the computer readable medium comprising: program instructions for calculating a predicted position of the UAV based on one or more flight commands provided by a user from a remote control; program instructions for detecting an actual position of the UAV by means of one or more sensors; program instructions for comparing the predicted position to the actual position to determine the deviation in unmanned aerial vehicle behavior; and program instructions for providing, based on the unmanned aerial vehicle behavioral deviation, an indication of risk that the unmanned aerial vehicle is not operating in accordance with the one or more flight commands.
According to an embodiment of the invention, an unmanned aerial vehicle flight deviation detection system may be provided. The flight deviation detection system may include: a communication module; and one or more processors operatively coupled to the communication module and individually or collectively configured to: receiving one or more flight commands provided by a user from a remote control; calculating a predicted position of the UAV based on the one or more flight commands; detecting an actual position of the UAV by means of one or more sensors; comparing the predicted location to the actual location to determine a deviation in the behavior of the UAV; and generating a signal to provide an indication of risk that the UAV is not operating in accordance with the one or more flight commands based on the UAV behavior deviation. An unmanned aerial vehicle flight deviation detection module may include: one or more processors individually or collectively configured to: receiving one or more flight commands provided by a user from a remote control; calculating a predicted position of the UAV based on the one or more flight commands; detecting an actual position of the UAV by means of one or more sensors; comparing the predicted location to the actual location to determine a deviation in the behavior of the UAV; and generating a signal to provide an indication of risk that the UAV is not operating in accordance with the one or more flight commands based on the UAV behavior deviation.
A risk indication may be provided as a warning. The user may be provided with a warning via the user's remote control. The user may then be able to choose to take an action according to the risk indication. A risk indication may also be provided to an air traffic control system separate from the remote control and the UAV. The air traffic control system may then be able to determine whether to take an action based on the risk indication. For example, the air traffic control system may take over control of the unmanned aerial vehicle. The air traffic control system may require the user to confirm whether the user is still controlling the UAV before determining whether to take over control of the UAV. For example, if the user confirms that the unmanned aerial vehicle is operating according to the user's commands, the air traffic control system may determine not to take over control of the unmanned aerial vehicle. If the user does not confirm that the UAV is operating according to the user's commands, the air traffic control system may take over control of the UAV.
In some embodiments, the risk indication may be presented to the unmanned aerial vehicle itself. The unmanned aerial vehicle may have one or more onboard protocols in place that may initiate one or more default flight responses from the unmanned aerial vehicle. For example, if an unmanned aerial vehicle receives a report that its flight control is compromised, the unmanned aerial vehicle may automatically initiate a landing sequence, may automatically hover in place or fly in a holding pattern, may automatically return to a mission starting point, or may automatically fly to a designated "home" location. In some embodiments, the mission starting point may be the location at which the UAV took off. The "home" location may be a predetermined set of coordinates that may be stored in the memory of the UAV. In some cases, the mission starting point may be set as the home location. In some cases, the home location may be the location of a user or a user remote control. Even if the user moves around, the home location of the remote control may be updated and the unmanned aerial vehicle may be able to find the remote control. In some cases, the user may manually enter home coordinates or designate a street address as the home location. The unmanned aerial vehicle may block commands from external sources while undergoing a default flight response procedure. In some cases, the unmanned aerial vehicle may block commands from private users while undergoing a default flight response procedure. The unmanned aerial vehicle may or may not block commands from the air traffic control system or a control entity while undergoing a default flight response procedure.
The risk indication that the UAV is not operating in accordance with the one or more flight commands may include information regarding a type of hazard for the UAV. For example, the risk indication that the unmanned aerial vehicle is not operating in accordance with the one or more flight commands may include a risk indication that the unmanned aerial vehicle is hijacked. In another example, the risk indication that the unmanned aerial vehicle is not operating according to one or more flight commands may include a risk indication that a signal from a remote control is disturbed. In another example, the risk indication that the UAV is not operating in accordance with the one or more flight commands may include a risk indication that a fault has occurred on board the UAV. The warning may convey information regarding the type of hazard for the unmanned aerial vehicle. The alert may convey information about the degree of risk. The warning may convey information regarding the degree of risk for various types of unmanned aerial vehicle hazards. For example, the warning may indicate that the likelihood of some form of hazard is 90%, and that 85% of the hazards are due to hijacking, 15% of the hazards are due to communication interference, and 0% of the hazards are due to a fault on board the UAV.
The risk of hijacking may be higher if the commands received by the unmanned aerial vehicle are different from the commands issued by the remote control. The risk of communication interference may be higher if commands issued by the remote control are missing from the commands received by the unmanned aerial vehicle. If the commands received by the UAV match the commands issued via the remote control, but the UAV does not operate in accordance with the received commands, the risk of an onboard fault may be higher.
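These heuristics can be sketched as a small classifier, shown below; the three-way classification and the command labels are illustrative assumptions rather than a prescribed algorithm.

```python
from typing import List, Optional

def classify_hazard(issued: List[str],
                    received: List[Optional[str]],
                    executed: List[Optional[str]]) -> str:
    """Rough classification following the heuristics above:
    - received differs from issued       -> hijacking more likely
    - issued commands missing on arrival -> communication interference more likely
    - received matches but not executed  -> onboard fault more likely
    """
    altered = any(r is not None and r != i for i, r in zip(issued, received))
    missing = any(r is None for r in received)
    not_executed = any(e != r for r, e in zip(received, executed) if r is not None)
    if altered:
        return "elevated risk of hijacking"
    if missing:
        return "elevated risk of communication interference"
    if not_executed:
        return "elevated risk of onboard malfunction"
    return "no elevated risk detected"

print(classify_hazard(["ascend"], ["descend"], ["descend"]))   # hijacking
print(classify_hazard(["ascend"], [None], [None]))             # interference
print(classify_hazard(["ascend"], ["ascend"], [None]))         # onboard malfunction
```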
Any description herein of hijacking may also apply to hacking. A hacker may intercept one or more communications transmitted to or from the unmanned aerial vehicle. A hacker may intercept the data collected by the unmanned aerial vehicle. A hacker may intercept data from one or more sensors or payloads of the unmanned aerial vehicle. For example, a hacker may intercept data from an image capture device on an unmanned aerial vehicle. A hacker may thus attempt to steal the obtained data. A hacker may thus violate the privacy of the operator of the unmanned aerial vehicle. Hackers may also intercept communications that are passed to the unmanned aerial vehicle. For example, a hacker may intercept one or more commands sent from a user remote control to the unmanned aerial vehicle. A hacker may intercept the commands to determine how the unmanned aerial vehicle will operate. The hacker may use the intercepted commands to determine the location of the unmanned aerial vehicle (which might not otherwise be apparent) or other activity of the unmanned aerial vehicle.
Interception of the communication (uplink or downlink) by a hacker may not interfere with the rest of the communication. For example, when a hacker intercepts an image stream from an unmanned aerial vehicle, the intended recipient of the image may still receive the image. The intended recipient may otherwise be unaware that an interception has occurred. Alternatively, the interception of the communication may interfere with the rest of the communication. For example, when an image is intercepted, the intended recipient of the image may not receive the image. The systems and methods described herein may help detect and/or prevent hacker intrusion.
For example, a device may need to be authenticated before any communication with the various components of the unmanned aircraft system takes place. For example, the device may need to be authenticated prior to receiving a communication from the UAV and/or the remote control. A device may receive a communication only if the device is authorized to receive the communication. A hacker may not be authorized to receive the communication and may therefore not be able to receive the communication. Similarly, if a hacker attempts to send out a false alternate communication, the hacker's identity may not correspond to an authorized user and the hacker may be prevented from sending out the false communication. Similarly, if a spurious communication is issued, or an attempt is made to issue a spurious communication, from an unauthorized user, a warning may be provided to the authorized user. In some implementations, the communication may be encrypted. In some cases, only authorized and/or authenticated users may be able to decrypt the encrypted communication. For example, even if a hacker were to intercept the communication, the hacker may not be able to decrypt and interpret the communication. In some embodiments, decryption may require a user's key, which may be stored only in the physical memory of an authorized device. Thus, it may be difficult for a hacker to obtain a copy of the key and/or to impersonate an authorized user.
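As one assumed illustration of encrypted, key-gated communication (the disclosure does not specify a cipher), the sketch below uses Fernet symmetric encryption from the third-party `cryptography` package: only a device holding the user key can decrypt an intercepted message.

```python
# Requires the third-party "cryptography" package.
from cryptography.fernet import Fernet, InvalidToken

user_key = Fernet.generate_key()        # stored only on authorized devices
authorized_channel = Fernet(user_key)

token = authorized_channel.encrypt(b"image_frame_0001")   # hypothetical downlink payload

# An authorized recipient holding the key can decrypt.
print(authorized_channel.decrypt(token))

# A hacker without the key cannot decrypt an intercepted token.
hacker_channel = Fernet(Fernet.generate_key())
try:
    hacker_channel.decrypt(token)
except InvalidToken:
    print("intercepted communication could not be decrypted")
```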
Personalized evaluation
Activities of the user and/or the unmanned aerial vehicle may be evaluated. For example, the activities of a user of the unmanned aerial vehicle may be evaluated. Since the user may be uniquely identifiable, the user's activities may be tied to a unique user identity. Thus, activities performed by the same user may be associated with the user. In one example, a user's activities may be related to the user's previous flight mission. Various testing, certification, or training exercises may also be associated with the user. The user's activity may also refer to any failed attempts by the user to engage in a task, interfere with the operation of another user's unmanned aerial vehicle, and/or intercept communications with another user's unmanned aerial vehicle. In some implementations, the user may be authenticated prior to associating the activity with the user identifier.
The activity of the unmanned aerial vehicle can be evaluated. The unmanned aerial vehicle may also be uniquely identifiable, and the activities of the unmanned aerial vehicle may be tied to a unique unmanned aerial vehicle identity. Thus, activities performed by the same UAV may be associated with the UAV. In one example, the activities of the unmanned aerial vehicle may be related to previous flight missions of the unmanned aerial vehicle. Various maintenance activities, diagnostics or certification may also be associated with the unmanned aerial vehicle. The activity of the UAV may also include any errors, faults, or incidents. In some implementations, the UAV may be authenticated prior to associating the activity with the UAV identifier.
One or more activities of the user and/or the UAV may be evaluated. In some cases, evaluating may include providing a qualitative evaluation. For example, one or more records of any activity of the user or the UAV may be associated with the user or the UAV and/or the corresponding activity of the user or the UAV. For example, if an accident involves an unmanned aerial vehicle, a record may be associated with the unmanned aerial vehicle regarding the accident, how the accident occurred, and who was at fault. In another example, if a user has previously performed many flight missions at high wind speeds and successfully navigated through different terrain, a record may be provided regarding these achievements. One or more types of ratings may be associated with a user and/or the corresponding activities of the user. For example, if a user performs many difficult tasks, the user may have an evaluation of "expert user" associated with the user. If the unmanned aerial vehicle experiences many faults or errors, the unmanned aerial vehicle may have an assessment of "high risk of failure" associated with the unmanned aerial vehicle.
In some cases, evaluating may include providing a quantitative evaluation. For example, the user and/or the UAV activity may receive a rating, such as an alphabetical or numerical rating. The rating may relate to any activity of the user or the UAV and may be associated with the user or the UAV and/or the corresponding activity of the user or the UAV. For example, when a user completes a task, the user may receive a rating or score on how the user performed during the task. For example, for a first task the user may receive a 7.5 rating, while for a second task the user may receive a 9.8 rating, indicating that the user performed better during the second task. Other factors, such as task difficulty, may play a role. In some cases, the user may receive a higher rating for successfully completing a more difficult task. In another example, a user may undergo a skill test or certification test and may receive a numerical score indicating how the user performed. The UAV may receive a rating based on how a mission is completed. For example, an unmanned aerial vehicle may receive a lower rating if the unmanned aerial vehicle fails during a mission than if the unmanned aerial vehicle did not fail during the mission. The unmanned aerial vehicle may be rated higher if it undergoes periodic maintenance, as compared to an unmanned aerial vehicle that does not undergo periodic maintenance.
The user and/or UAV may have a higher overall rating when the user and/or UAV successfully completes activities in a positive manner. The user and/or UAV may have a lower overall rating when the user and/or UAV fails to complete an activity or engages in behavior that may be suspicious. Thus, the user and/or the unmanned aerial vehicle may have a reputation score that is based on the activity of the user and/or the unmanned aerial vehicle.
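A reputation score of this kind might be maintained as sketched below; the outcome categories, weights, and 0-10 range are illustrative assumptions.

```python
def update_reputation(score: float, outcome: str, difficulty: float = 1.0) -> float:
    """Adjust a reputation score after a mission.

    The weights are illustrative only: a successful mission raises the score
    (more for harder missions), while a failure, crash, or suspicious behavior
    lowers it. The score is kept within a 0-10 range."""
    deltas = {
        "success": +0.2 * difficulty,
        "failure": -0.5,
        "crash": -1.0,
        "suspicious_behavior": -1.5,
    }
    score += deltas.get(outcome, 0.0)
    return max(0.0, min(10.0, score))

score = 7.0
score = update_reputation(score, "success", difficulty=2.0)  # harder mission completed
score = update_reputation(score, "crash")
print(round(score, 2))   # 6.4
```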
In some embodiments, an evaluation system may provide a set of evaluations for the user and/or the unmanned aerial vehicle. The evaluation (e.g., a qualitative evaluation and/or a quantitative evaluation) may be determined automatically, by means of one or more processors, without requiring human interaction. The assessment may be determined according to one or more parameters or algorithms and data regarding the activity of the user and/or the unmanned aerial vehicle. For example, each successfully completed task may automatically adjust the user and/or unmanned aerial vehicle evaluation toward a more positive result. Each failed task or crash may automatically adjust the user and/or unmanned aerial vehicle rating toward a more negative result. The assessment may thus be objective.
Alternatively or additionally, the evaluation may be provided by one or more human users. For example, the user may rate himself or herself. A user may evaluate the unmanned aerial vehicle that the user is maneuvering. In other cases, the user's peers may evaluate him or her. A peer may evaluate the unmanned aerial vehicle operated by the user. For example, a first user may be operating a first unmanned aerial vehicle. A second user may observe the first user and notice that the first user is engaged in poor flight behavior (e.g., the first user's unmanned aerial vehicle suddenly dives toward the second user or frightens the second user). The second user may provide a negative rating of the first user. In another example, the second user may observe that the first user is engaged in positive flight behavior (e.g., performing a difficult maneuver, assisting the second user) and may provide a positive assessment of the first user.
The user and/or UAV ratings may be viewed by other users. For example, the first user may view the overall rating of the second user and vice versa. A user may view the user's own ratings. The user may take action to attempt to increase the user's rating. In some embodiments, the system may allow certain user activities only if the user rating reaches a threshold level. For example, a user may operate in certain areas only if the user rating reaches a threshold level. The user may be permitted to maneuver the unmanned aerial vehicle in certain areas only when the user rating is 7.0 or higher. The level of flight restriction may depend on the user and/or unmanned aerial vehicle evaluation. In some cases, the user and/or UAV rating may indicate a user type and/or UAV type, or vice versa. A lower level of flight restrictions may be provided when the user has a higher user rating, and a higher level of flight restrictions may be provided when the user has a lower user rating. The set of flight controls for a user may be more stringent when the user has a lower user rating, and less stringent when the user has a higher user rating. A lower level of flight restrictions may be provided when the UAV has a higher UAV rating, and a higher level of flight restrictions may be provided when the UAV has a lower UAV rating. The set of flight controls for the UAV may be more stringent when the UAV has a lower UAV rating, and less stringent when the UAV has a higher UAV rating.
Flight monitoring
It may be desirable for the air traffic system to be aware of the unmanned aerial vehicle location. The unmanned aerial vehicle may send the location information to an air traffic control system. In some cases, it may be preferable for security and/or safety purposes to have the unmanned aerial vehicle report position information. The unmanned aerial vehicle may periodically and actively report its current location and course to the air management system. However, in some cases, the unmanned aerial vehicle may not comply with the reporting requirement. This may occur when communication between the unmanned aerial vehicle and the air traffic system is lost, or when the unmanned aerial vehicle intentionally and maliciously hides information or provides false (i.e., counterfeit) information. The air traffic control system may deploy recorders in its management area to monitor the status of the unmanned aerial vehicle. One or more recorders may be deployed to monitor unmanned aerial vehicle activity.
FIG. 14 shows an example of a monitoring system using one or more recorders, according to an embodiment of the present invention. Unmanned aerial vehicle 1410 may be provided within an environment. The environment may be within an area managed by an air traffic control system. One or more recorders (e.g., recorder A 1420a, recorder B 1420b, recorder C 1420c, etc.) may be provided within an area managed by the air traffic control system. Monitoring device or system 1430 may receive information collected from one or more recorders. In some embodiments, the monitoring system may be an air traffic control system.
In some embodiments, the air management system may be used to manage the entire unmanned aircraft flight system. The area managed by the air traffic system may be worldwide. In other cases, the area managed by the air traffic control system may be limited. The zone managed by the air traffic control system may be jurisdiction based. For example, the air traffic control system may manage unmanned aircraft flight systems throughout a jurisdiction (e.g., a country, state/province, region, city, town, village, or any other jurisdiction). Different air traffic control systems may manage different areas. The size of the regions may be similar or may be different.
Unmanned aerial vehicle 1410 may transmit one or more messages that may be monitored by one or more recorders 1420a, 1420b, 1420c. The message from the UAV may include a signature. In some cases, the message from the unmanned aerial vehicle may include identification information unique to the unmanned aerial vehicle (e.g., an unmanned aerial vehicle identifier and/or unmanned aerial vehicle key information). The identification information may uniquely identify and distinguish the UAV from other UAVs. The message from the UAV may include any other information, such as information regarding flight control commands, the GPS location of the UAV (or other location information for the UAV), and/or time information. The time information may include a time at which the message was formulated and/or transmitted. The time may be provided according to a clock of the unmanned aerial vehicle.
The signals acquired from these recorders 1420a, 1420b, 1420c may be time stamped and aggregated at the air traffic control system 1430. Data from the recorders may be stored in a memory storage system. The memory storage system may be part of or accessible by the air traffic system. The air traffic control system may analyze the information from the recorders. Thus, the air management system may be able to collect historical control information for the unmanned aerial vehicle, as well as flight information and the purported GPS track of the unmanned aerial vehicle. The air management system may be capable of collecting operational data about the unmanned aerial vehicle, which may include commands sent to the unmanned aerial vehicle, commands received by the unmanned aerial vehicle, actions performed by the unmanned aerial vehicle, and information about the unmanned aerial vehicle, such as the location of the unmanned aerial vehicle at different points in time.
In some embodiments, the unmanned aerial vehicle may communicate directly with the air management system and/or provide information directly to a memory storage system. Alternatively, the UAV may communicate directly with one or more recorders, which may communicate with the air traffic system and/or the memory storage system. In some cases, information about the unmanned aerial vehicle may be relayed to the air traffic system via one or more recorders. In some cases, the recorders may provide the air management system with additional data associated with the information from the unmanned aerial vehicle.
The air traffic control system may analyze the time data to determine a location of the unmanned aerial vehicle. One aspect of the invention may relate to a method of determining a position of an unmanned aerial vehicle, the method comprising: receiving one or more messages from the UAV at a plurality of recorders; time stamping, at the plurality of recorders, one or more messages from the UAV; and calculating, by means of one or more processors, a location of the UAV based on the timestamps of the one or more messages. A non-transitory computer readable medium containing program instructions for determining a position of an unmanned aerial vehicle may be provided, the computer readable medium comprising: program instructions for receiving one or more messages from the UAV at a plurality of recorders; program instructions for time stamping one or more messages from the UAV at the plurality of recorders; and program instructions for calculating a position of the UAV based on the timestamps of the one or more messages. An unmanned aerial vehicle communication location system can include: a communication module; and one or more processors operatively coupled to the communication module and individually or collectively configured to calculate a position of the UAV based on timestamps of one or more messages sent from the UAV and received at a plurality of recorders remote from the UAV.
In one example of analyzing the position of the UAV based on the time data, the time differences of the signals transmitted from different recorders that collect the same message from the same UAV may be used to determine a coarse position of the UAV. For example, if two recorders both receive a signal transmitted from a particular unmanned aerial vehicle, then based on the time difference between the signals received by the recorders, the unmanned aerial vehicle is known to be on the hyperbolic curve defined by the time difference measurement and the locations of the two recorders. Two, three, or more recorders may be used in forming a hyperbola. Thus, the rough location of the unmanned aerial vehicle can be constrained to lie along the hyperbola. The signals may be time stamped when they leave the unmanned aerial vehicle and when they arrive at the recorders. Such information may be used to calculate the time difference.
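A minimal sketch of this time-difference-of-arrival idea follows, assuming synchronized recorder clocks and planar coordinates in meters; the propagation-speed constant and the tolerance value are illustrative assumptions.

import math

C = 299_792_458.0  # assumed propagation speed of the radio signal, m/s

def range_difference(t_arrival_a, t_arrival_b):
    # Constant difference |d_A - d_B| implied by the arrival-time difference.
    return C * (t_arrival_a - t_arrival_b)

def consistent_with_tdoa(candidate_xy, recorder_a_xy, recorder_b_xy,
                         t_arrival_a, t_arrival_b, tolerance_m=50.0):
    # True if the candidate position lies (approximately) on the hyperbola
    # whose foci are the two recorder positions.
    d_a = math.dist(candidate_xy, recorder_a_xy)
    d_b = math.dist(candidate_xy, recorder_b_xy)
    return abs((d_a - d_b) - range_difference(t_arrival_a, t_arrival_b)) < tolerance_m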
In another example, multiple recorders may receive signals from the unmanned aerial vehicle and may determine a time difference. An example of a time difference may be the difference between the time the unmanned aerial vehicle transmits a signal and the time a recorder receives the signal. In some cases, when sending a signal to one or more recorders, the unmanned aerial vehicle may timestamp the signal. One or more recorders may timestamp the signal when it is received. The time difference between the two timestamps may indicate the travel time of the signal to the recorder. The travel time may be related to a coarse distance from the UAV to the recorder. For example, if the travel time is shorter, the unmanned aerial vehicle may be closer to the recorder than if the travel time is longer. If multiple recorders show different travel times, the UAV may be closer to the recorder showing the smaller travel time. For example, if the UAV is further away from the recorder, a longer travel time for the signal may be expected. The travel time may be a small unit of time. For example, the travel time may be on the order of seconds, milliseconds, microseconds, or nanoseconds. The timestamps of the unmanned aerial vehicle and/or recorder may have a high accuracy (e.g., on the order of seconds, milliseconds, microseconds, and/or nanoseconds). The clocks on the unmanned aerial vehicle and/or the recorders may be synchronized. The clocks may be used to provide the timestamps. In some cases, there may be some offset between one or more clocks of the UAV and/or recorders, but the offset may be known and may be compensated for. Trilateration, triangulation, or other similar techniques may be used to determine a coarse location of the UAV based on the distances from the UAV to one or more recorders.
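This travel-time approach can be sketched as follows; the use of a linearized least-squares solve, and the requirement of three or more recorders for a 2D fix, are illustrative assumptions rather than a prescribed implementation.

import numpy as np

C = 299_792_458.0  # assumed signal propagation speed, m/s

def distances_from_timestamps(t_transmit, receive_times):
    # Travel time (receive timestamp minus transmit timestamp) scaled to distance.
    return [C * (t_rx - t_transmit) for t_rx in receive_times]

def coarse_position_2d(recorder_positions, distances):
    # Linearize the range equations about the first recorder and solve in a
    # least-squares sense; three or more recorders are needed for a 2D fix.
    (x0, y0), d0 = recorder_positions[0], distances[0]
    rows, rhs = [], []
    for (xi, yi), di in zip(recorder_positions[1:], distances[1:]):
        rows.append([2 * (xi - x0), 2 * (yi - y0)])
        rhs.append(d0**2 - di**2 + xi**2 - x0**2 + yi**2 - y0**2)
    solution, *_ = np.linalg.lstsq(np.array(rows), np.array(rhs), rcond=None)
    return float(solution[0]), float(solution[1])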
In an additional example of analyzing the position of the UAV, the one or more recorders may also approximately determine the distance between the UAV and the recorders by a Received Signal Strength Indication (RSSI). The RSSI may be a measure of the power present in a signal (e.g., a radio signal) received at the recorder. A higher RSSI measurement may indicate a stronger signal. In some implementations, the wireless signal may attenuate with distance. Thus, stronger signals may be associated with closer distances, while weaker signals may be associated with greater distances. An approximate distance of the UAV from the recorder may be determined based on the RSSI. In some cases, the RSSI of multiple recorders may be compared to determine a coarse location of the unmanned aerial vehicle. For example, if two recorders are provided, the potential location of the unmanned aerial vehicle may be provided as a hyperbola. If three or more recorders are provided, triangulation techniques may be used to determine the coarse UAV position. If multiple recorders show different RSSI values, the unmanned aerial vehicle may be closer to the recorder showing a stronger RSSI value than the recorder showing a weaker RSSI value.
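A minimal sketch of RSSI-based ranging follows, using a log-distance path-loss model; the reference power, path-loss exponent, and reference distance are environment-dependent assumptions rather than values given in this disclosure.

def distance_from_rssi(rssi_dbm, rssi_at_reference_dbm=-40.0,
                       path_loss_exponent=2.5, reference_distance_m=1.0):
    # Log-distance path-loss model: a stronger (less negative) RSSI maps to a
    # shorter estimated UAV-to-recorder distance.
    return reference_distance_m * 10 ** (
        (rssi_at_reference_dbm - rssi_dbm) / (10 * path_loss_exponent))

# Comparing recorders: the recorder with the stronger RSSI is estimated to be
# closer to the UAV, e.g. distance_from_rssi(-60.0) < distance_from_rssi(-75.0).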
Further examples may provide one or more recorders having multiple receive channels. The recorder may have one or more antennas that may receive multiple signals. For example, the receive antenna may have multiple receive channels. The plurality of signals received through the plurality of receive channels may be processed to obtain a relative direction of the unmanned aerial vehicle. The air traffic control system, the authentication center, or other portions of the authentication system may process the plurality of signals from the receive antennas by way of receive beamforming. This may allow the air traffic control system or other entity to generally acquire the orientation and position of the unmanned aerial vehicle relative to the recorder. Beamforming may detect the direction of arrival of the signal and may be used to detect the direction of the unmanned aerial vehicle relative to the recorder. Multiple recorders may be used to narrow the range of positions of the unmanned aerial vehicle by looking at where the estimated directions toward the unmanned aerial vehicle intersect.
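The direction-of-arrival approach can be sketched as the intersection of two bearing lines; the bearing convention and the handling of near-parallel bearings are simplifying assumptions, and a real system would also account for measurement noise.

import math

def intersect_bearings(recorder_a_xy, bearing_a_rad, recorder_b_xy, bearing_b_rad):
    # Intersect two rays, each defined by a recorder position and the estimated
    # direction toward the UAV (radians, measured from the +x axis).
    d1 = (math.cos(bearing_a_rad), math.sin(bearing_a_rad))
    d2 = (math.cos(bearing_b_rad), math.sin(bearing_b_rad))
    denom = d1[0] * d2[1] - d1[1] * d2[0]
    if abs(denom) < 1e-9:
        return None  # bearings are (nearly) parallel: no usable intersection
    t = ((recorder_b_xy[0] - recorder_a_xy[0]) * d2[1]
         - (recorder_b_xy[1] - recorder_a_xy[1]) * d2[0]) / denom
    return (recorder_a_xy[0] + t * d1[0], recorder_a_xy[1] + t * d1[1])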
In another example, the recorder may include a sensor, such as a visual sensor, an ultrasonic sensor, or other type of sensor. The recorder may be capable of recording the environment surrounding the recorder. The recorder may record the presence or movement of the UAV within the environment. For example, the recorder may record, on video, an unmanned aerial vehicle flying within the environment. The data from the recorder may be used to detect the UAV and analyze the position of the UAV relative to the recorder. For example, the distance of the UAV from the recorder may be determined based on the size of the UAV in the image, and/or the orientation may be determined when the orientation of the sensor is known.
The position of the recorder may be known. The global coordinates of the recorder may be known. Alternatively, the recorders may have local coordinates and the local coordinates of the recorders may be converted to a common coordinate system. In some embodiments, the recorder may have a predetermined position. In other cases, the recorders may be moved around or placed in a particular manner. The recorder may transmit a signal indicating the position of the recorder. For example, each recorder may have a GPS unit that may provide global coordinates for the recorder. Coordinates for the recorder may be transmitted. The location of the recorder may be known by the air traffic control system or any other part of the authentication system. The air traffic control system or any other part of the authentication system may receive a signal from the recorder indicating the position of the recorder. Thus, even if the recorder moves around, the air traffic control system can have updated data about the recorder position. In some cases, the recorder may be a small device that can be picked up and moved around. The recorder may or may not be self-propelled. The recorder may be hand-held or capable of being carried by a human. Alternatively, the recorder may be a relatively large device that may be permanently or semi-permanently provided at one location.
Any number of recorders may be provided within the area. One, two, three, four, five, six, seven, eight, nine, ten or more recorders may be provided. In some cases, the signal from the unmanned aerial vehicle may have a limited range. In some cases, only recorders located within a certain distance of the unmanned aerial vehicle may receive the signal. The recorder of the received signal may record information about the received signal and may provide the information to an authentication system (e.g., an air management system of the authentication system). A larger number of recorders receiving signals from the unmanned aerial vehicle may provide a greater degree of certainty or accuracy in the rough location of the unmanned aerial vehicle. In some embodiments, providing a greater density or greater number of recorders within a particular area may increase the likelihood that a comparable number of recorders will receive signals from the UAV. In some cases, the recorders may be distributed in an area having a density of at least 1 recorder per square mile, 3 recorders per square mile, 5 recorders per square mile, 10 recorders per square mile, 15 recorders per square mile, 20 recorders per square mile, 30 recorders per square mile, 40 recorders per square mile, 50 recorders per square mile, 70 recorders per square mile, 100 recorders per square mile, 150 recorders per square mile, 200 recorders per square mile, 300 recorders per square mile, 500 recorders per square mile, 1000 recorders per square mile.
The recorders may be distributed over a large area. For example, the plurality of recorders can be distributed over an area greater than about 50, 100, 300, 500, 750, 1000, 1500, 2000, 3000, 5000, 7000, 10000, 15000, 20000, or 50000 square meters. Having a large area may help detect differences in travel time, signal strength, or other data collected by the recorders. If the recorders are located only within a small area, the differences in signal travel time will be small, and it may be difficult to distinguish between recorders. The recorders may be dispersed apart from one another. For example, at least two of the plurality of recorders are located at least 1 meter from each other, 5 meters from each other, 10 meters from each other, 20 meters from each other, 30 meters from each other, 50 meters from each other, 75 meters from each other, 100 meters from each other, 150 meters from each other, 200 meters from each other, 300 meters from each other, 500 meters from each other, 750 meters from each other, 1000 meters from each other, 1250 meters from each other, 1500 meters from each other, 1750 meters from each other, 2000 meters from each other, 2500 meters from each other, 3000 meters from each other, 5000 meters from each other, or 10000 meters from each other. Having the recorders dispersed apart from one another can help detect differences in travel time, signal strength, or other data collected by the recorders. If the recorders are close together, the differences in signal travel time will be small and it may be difficult to distinguish between recorders.
The authentication center, the air traffic system, or any other portion of the authentication system may calculate a coarse location of the unmanned aerial vehicle based on the data received from the recorder. One or more processors may be used to calculate a coarse position of the UAV based on the data from the recorder. The data from the recorder may include time stamp data, signal strength data, or any other type of data described elsewhere herein.
The certification system, the air traffic system, or any other portion of the certification system may check the purported position of the unmanned aerial vehicle against the coarse location information based on the timing information to determine whether there is a relatively large difference between the purported position and the coarse location. A larger deviation may indicate a higher risk that some form of hazard is associated with the unmanned aerial vehicle. For example, a larger deviation may indicate a higher risk that the position of the unmanned aerial vehicle has been falsified. A falsified position report from an unmanned aerial vehicle may indicate malicious or fraud-related behavior (e.g., sensor tampering or reporting false location data). A false reported position may also indicate an error or failure of the navigation system of the unmanned aerial vehicle (e.g., a compromised GPS sensor, or errors in one or more sensors used to determine the position of the unmanned aerial vehicle). Errors or malfunctions of the navigation system of the unmanned aerial vehicle are not necessarily malicious, but may also cause problems if the position of the unmanned aerial vehicle is not accurately tracked. In some cases, the purported/reported position of the unmanned aerial vehicle may be compared to a calculated position of the unmanned aerial vehicle, and a warning may be provided when the difference between the calculated position and the purported position exceeds a threshold. In some cases, only the differences between locations are considered in determining the risk of fraud. Alternatively, other factors as described elsewhere herein may be considered, such as environmental conditions, wireless communication conditions, unmanned aircraft model parameters, or any other factor.
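A minimal sketch of this purported-versus-calculated position check follows; the threshold value and the contents of the warning are illustrative assumptions.

import math

def check_reported_position(reported_xy, calculated_xy, threshold_m=100.0):
    # Compare the UAV's purported/reported position with the recorder-derived
    # coarse position and flag a warning when the deviation exceeds a threshold.
    deviation_m = math.dist(reported_xy, calculated_xy)
    if deviation_m > threshold_m:
        return {"warning": True, "deviation_m": deviation_m,
                "possible_causes": ["position tampering or counterfeiting",
                                    "navigation sensor error or failure"]}
    return {"warning": False, "deviation_m": deviation_m}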
The recorders may serve as an air monitoring system and may record a history of the flight missions being monitored. The air management system can compare the flight information actively reported by the unmanned aerial vehicle with the flight information of the same unmanned aerial vehicle acquired by one or more recorders, and quickly judge the authenticity of the data reported by the unmanned aerial vehicle. When there is a risk of some form of hazard associated with the unmanned aerial vehicle, a warning may be provided. A warning may be provided to an operator of the unmanned aerial vehicle, the air traffic control system, or any other entity. The warning may or may not indicate an estimated level of risk. The warning may or may not indicate the type of hazard (e.g., possible malicious tampering or counterfeiting, possible sensor failure).
Authentication procedure for a user
The authentication center may use any number of processes to authenticate the identity of a user operating the unmanned aerial vehicle. For example, the authentication center may use a simple authentication process, such as by comparing a user's identification information and password with authentication information stored in association with the user. In other processes, the identity of the user may be authenticated by acquiring voiceprint, fingerprint, and/or iris information of the user and comparing the acquired information to stored user information. Alternatively, the user identity may be verified using Short Message Service (SMS) verification sent to a mobile device associated with the user. The system may also utilize a token and/or a built-in identity module in the remote control to authenticate a user associated with the remote control.
In a further example, the process of authenticating the user may include multiple forms of authentication listed above. For example, a user may be authenticated using a two-step process that requires the user to enter identification information (such as a username) along with a password, and then ask the user to provide a second step of authentication by verifying text received at the user's mobile device.
After successfully performing authentication, the authentication center may retrieve information about the user from a database. The retrieved information may be sent to an air management system, which may then determine whether the user has permission to fly.
Authentication process for unmanned aerial vehicle
While authentication of the user can be performed using personal information (e.g., voiceprint, fingerprint) inherent to the user, authentication of the unmanned aerial vehicle can utilize device information stored within the unmanned aerial vehicle, such as a key encoded into the device. Thus, authentication of the UAV may be based on a combination of the UAV identifier and a key stored within the UAV. In addition, Authentication and Key Agreement (AKA) may be used to enable authentication of the unmanned aerial vehicle. AKA is a two-way authentication protocol. In an example, when AKA is used, the authentication center may authenticate the legitimacy of the unmanned aerial vehicle, and the unmanned aerial vehicle may authenticate the legitimacy of the authentication center. An example of AKA-based authentication is discussed in fig. 15.
Fig. 15 shows a diagram 1500 of mutual authentication between an unmanned aerial vehicle and an authentication center according to an embodiment of the invention. In particular, fig. 15 illustrates the manner in which an unmanned aerial vehicle interacts with an air traffic system, which in turn interacts with a certification center.
AKA authentication between the unmanned aerial vehicle and an authentication center may be performed based on a Universal Subscriber Identity Module (USIM). In particular, the unmanned aerial vehicle may have an onboard USIM module that contains an International Mobile Subscriber Identity (IMSI) and a key. In some examples, the key is burned into the unmanned aerial vehicle when the unmanned aerial vehicle is manufactured. When manufacturing the unmanned aerial vehicle, the key may be permanently written or integrated into the unmanned aerial vehicle. Thus, the key is protected and cannot be read out. In addition, the key is shared with the authentication center or any other component of the authentication system (e.g., authentication center 220 as illustrated in fig. 2). Cracking the USIM is very difficult. In the field, AKA is recognized as an authentication scheme with a high degree of security. Thus, the authentication center may have a high degree of security and public trust, which extends to the protection of the subscriber's IMSI, key, and counter SQN.
As seen in step 1505, the unmanned aerial vehicle may provide the authentication request and the IMSI to the air traffic system. The drone may transmit its IMSI either actively (broadcast) or passively (in reply).
Once the authentication request and IMSI have been received at the air management system, the air management system may transmit an authentication data request to the authentication center in step 1510. The authentication data request may include information about the UAV, such as the UAV's IMSI.
In step 1515, the authentication center may receive the IMSI, query the corresponding key, generate a random number, and calculate an Authentication Vector (AV) according to a predetermined algorithm. The algorithms f1, f2, f3, f4, and f5 are described in the typical Universal Mobile Telecommunications System (UMTS) security protocol. The AV may contain five elements: RAND (random number), XRES (expected response), CK (encryption key), IK (integrity check key), and AUTN (authentication token). Alternatively, the authentication vector may contain one or more elements, including at least one of the elements in this example, or different elements. In this example, AUTN consists of a concealed counter SQN, AMF (authentication management field), and MAC (message authentication code):
AUTN := SQN (+) AK || AMF || MAC
The authentication center may transmit an authentication data response to the air traffic system at step 1520, which may then provide an authentication response including AUTN and RAND to the UAV at step 1525. Specifically, AUTN and RAND may be transmitted to security module A of the unmanned aerial vehicle. In step 1530, security module A may verify AUTN. Security module A may calculate AK from RAND and the secret key. Once AK is calculated, SQN can be calculated and recovered from AUTN. XMAC (expected message authentication code), RES (response to the random number), CK (encryption key), and IK (integrity check key) can then be computed. Security module A may compare MAC and XMAC. If the MAC and XMAC are not the same, the UAV may send an authentication reject message to the remote control and the authentication center, and in response, terminate the authentication. Alternatively, if the MAC and XMAC are the same, security module A may check whether the received SQN falls within a reasonable range. In particular, security module A may record the maximum SQN received so far, so the SQN may only be incremented. If the SQN is abnormal, the UAV may send a synchronization failure message, which in turn may terminate the authentication. Alternatively, if the SQN falls within a reasonable range, security module A may verify the authenticity of the AUTN and may provide the computed RES to the authentication center.
As seen in fig. 15, in step 1535, security module A may transmit RES to the air traffic system, which may in turn provide RES to the authentication center in step 1540. Alternatively, the unmanned aerial vehicle may provide the RES directly to the authentication center.
Once the authentication center has received the RES, the authentication center may compare the XRES (expected response to the random number) and the RES in step 1545. If the XRES and RES are found to be inconsistent, the authentication fails. Conversely, if XRES and RES are found to be the same, the mutual authentication is successful.
After mutual authentication, the unmanned aerial vehicle and the air traffic control system can perform secure communication by using the negotiated CK and IK. Specifically, after authenticating with each other, the authentication center may transmit the negotiated CK and IK to the air traffic system in step 1550. Additionally, in step 1555, the unmanned aerial vehicle may calculate the negotiated CK and IK. Once the air traffic system has received the negotiated CK and IK from the authentication center, secure communications may be established between the unmanned aerial vehicle and the air traffic system in step 1560.
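A simplified, illustrative sketch of the AKA-style exchange of fig. 15 follows; HMAC-SHA256 stands in for the UMTS f1-f5 functions and the SQN concealment with AK is omitted, so the field sizes and algorithms differ from a real deployment.

import hashlib, hmac, os

def f(key, label, *parts):
    # Stand-in for the UMTS f1-f5 functions (illustrative only).
    return hmac.new(key, label + b"".join(parts), hashlib.sha256).digest()

def make_authentication_vector(k, sqn):
    # Authentication-center side: derive RAND, XRES, CK, IK, and AUTN from key K.
    rand = os.urandom(16)
    sqn_bytes = sqn.to_bytes(6, "big")
    mac = f(k, b"f1", rand, sqn_bytes)        # message authentication code
    xres = f(k, b"f2", rand)                  # expected response
    ck, ik = f(k, b"f3", rand), f(k, b"f4", rand)
    autn = sqn_bytes + mac                    # AUTN (SQN concealment omitted)
    return rand, xres, ck, ik, autn

def security_module_verify(k, rand, autn, max_sqn_seen):
    # UAV security module side: recompute XMAC, check SQN, return RES, CK, IK.
    sqn = int.from_bytes(autn[:6], "big")
    xmac = f(k, b"f1", rand, autn[:6])
    if not hmac.compare_digest(autn[6:], xmac):
        raise ValueError("authentication reject: MAC and XMAC differ")
    if sqn <= max_sqn_seen:
        raise ValueError("synchronization failure: SQN out of range")
    return f(k, b"f2", rand), f(k, b"f3", rand), f(k, b"f4", rand), sqn

K = os.urandom(32)  # key assumed to be shared by the UAV and the authentication center
rand, xres, ck, ik, autn = make_authentication_vector(K, sqn=42)
res, ck_uav, ik_uav, _ = security_module_verify(K, rand, autn, max_sqn_seen=41)
assert hmac.compare_digest(res, xres)  # RES matches XRES: mutual authentication succeeds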
Air traffic control system
The air management system may include a user and unmanned aerial vehicle access subsystem, an air monitoring subsystem, a traffic management subsystem, and a geo-fencing subsystem. The air management system may perform a number of functions, including real-time situational monitoring and recording of unmanned aerial vehicle activity. The air management system may also assign traffic rights in restricted airspace. In addition, the air management system can accept a request for a geo-fence device and can also validate and determine attributes of the geo-fence device.
The air management system may also perform necessary user and aircraft certification audits. Further, the air traffic control system may monitor offending aircraft. The air traffic control system may identify violations, or behaviors that approach violations, and may issue alerts for such behaviors. In addition, the air management system may provide countermeasures for aircraft that continue to violate, such as air monitoring, traffic right management, security interfaces with certification centers, geofencing, and other forms of violation countermeasures. The air traffic control system may also have a separate event logging function. The air traffic system may also provide hierarchical access to air traffic information.
The air management system may include an air monitoring subsystem. The air monitoring subsystem may be responsible for monitoring flight conditions in the allocated airspace in real time, such as the flight of an unmanned aerial vehicle. In particular, the air monitoring subsystem may monitor whether an authorized aircraft is flying along a predetermined channel. The air monitoring subsystem may also be responsible for discovering abnormal behavior of authorized aircraft. Based on the discovered abnormal behavior, the air monitoring subsystem may alert the violation countermeasure system. Additionally, the air monitoring subsystem may monitor for the presence of an unauthorized aircraft and may alert the violation countermeasure system based on the detection of the unauthorized aircraft. Examples of airspace monitoring means may include radar, opto-electronic sensing, and acoustic sensing, among other examples.
The air monitoring subsystem may also actively perform verification of the identity of the aircraft and may also respond to authentication requests received from the aircraft. In addition, the air monitoring subsystem may actively acquire information about the flight status of a particular aircraft. When monitoring an aircraft, the aircraft being monitored can be located in three dimensions. For example, the air monitoring subsystem may track the flight path of an aircraft and compare it to a planned flight path to identify abnormal behavior. Abnormal behavior may be identified as behavior that exceeds a predetermined tolerance threshold (e.g., a threshold for deviation from a predetermined flight plan, or falling below a threshold altitude while in flight). In addition, the monitoring points may be distributed or centralized.
The air monitoring subsystem may have a primary means for airspace monitoring, namely receiving and resolving (either directly, or by receiving and forwarding) flight information that authorized (or cooperating) aircraft in the monitored airspace broadcast in real time about themselves. Real-time flight information can also be obtained through active inquiry by the air traffic control system and responses from the aircraft. In addition, the airspace monitoring subsystem may have auxiliary means for airspace monitoring, such as sodar and lidar. In an example, an authenticated user may be allowed to query the airspace monitoring subsystem for a current airspace status.
The air traffic control system may also include a traffic rights management subsystem. The traffic rights management subsystem may be responsible for accepting initial requests for channel resources and requests to change channels; it can plan the flight channel and send the applicant a definitive response to the request. Examples of information provided in the definitive response include a planned flight path, en-route monitoring points, and corresponding time windows. Additionally, the traffic rights management subsystem may be responsible for adjusting the predetermined flight path as conditions change in the present airspace and/or other airspaces. Reasons for which the predetermined flight path may be adjusted include, but are not limited to, weather, changes in available airspace resources, accidents, setup of a geofencing device, and adjustments to attributes of a geofencing device such as spatial extent, duration, and level of restriction. The traffic rights management subsystem may also inform the applicant or user that the original flight path is to be adjusted. Further, the authenticated user may be allowed to query the traffic rights management subsystem for an authorized allocation of air channels.
The air traffic system may also include a secure interface with the authentication center subsystem. The secure interface with the authentication center subsystem may be responsible for secure communications with the authentication center. In particular, the air traffic system may communicate with an authentication center for authentication purposes or for attribute queries for aircraft and users.
In an example, a user may be aware of other users in the same region through an air traffic system. The user may choose to share information, such as their flight path, with other users. Users can also share content captured by their devices. For example, users may share content captured from cameras on or within their unmanned aerial vehicles. In addition, users may send instant messages to each other.
The unmanned aerial vehicle flight system can also include a geofencing subsystem that includes one or more geofencing devices. The unmanned aerial vehicle may be capable of detecting the geofencing device, or vice versa. The geofencing device may cause the unmanned aerial vehicle to act according to a set of flight controls. Examples of geofencing subsystems will be discussed in more detail later in this application.
Flight procedure relying only on unmanned aerial vehicle certification
In some applications, the air management system may only need to perform certification for the unmanned aerial vehicle before the unmanned aerial vehicle can take off. In these applications, authentication for the user does not have to be performed. The process by which the air traffic control system performs authentication for the unmanned aerial vehicle can be demonstrated by the AKA procedure provided in fig. 15 above. After authentication, the unmanned aerial vehicle and the air traffic control system may acquire the CK and IK to communicate with each other, as described in step 1550 and step 1555 of fig. 15. CK may be used for data encryption and IK may be used for data integrity protection.
After authentication, the user can acquire the keys (CK, IK) finally generated during the authentication process via a secure channel, and the communication data between the user and the unmanned aerial vehicle can be protected by encryption using the keys so as to avoid being hijacked or controlled by an unauthorized party. Accordingly, the subsequent data Message (MSG) may include information related to the unmanned aerial vehicle, such as the location of the unmanned aerial vehicle, the speed of the unmanned aerial vehicle, and so forth. In this way, the message may be integrity protected and verified using IK. The information transmitted is as follows:
Equation 1: MSG1 || ((HASH(MSG1) || CRC()) (+) SCR(IK)) || IMSI
wherein MSG1 := MSG || RAND || TIMESTAMP || GPS
In Equation 1 above, CRC() may be a cyclic checksum of the information, and SCR(IK) may be an IK-derived data mask. In addition, in this specification, MSG is the original message, HASH() is a hash function, RAND is a random number, TIMESTAMP is the current timestamp, and GPS is the current location, which help avoid replay attacks.
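A minimal sketch of the Equation 1 protection follows; SHA-256, CRC-32, and a SHAKE-based keystream are stand-ins chosen for illustration, since this disclosure does not fix the concrete HASH(), CRC(), or SCR() functions.

import hashlib, time, zlib

def scr(ik, length):
    # IK-derived data mask (keystream) of the requested length (stand-in).
    return hashlib.shake_256(ik).digest(length)

def protect_message(msg, rand, gps, ik, imsi):
    timestamp = str(time.time()).encode()
    msg1 = msg + rand + timestamp + gps                   # MSG1 = MSG||RAND||TIMESTAMP||GPS
    tag = hashlib.sha256(msg1).digest() + zlib.crc32(msg1).to_bytes(4, "big")
    mask = scr(ik, len(tag))
    masked_tag = bytes(a ^ b for a, b in zip(tag, mask))  # (HASH||CRC) (+) SCR(IK)
    return msg1 + masked_tag + imsi                       # MSG1 || masked tag || IMSI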
Fig. 16 shows a process 1600 for sending a message with a cryptographic signature according to an embodiment of the invention. In a first step 1610, a message is composed. As provided in the discussion above, the message may be denoted "MSG". Having composed the message, the sender of the message may generate a digest of the message from the text of the message using a hash function in step 1620. The digest may be a hash of only the message itself, the MSG, or it may be a hash of a modified message, such as MSG1 as seen above. Specifically, the MSG1 may include a compilation of information such as the original message MSG, the random number RAND, the current timestamp TIMESTAMP, and the current location GPS. In other examples, the modified message may contain alternative information.
Once the digest of message MSG or modified message MSG1 is generated, the sender may encrypt the digest using a personal key in step 1630. In particular, the digest may be encrypted using the sender's personal (private) key, such that the encrypted digest may serve as a personal signature for the sent message. Thus, in step 1630, the encrypted digest may then be sent to the recipient along with the message as a digital signature of the message.
Fig. 17 shows a process 1700 for verifying a message by decrypting a signature according to an embodiment of the invention. As seen in step 1710, the recipient receives a message and a cryptographic digest, such as the message and cryptographic digest discussed in fig. 16. In step 1720, the recipient may compute a digest of the message from the received original message using the same hash function as the sender. Additionally, in step 1730, the recipient can decrypt the digital signature attached to the message using the sender's public key. Once the digital signature attached to the message is decrypted, the recipient may compare the digest in step 1740. If the two digests are the same, the receiver can confirm that the digital signature is from the sender.
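The fig. 16 and fig. 17 flow can be sketched as follows; Ed25519 signatures from the third-party cryptography package are used purely for illustration, since this disclosure does not specify a particular hash or signature scheme.

import hashlib
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

def sign_message(private_key, msg):
    digest = hashlib.sha256(msg).digest()     # step 1620: digest of the message
    return private_key.sign(digest)           # step 1630: sign (encrypt) the digest

def verify_message(public_key, msg, signature):
    digest = hashlib.sha256(msg).digest()     # step 1720: recompute the digest
    try:
        public_key.verify(signature, digest)  # steps 1730-1740: check the signature
        return True
    except InvalidSignature:
        return False

sender_key = Ed25519PrivateKey.generate()
signature = sign_message(sender_key, b"MSG1")
assert verify_message(sender_key.public_key(), b"MSG1", signature)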
Thus, when other peers receive such information and upload it to the certification center, it can be treated as a digital signature. That is, the presence of such wireless information is equivalent to the presence of the UAV. This process may simplify the certification process for the unmanned aerial vehicle. That is, after the initial authentication is completed, the unmanned aerial vehicle can reliably announce its own presence by performing the above-described process, without necessarily initiating a complicated initial authentication process each time.
Flight procedure relying on unmanned aerial vehicle and user authentication
In some applications, the unmanned aerial vehicle may take off only after a double authentication of the unmanned aerial vehicle and the user.
Authentication for the user may be based on an electronic key. When manufacturing the unmanned aerial vehicle, the manufacturer may embed an electronic key. The electronic key may have a built-in USIM card that contains an IMSI-U (IMSI associated with the user) and a K-U (key associated with the user) that are shared with the authentication center. This is also a unique personal identification of the user and is written only once at the time of USIM manufacture. Therefore, the USIM card cannot be copied or forged. Furthermore, it is protected from being read out by the security mechanism of the USIM. Therefore, cracking the USIM is very difficult. The electronic key, as an electronic device, may be inserted in the remote control, integrated in the remote control, or communicate with the remote control via conventional means such as Bluetooth (TM), WIFI, USB, audio, optical communication, etc. The remote controller can obtain basic information of the electronic key and can communicate with the authentication center to perform corresponding authentication.
Authentication for a user may also be accomplished by various other means, such as the inherent characteristics of the user. Specifically, the authentication of the user may be achieved by voice print, fingerprint, iris information, or the like.
Authentication for the unmanned aerial vehicle is based on an onboard security module that authenticates with the authentication center via a CH (channel). In an example, the onboard security module of the unmanned aerial vehicle includes a USIM that contains an IMSI-M (IMSI associated with the unmanned aerial vehicle, e.g., as provided by the manufacturer) and a K-M (key associated with the unmanned aerial vehicle, e.g., as provided by the manufacturer). The K-M shared between the unmanned aerial vehicle and the authentication center is written once at the time of USIM manufacture. Furthermore, it is protected from being read out by the security mechanism of the USIM. Therefore, cracking the USIM is very difficult.
The airborne security module and the authentication center of the unmanned aerial vehicle can perform bidirectional authentication, the process is similar to that of UMTS, and an authentication and key agreement mechanism, namely AKA, is adopted. AKA is a two-way authentication protocol. That is, not only the authentication center is required to verify the legitimacy of the unmanned aerial vehicle or the electronic key, but also the unmanned aerial vehicle or the electronic key is required to verify the legitimacy of the authentication center that provides the service. The AKA authentication procedure between the unmanned aerial vehicle and the authentication center provided in fig. 15, discussed above, illustrates this procedure.
The unmanned aerial vehicle and the electronic key may need to perform an authentication process before performing the flight mission. In an example, both authenticate with the authentication center, either serially or in parallel, and both perform the basic authentication process. Specifically, (IMSI-M, K-M) may be employed in authentication of the unmanned aerial vehicle, and (IMSI-U, K-U) may be employed in authentication of the electronic key. In the following general description, the term security module refers to either the unmanned aerial vehicle or the electronic key.
After the authentication of the unmanned aerial vehicle with the authentication center is completed, the authentication center may determine several characteristics of the unmanned aerial vehicle through the database. For example, the authentication center may determine the type of unmanned aerial vehicle, the capabilities of the unmanned aerial vehicle, the affiliation of the unmanned aerial vehicle, the health/operating conditions of the unmanned aerial vehicle, the maintenance needs of the unmanned aerial vehicle, historical flight records of the unmanned aerial vehicle, and the like. In addition, after the authentication of the electronic key with the authentication center is completed, the authentication center may determine personal information, operation authority, flight history, and the like of a user corresponding to the electronic key.
After the authentication is completed, the controller obtains several important negotiated keys: CK-U (CK associated with the user), IK-U (IK associated with the user), CK-M (CK associated with the UAV), and IK-M (IK associated with the UAV). The basic authentication process and results described above can be used flexibly in the context of an unmanned aerial vehicle.
The electronic key may negotiate flight missions with the authentication center via an encrypted channel. The certification center may approve, reject, or provide relevant modification suggestions or hints for the flight mission based on the attributes of the user and the unmanned aerial vehicle. The certification center may also maintain communication with the UAV and the remote control using the respective keys during flight to obtain flight parameters (such as position, speed, etc.) and to manage and control the rights of the UAV or the user in flight.
In a wireless communication link between the unmanned aerial vehicle and the remote controller, a dual signature of the unmanned aerial vehicle and the electronic key is adopted. When sending a message, the sender uses a hash function to generate a message digest from the message text and then encrypts the digest using a personal key. This encrypted digest is sent to the recipient as a digital signature of the message along with the message. The receiver first computes a message digest from the received original message using the same hash function as the sender, and then decrypts the digital signature attached to the message using the sender's public key. If the two digests are the same, the receiver can confirm that the digital signature is from the sender.
In particular, subsequent data Messages (MSG) may be remote control commands, position reports, velocity reports, etc., and may be integrity protected and verified using IK-U and IK-M. The information transmitted is as follows:
MSG1 || ((HASH(MSG||RAND) || CRC()) (+) SCR1(IK-M) (+) SCR2(IK-U)) || IMSI-U || IMSI-M
wherein MSG1 := MSG || RAND || TIMESTAMP || GPS
In the above equation, CRC() is the cyclic checksum of the information, (+) denotes the masking (exclusive-OR) operation, and SCR1(IK-M) and SCR2(IK-U) are IK-derived data masks. SCR1() and SCR2() may be common keystream generators. In addition, HASH() is a hash function, RAND is a random number, TIMESTAMP is the current timestamp, and GPS is the current location, which help avoid replay attacks.
Such information may be treated as a digital signature when uploaded to the certification center. That is, the presence of such wireless information is equivalent to the presence of the UAV and the presence of the user. This process may simplify the certification process for the unmanned aerial vehicle. That is, after the initial authentication is completed, the unmanned aerial vehicle can reliably announce its own presence by performing the above-described process, without necessarily initiating a complicated initial authentication process each time.
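A minimal sketch of the dual-signature protection above follows; as before, the hash, CRC, and keystream functions are stand-ins, since the disclosure does not fix concrete SCR1() and SCR2() functions.

import hashlib, zlib

def keystream(ik, length):
    # Stand-in for an IK-derived data mask generator.
    return hashlib.shake_256(ik).digest(length)

def dual_sign(msg, rand, timestamp, gps, ik_m, ik_u, imsi_m, imsi_u):
    msg1 = msg + rand + timestamp + gps                   # MSG1 = MSG||RAND||TIMESTAMP||GPS
    tag = hashlib.sha256(msg + rand).digest() + zlib.crc32(msg + rand).to_bytes(4, "big")
    masked_tag = bytes(t ^ a ^ b for t, a, b in
                       zip(tag, keystream(ik_m, len(tag)), keystream(ik_u, len(tag))))
    # MSG1 || ((HASH(MSG||RAND)||CRC()) (+) SCR1(IK-M) (+) SCR2(IK-U)) || IMSI-U || IMSI-M
    return msg1 + masked_tag + imsi_u + imsi_m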
To enhance security, a validity period may be allocated for the IK described above. The authentication center, the unmanned aerial vehicle, and the electronic key can continue to execute the AKA process. During flight, the unmanned aerial vehicle can accept a user switching procedure.
The authentication center may have the identity of the unmanned aerial vehicle, the registered flight mission of the unmanned aerial vehicle, and its actual flight history. It may also have information about the corresponding user. Also, based on the result of the bidirectional authentication, the authentication center can provide various security-related services and information to the user. The certification center can also take over the unmanned aerial vehicle to a certain extent. For example, the certification center may take over some functions of the UAV. In this way, regulatory authorities gain increased supervision and control of unmanned aerial vehicles.
Supervision Process
The air traffic control system may send an IMSI query command to the unmanned aerial vehicle via peer-to-peer communicator B. After the drone responds to the IMSI query with its IMSI, a manager at the air traffic control system may initiate the above-described authentication to confirm that the drone legitimately owns the IMSI. Once it has been determined that the unmanned aerial vehicle legitimately owns the IMSI, further two-party signaling interaction may be performed between the unmanned aerial vehicle and the air traffic control system using the negotiated CK and IK. For example, the unmanned aerial vehicle may report historical information or mission plans. Additionally, the air traffic control system may require the unmanned aerial vehicle to perform certain actions. Thus, the air traffic control system may take over some actions that control the UAV. By using the authentication process, the air traffic control system can ensure proper identification of the unmanned aerial vehicle without risk of impersonation.
In another example, if the air traffic system sends an IMSI query command to the unmanned aerial vehicle and receives no response, an erroneous response, or a failed authentication, the air traffic system may consider the unmanned aerial vehicle to be non-compliant. In other examples, the unmanned aerial vehicle flight system may request that the unmanned aerial vehicle periodically broadcast its IMSI without prompting from air traffic control. The air management system, upon receiving the broadcasted IMSI, may choose to initiate the authentication process described above with the unmanned aerial vehicle.
Overview of geofences
The unmanned aerial vehicle flight system can include one or more geo-fencing devices. The unmanned aerial vehicle may be capable of detecting the geofencing device, or vice versa. The geofencing device may cause the unmanned aerial vehicle to act according to a set of flight controls. The set of flight controls can include a geographic component that can be correlated to a location of the geofencing device. For example, a geo-fencing device may be provided at a location and may claim one or more geo-fencing boundaries. The activities of the unmanned aerial vehicle can be regulated within the geofence boundary. Alternatively or additionally, the activities of the unmanned aerial vehicle may be regulated outside of the geofence boundary. In some cases, the rules imposed on the unmanned aerial vehicle may differ within a geofence boundary from outside the geofence boundary.
Fig. 18 shows an example of an unmanned aerial vehicle and a geofencing device, in accordance with an embodiment of the present invention. Geofencing device 1810 may claim a geofencing boundary 1820. The unmanned aerial vehicle 1830 may encounter a geofencing device.
Unmanned aerial vehicle 1830 may operate according to a set of flight controls. The set of flight controls can be generated based on a geo-fencing device. The set of flight restrictions can take into account boundaries of the geofencing device. A set of flight restrictions can be associated with a geofence boundary within a distance of the geofence device.
The geo-fence device 1810 may be provided at a location. The geofence device can be any device that can be used to help determine one or more geofence boundaries that can be used in one or more sets of flight restrictions. The geofencing device may or may not transmit a signal. The signal may or may not be detectable by the UAV. The unmanned aerial vehicle may be capable of detecting the geo-fence device with or without a signal. The geofencing device may or may not be capable of detecting the unmanned aerial vehicle. The UAV may or may not transmit a signal. The signal may or may not be detectable by the geo-fencing device. The one or more intermediary devices may be capable of detecting signals from the unmanned aerial vehicle and/or the geo-fencing device. For example, the intermediary device may receive a signal from the unmanned aerial vehicle and may transmit data regarding the unmanned aerial vehicle to the geo-fencing device. The intermediary device may receive the signal from the geofencing device and may transmit data regarding the geofencing device to the UAV. Various combinations of detection and communication may occur as described in more detail elsewhere herein.
The geo-fence device can be used as a reference to one or more geo-fence boundaries 1820. The geofence boundary may indicate a two-dimensional area. Anything above or below the two-dimensional area can be located within the geofence boundary. Anything above or below the area outside the two-dimensional area may be located outside the geofence boundary. In another example, the geofence boundary may indicate a three-dimensional volume. A space within the three-dimensional volume may be within the geofence boundary. A space outside of the three-dimensional volume may be outside of the geofence boundary.
The geofence device boundaries can be open or closed. A closed geofence device boundary can completely enclose an area within the geofence device boundary. A closed geofence device boundary may begin and end at the same point. In some implementations, a closed geofence boundary may not have a start point or an end point. Examples of closed geofence boundaries may be circular, square, or any other polygon. An open geofence boundary may have different starting and ending points. A geofence boundary may be straight or curved. A closed geofence boundary may enclose an area. For example, a closed boundary may help define a restricted flight zone. An open geofence boundary may form a barrier. The barrier may help form a geofence boundary at a natural physical boundary. Examples of physical boundaries may include jurisdictional boundaries (e.g., boundaries between countries, regions, states, provinces, towns, cities, or property lines), naturally occurring boundaries (e.g., rivers, streams, cliffs, ravines, canyons), man-made boundaries (e.g., walls, streets, bridges, dams, doors, entryways), or any other type of boundary.
The geo-fencing device can have a location relative to a geofence boundary. The location of the geo-fencing device can be used to determine the location of the geofence boundary. The location of the geo-fence device can serve as a reference to the geofence boundary. For example, if the geofence boundary is a circle surrounding the geofence device, the geofence device may be at the center of the circle. Thus, depending on the location of the geo-fence device, the geofence boundary may be determined as a circle around the geofence device, the circle centered on the geofence device and having a predetermined radius. The geofencing device need not be at the center of the circle. For example, a boundary of a geofencing device may be provided such that the boundary is a circle offset from the geofencing device. The geofencing device itself can be located within the boundaries of the geofencing device. In an alternative embodiment, the boundary of the geo-fencing device may be such that the geo-fencing device is located outside the boundary of the geo-fencing device. In either case, the boundary of the geo-fence device may be determined based on the location of the geo-fence device. The location of the boundary of the geofence device may also be determined based on the geofence boundary type (e.g., shape and size of the geofence boundary). For example, if the known geofence boundary type is a hemispherical boundary centered around the location of the geofence device and having a radius of 30 meters, and the known geofence device has global coordinates of X, Y, Z, the location of the geofence boundary can be calculated from the global coordinates.
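The hemispherical-boundary example above can be sketched as follows, assuming a flat local coordinate frame in meters; the coordinates and radius are illustrative.

import math

def inside_hemispherical_geofence(uav_xyz, device_xyz, radius_m=30.0):
    # True if the UAV lies within the upper hemisphere of the given radius
    # centered on the geofencing device's location.
    return uav_xyz[2] >= device_xyz[2] and math.dist(uav_xyz, device_xyz) <= radius_m

# A flight response measure (e.g., altering the channel or issuing a warning)
# could be triggered when this check indicates the UAV is inside a boundary
# whose interior is a restricted flight zone.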
The unmanned aerial vehicle can access the geofence device. An identification of the geofence device boundary may be provided to the unmanned aerial vehicle. The unmanned aerial vehicle may be flown according to a flight control having different rules based on whether the unmanned aerial vehicle is located on a first side of the geofence device boundary or a second side of the geofence device boundary.
In one example, the unmanned aerial vehicle may not be allowed to fly within the boundaries of the geofence device. Thus, when the unmanned aerial vehicle approaches the geofence device, it may be detected that the unmanned aerial vehicle is near the geofence boundary or that the unmanned aerial vehicle has crossed the geofence boundary. The detection may be performed by the unmanned aerial vehicle. For example, the unmanned aerial vehicle may know the unmanned aerial vehicle location and the geofence boundary. In another example, the detection may be performed by the air traffic control system. The air management system can receive data regarding the location of the UAV and/or the location of the geofencing device. The unmanned aerial vehicle may or may not know the geofence device location. The flight control may be such that the UAV is not permitted to fly within the boundary. Flight response measures may be taken by the unmanned aerial vehicle. For example, the channel of the unmanned aerial vehicle may be altered so that the unmanned aerial vehicle does not enter the area within the geofence boundary, or the unmanned aerial vehicle may be moved out of the area within the geofence boundary if it has already entered. Any other type of flight response action may be taken, which may include providing a warning to a user of the unmanned aerial vehicle or to the air traffic control system. The flight response action may be initiated on the unmanned aerial vehicle or may be initiated from the air traffic system.
In some cases, the unmanned aerial vehicle may be proximate to the geofencing device. The unmanned aerial vehicle may know the location of the unmanned aerial vehicle itself (e.g., using a GPS unit, any other sensor, or any other technique described elsewhere herein). The unmanned aerial vehicle can know the location of the geofencing device. The unmanned aerial vehicle can directly sense the geofence device. The geofencing device may sense the unmanned aerial vehicle and may transmit a signal to the unmanned aerial vehicle indicating the location of the geofencing device. The unmanned aerial vehicle can receive an indication of the geofence device from the air traffic control system. Based on the geofence device location, the unmanned aerial vehicle can know the location of the geofence boundary. Based on the known location of the geofence device, the unmanned aerial vehicle may be able to calculate the location of the geofence boundary. In other cases, the calculation of the geofence boundary location may be performed off-board the unmanned aerial vehicle, and the location of the geofence boundary may be transmitted to the unmanned aerial vehicle. For example, the geo-fencing device may know its own location and the type of boundary (e.g., the spatial layout of the boundary relative to the device location). The geofencing device may calculate the location of its geofence boundary and transmit the boundary information to the unmanned aerial vehicle. The boundary information can be transmitted from the geofencing device to the UAV, either directly or via one or more intermediaries (e.g., the air traffic control system). In another example, the air traffic control system can be aware of the geofence device location. The air management system can receive the geofence device location from the geofence device or from the UAV. The air traffic control system may know the type of boundary. The air management system can calculate the location of the geofence boundary and can transmit the boundary information to the unmanned aerial vehicle. The boundary information may be transmitted from the air traffic control system to the UAV directly or via one or more intermediaries. When the unmanned aerial vehicle knows the location of the geofence boundary, the unmanned aerial vehicle can compare its own location relative to the geofence boundary location.
Based on the comparison, one or more flight response actions may be taken. The unmanned aerial vehicle may be able to initiate flight response measures to be taken on its own. The unmanned aerial vehicle may have a set of flight controls stored on the unmanned aerial vehicle and may be capable of initiating flight response measures in compliance with the flight controls. Alternatively, the flight controls may be stored off-board the UAV but may be accessible by the UAV for the UAV to determine flight response actions to be taken by the UAV. In another example, the unmanned aerial vehicle does not initiate a flight response by itself, but may receive flight response instructions from an external source. Based on the location comparison, the unmanned aerial vehicle can query the external source whether a flight response measure is required, and if so, the external source can provide instructions for the flight response measure. For example, the air traffic control system may review the location comparison and determine whether flight response measures are required. If so, the air traffic control system may provide instructions to the UAV. For example, if the unmanned aerial vehicle flight path needs to deviate to avoid entering within the geofence boundary, a command to alter the flight path may be provided.
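The following sketch illustrates one way the flight response decision described above could be organized, whether it runs on the unmanned aerial vehicle or at the air traffic control system. The specific measures, threshold, and names are assumptions made for illustration.

def choose_flight_response(position, distance_to_boundary_m, warn_margin_m=50.0):
    """Pick an illustrative flight response measure from a position comparison.

    position: 'inside' or 'outside' the restricted geofenced area.
    distance_to_boundary_m: distance from the UAV to the geofence boundary.
    """
    if position == "inside":
        # Already within the restricted area: leave it immediately.
        return "exit_restricted_area"
    if distance_to_boundary_m <= warn_margin_m:
        # Approaching the boundary: alter the flight path and warn the user.
        return "alter_flight_path_and_warn"
    # No action required; the planned flight may continue.
    return "none"

print(choose_flight_response("outside", 30.0))  # alter_flight_path_and_warn
print(choose_flight_response("inside", 0.0))    # exit_restricted_area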
In another case, the unmanned aerial vehicle may be proximate to the geofencing device. The unmanned aerial vehicle may know its own location (e.g., using a GPS unit, any other sensor, or any other technique described elsewhere herein). The unmanned aerial vehicle optionally does not know the location of the geofencing device. The unmanned aerial vehicle may provide information regarding the location of the unmanned aerial vehicle to an external device. In one example, the external device is a geo-fencing device. The geo-fencing device may know its own location. The geofencing device may know the unmanned aerial vehicle location from information provided by the unmanned aerial vehicle. In an alternative embodiment, the geo-fencing device may sense the unmanned aerial vehicle and determine the unmanned aerial vehicle location based on the sensed data. The geofencing device may be capable of receiving information about the location of the UAV from additional sources, such as the air traffic control system. The geofencing device can know the location of the geofence boundary. The geo-fencing device may be capable of calculating the location of the geo-fence boundary based on the known location of the geo-fencing device. The calculation of the boundary location may also be based on the boundary type (e.g., the spatial layout of the boundary with respect to the device location). When the location of the geofence boundary is known, the geofence device may compare the location of the unmanned aerial vehicle relative to the geofence boundary location.
In another example, the external device is the air traffic control system. The air traffic control system can know the geofence device location. The air traffic control system can receive the geofence device location from the geofence device, either directly or via one or more intermediate devices. The air traffic control system can sense the geo-fencing device and determine the geo-fencing device location based on the sensed data. In some cases, information from one or more recorders may be used to determine the location of the geo-fencing device. The air traffic control system may be capable of receiving the location of the geofencing device from an additional source, such as an unmanned aerial vehicle. The air traffic control system may know the unmanned aerial vehicle location from information provided by the unmanned aerial vehicle. In an alternative embodiment, the air traffic control system may sense the unmanned aerial vehicle and determine the unmanned aerial vehicle location based on the sensed data. In some cases, information from one or more recorders may be used to determine the position of the unmanned aerial vehicle. The air traffic control system may be capable of receiving information regarding the location of the unmanned aerial vehicle from additional sources, such as the geo-fencing device. The air traffic control system can know the location of the geofence boundary. The air traffic control system can receive the location of the geofence boundary from the geofencing device or another source. The air traffic control system may be capable of calculating the location of the geofence boundary based on the known location of the geofence device. The calculation of the boundary location may also be based on the boundary type (e.g., the spatial layout of the boundary with respect to the device location). When the location of the geofence boundary is known, the air traffic control system may compare the location of the unmanned aerial vehicle relative to the geofence boundary location.
Based on the comparison, one or more flight response actions may be taken. The geofencing device or the air traffic control system may be capable of providing information to the UAV regarding the comparison. The unmanned aerial vehicle may be able to initiate flight response measures to be taken on its own. The unmanned aerial vehicle may have a set of flight controls stored on the unmanned aerial vehicle and may be capable of initiating flight response measures in compliance with the flight controls. Alternatively, the flight controls may be stored off-board the UAV but may be accessible by the UAV for the UAV to determine flight response actions to be taken by the UAV.
In another example, the unmanned aerial vehicle does not initiate a flight response by itself, but may receive flight response instructions from an external source. The external source may be a geo-fencing device or the air traffic control system. Based on the position comparison, the external source may provide instructions for flight response measures, if desired. For example, the air traffic control system may review the location comparison and determine whether flight response measures are required. If so, the air traffic control system may provide instructions to the UAV. For example, if the unmanned aerial vehicle flight path needs to deviate to avoid entering within the geofence boundary, a command to alter the flight path may be provided.
Any of the descriptions of flight controls previously provided may be applicable herein. The geofencing device may establish a location-based boundary that may contribute to flight control. Various types of flight controls may be imposed, as provided elsewhere herein. The geofence devices may be used to establish boundaries for different types of flight regulations, which may include regulations that may affect flight of the unmanned aerial vehicle (e.g., flight path, takeoff, landing), operation of a payload of the unmanned aerial vehicle, positioning of a payload of the unmanned aerial vehicle, operation of a carrier of the unmanned aerial vehicle, operation or placement of one or more sensors of the unmanned aerial vehicle, operation of one or more communication units of the unmanned aerial vehicle, navigation operation of the unmanned aerial vehicle, power usage of the unmanned aerial vehicle, and/or any other operation of the unmanned aerial vehicle.
Fig. 19 illustrates a side view of a geofencing device, a geofence boundary, and an unmanned aerial vehicle, in accordance with an embodiment of the present invention. Geofence device 1910 may be provided at any location. For example, the geo-fencing device can be provided on the object 1905 or on the surface 1925. The geofencing device can serve as a reference to one or more geofence boundaries 1920. Unmanned aerial vehicle 1930 can be proximate to a geofence device and/or a geofence boundary.
Geofence device 1910 may be set up at a location. In some cases, the geo-fencing device can be provided in a permanent or semi-permanent manner. The geo-fencing device may be substantially immovable. The geo-fencing device may not be manually movable without the aid of tools. The geo-fencing device can remain in the same location. In some cases, the geo-fencing device may be affixed or attached to the object 1905. The geo-fencing device may be built into the object.
Alternatively, the geo-fencing device can be easily removable and/or portable. The geo-fencing device can be manually moved without tools. The geo-fencing device is movable from location to location. The geo-fencing device can be removably attached to or supported by an object. In some cases, the geo-fencing device may be a handheld device. The geo-fencing device can be picked up and carried by a human. The geo-fencing device can be picked up and carried by a human hand. The geo-fencing device can be easily transportable. In some embodiments, the geo-fence device can weigh less than or equal to about 500kg, 400kg, 300kg, 200kg, 150kg, 100kg, 75kg, 50kg, 40kg, 30kg, 25kg, 20kg, 15kg, 12kg, 10kg, 9kg, 8kg, 7kg, 6kg, 5kg, 4kg, 3kg, 2kg, 1.5kg, 1kg, 750g, 500g, 300g, 200g, 100g, 75g, 50g, 30g, 20g, 15g, 10g, 5g, 3g, 2g, 1g, 500mg, 100mg, 50mg, 10mg, 5mg, or 1mg. The geo-fence device may have a volume of less than or equal to about 5m3, 3m3, 2m3, 1m3, 0.5m3, 0.1m3, 0.05m3, 0.01m3, 0.005m3, 0.001m3, 500cm3, 300cm3, 100cm3, 75cm3, 50cm3, 30cm3, 20cm3, 10cm3, 5cm3, 3cm3, 1cm3, 0.1cm3, or 0.01cm3. The geofencing device may be worn by an individual. The geofencing device can be carried in a pocket, handbag, pouch, purse, backpack, or any other item of the individual.
The geo-fencing device can be moved from location to location by an individual. For example, a user can pick up the geo-fencing device, move it to another location, and set it down. Alternatively, the user may need to detach the geo-fencing device from an existing object, then pick up the geo-fencing device, move it to another location, and attach it at the new location. Alternatively, the geo-fencing device may be self-propelled. The geo-fencing device may be mobile. For example, the geofencing device may be another unmanned aerial vehicle, or may be another vehicle (e.g., a ground-based vehicle, a water-based vehicle, an air-based vehicle, a space-based vehicle). In some cases, the geofencing device may be attached to or supported by an unmanned aerial vehicle or other vehicle. The location of the geo-fencing device may be updated and/or tracked as it moves.
In some embodiments, the geo-fencing device can be substantially stationary during use. The geofencing device can be provided on object 1905 or on surface 1925. The object may be a naturally occurring object or a man-made object. Examples of naturally occurring objects may include trees, shrubs, stones, hills, mountains, or any other naturally occurring object. Examples of man-made objects may include structures (e.g., buildings, bridges, columns, fences, walls, piers, buoys) or any other man-made object. In one example, the geo-fencing device may be provided on a structure such as a roof of a building. The surface may be a naturally occurring surface or may be a man-made surface. Examples of surfaces may include ground surfaces (e.g., terrain, dirt, gravel, asphalt, roads, floors) or water-based surfaces (e.g., lakes, seas, rivers, streams).
Alternatively, the geo-fencing device may be an unmanned aerial vehicle docking station. The geo-fence device may be secured to an unmanned aerial vehicle docking station. The geofence device may be placed on or supported by the unmanned aerial vehicle docking station. The geo-fencing device may be part of, or may be integrally formed with, the unmanned aerial vehicle docking station. The unmanned aerial vehicle docking station may allow one or more unmanned aerial vehicles to land on or be supported by the docking station. The unmanned aerial vehicle docking station may include one or more landing zones operable to bear the weight of the unmanned aerial vehicle. The unmanned aerial vehicle docking station may provide power to the unmanned aerial vehicle. In some cases, the unmanned aerial vehicle docking station may be used to charge one or more power cells (e.g., batteries) on the unmanned aerial vehicle. The unmanned aerial vehicle docking station may be used to swap out a power unit from the unmanned aerial vehicle with a new power unit. The new power unit may have a higher energy capacity or state of charge. The unmanned aerial vehicle docking station may be capable of performing repairs on the unmanned aerial vehicle or providing spare parts for the unmanned aerial vehicle. The unmanned aerial vehicle docking station may accept items carried by the unmanned aerial vehicle or may store items that may be picked up to be carried by the unmanned aerial vehicle.
The geo-fencing device may be any type of device. The device may be a computer (e.g., personal computer, laptop, server), a mobile device (e.g., smartphone, cellular phone, tablet, personal digital assistant), or any other type of device. The device may be a network device capable of communicating over a network. The apparatus includes one or more memory storage units, which may include a non-transitory computer-readable medium that may store code, logic, or instructions for performing one or more of the steps described elsewhere herein. The apparatus may include one or more processors that may perform one or more steps, individually or collectively, in accordance with code, logic, or instructions of a non-transitory computer readable medium as described herein.
When a device provides a reference point for a set of boundaries associated with a set of flight controls, the device may become a geo-fencing device. In some cases, a device may be a geo-fencing device when software or an application is running on the device that can provide the location of the device as a reference point for a set of boundaries associated with a set of flight controls. For example, a user may have a device, such as a smartphone, that performs additional functions. The application may be downloaded to a smart phone that may communicate with the air traffic system, another component of the authentication system, or any other system. The application may provide the location of the smartphone to an air traffic control system and indicate that the smartphone is a geo-fencing device. Thus, the location of the smartphone may be known and may be used to determine a boundary that may be associated with a restriction. The device may already have a locator or may use a positioning system to determine the location of the device. For example, the location of a smartphone and/or tablet computer or other mobile device may be determined. The location of the mobile device can be utilized to provide a reference point as a geo-fencing device.
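As a hypothetical illustration of how an application could declare a mobile device as a geo-fencing device, the sketch below merely constructs the announcement message such an application might report; the field names and values are invented for illustration, and no particular network interface of the air traffic control system is implied.

import json
import time

def build_geofence_announcement(device_id, lat, lon, boundary_type="circle", radius_m=200.0):
    """Build the message a mobile application might report to declare the device
    a geo-fencing device whose location serves as a boundary reference point."""
    return {
        "geofence_device_id": device_id,          # unique geofence identifier
        "location": {"lat": lat, "lon": lon},     # current device location
        "boundary": {"type": boundary_type, "radius_m": radius_m},
        "timestamp": int(time.time()),
    }

message = build_geofence_announcement("GF-SMARTPHONE-0001", 22.5431, 114.0579)
print(json.dumps(message, indent=2))  # payload that would be reported to the system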
In some embodiments, the geo-fencing device may be designed to be provided in an outdoor environment. Geofence devices can be designed to withstand various climates. The geo-fencing device can have a housing that can partially or completely enclose one or more components of the geo-fencing device. The housing may protect the one or more components from wind, dust, or precipitation (e.g., rain, snow, hail, ice). The housing of the geo-fencing device may or may not be air tight and may or may not be water tight. A housing of a geofencing device can enclose one or more processors of the geofencing device. A housing of a geofencing device can enclose one or more memory storage units of the geofencing device. A housing of a geofencing device can enclose a locator of the geofencing device.
In some implementations, the geo-fencing device can be a remote control configured to receive user input. The remote controller can control the operation of the unmanned aerial vehicle. This may be useful when the geofence boundary is used to allow operation of the unmanned aerial vehicle within the geofence boundary but to limit operation of the unmanned aerial vehicle outside the geofence boundary. For example, an unmanned aerial vehicle may only be allowed to fly within geofence boundaries. If the UAV approaches or crosses the boundary, the flight path of the UAV may be altered to keep the UAV within the geofence boundary. If the UAV is only allowed to fly within the geofence boundary, this may keep the UAV within a specified distance from the remote control. This may help the user to more easily monitor the unmanned aerial vehicle. This may prevent the unmanned aerial vehicle from getting lost by flying beyond the desired distance. If the geo-fencing device is a remote control, the user may be able to walk around and the geofence boundary may move with the remote control. Thus, the user may have some degree of freedom to freely traverse the region while the unmanned aerial vehicle remains within a desired boundary relative to the user.
The geofence boundary may include one or more lateral boundaries. For example, a geofence boundary may be a two-dimensional region that may define lateral dimensions of a space within the geofence boundary and a space outside the geofence boundary. The geofence boundary may or may not include an altitude boundary. The geofence boundary may define a three-dimensional volume.
Fig. 19 provides an illustration in which geofence boundary 1920 can include a lateral aspect and a height aspect. For example, one or more lateral boundaries may be provided. Upper and/or lower height limits may be provided. For example, an upper height limit may define the top of the boundary. The lower height limit may define a bottom of the boundary. The boundary may be substantially flat, or may be curved, sloped, or have any other shape. Some boundaries may have a cylindrical, prismatic, conical, spherical, hemispherical, bowl-shaped, annular, tetrahedral, or any other shape. In one illustration, the unmanned aerial vehicle may not be allowed to fly within the geofence boundary. The unmanned aerial vehicle can fly freely outside the geofence boundary. Thus, the unmanned aerial vehicle can fly above the upper altitude limit provided in fig. 19.
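A minimal sketch of checking whether a position lies inside a cylindrical geofence volume with upper and lower altitude limits, as described above, might look as follows (the local metric coordinates and function names are illustrative assumptions):

import math

def inside_cylindrical_geofence(uav_xyz, device_xy, radius_m, floor_m, ceiling_m):
    """Check whether a UAV position lies inside a cylindrical geofence volume.

    uav_xyz: (x, y, altitude) of the UAV in a local metric frame.
    device_xy: (x, y) of the geofence device in the same frame.
    floor_m / ceiling_m: lower and upper altitude limits of the boundary.
    """
    x, y, alt = uav_xyz
    lateral_inside = math.hypot(x - device_xy[0], y - device_xy[1]) <= radius_m
    vertical_inside = floor_m <= alt <= ceiling_m
    return lateral_inside and vertical_inside

# A UAV flying above the upper altitude limit is outside the restricted volume
# and may therefore fly over it, as in the example described for fig. 19.
print(inside_cylindrical_geofence((50.0, 20.0, 180.0), (0.0, 0.0), 300.0, 0.0, 120.0))  # False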
In some implementations, a geo-fencing system can be provided. The geofencing system can be a subsystem of the air traffic control system. In some cases, the air traffic control system may include a geo-fencing module that may perform one or more of the steps described herein. Any description herein of a geofence system may apply to a geofence module, which may be part of the air traffic control system. The geo-fence module may be part of an authentication system. Alternatively, the geofencing system may be separate and/or independent from the authentication system or the air traffic control system.
The geofencing system can accept a request for setting up a geofencing device. For example, when the geo-fencing device is part of an unmanned aircraft system, it may be identified and/or tracked. The geo-fencing device can have a unique identity. For example, a geo-fence device can have a unique geo-fence identifier that can uniquely identify and/or distinguish the geo-fence device from other geo-fence devices. Identity information about the geo-fencing device can be aggregated. Such information may include information about the type of geo-fencing device. The geo-fence device identifier can be used to determine a geo-fence device type. Further description regarding geofence device types may be provided elsewhere herein.
In some implementations, the geo-fence device identifier can be provided from an ID registry. The ID registry may also provide a user identifier and/or an unmanned aerial vehicle identifier (e.g., ID registry 210 illustrated in fig. 2). Alternatively, the geo-fence device may use an ID registry that is separate from that of the unmanned aerial vehicle and/or the user. Accordingly, a geo-fencing device can be identified. When a setup request for a geofencing device is processed by the geofencing system, the geofencing device can be identified.
In some implementations, the geo-fencing device can also be authenticated. Authenticating the geo-fence device may include confirming that the geo-fence device is the geo-fence device indicated by the geo-fence device identifier. Any authentication technique may be used to authenticate the geo-fence device. Any technique for authenticating the unmanned aerial vehicle and/or the user can be used to authenticate the geo-fence device. The geo-fence device can have a geo-fence device key. The geo-fence device key may be used during an authentication process. In some cases, an AKA procedure may be used to assist in authentication of the geo-fence device. Further possible processes for authenticating geo-fence devices are described in more detail elsewhere herein. The geofencing system can prevent cloning of the geofencing device. The geofencing system can prevent cloned geofencing devices from being authenticated by the air traffic control system or an unmanned aerial vehicle. The authenticated geofencing device can be used by the air traffic control system and can communicate with the unmanned aerial vehicle.
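The disclosure does not fix a particular authentication protocol; the sketch below shows a generic challenge-response exchange built on a shared geofence device key, loosely in the spirit of the AKA-style procedure mentioned above. All names are illustrative, and a real deployment would involve key provisioning and message transport that are not shown.

import hashlib
import hmac
import os

def issue_challenge():
    """Authentication side: generate a random challenge for the geofence device."""
    return os.urandom(16)

def device_response(device_key, challenge):
    """Geofence device side: prove possession of the geofence device key
    without transmitting the key itself."""
    return hmac.new(device_key, challenge, hashlib.sha256).digest()

def verify_device(registered_key, challenge, response):
    """Authentication side: recompute the expected response and compare it
    in constant time to resist cloned or spoofed devices."""
    expected = hmac.new(registered_key, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)

shared_key = os.urandom(32)  # provisioned at registration, held in the device module
challenge = issue_challenge()
print(verify_device(shared_key, challenge, device_response(shared_key, challenge)))  # True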
The geofence system can track the identity of geofence devices that have undergone a registration process with the geofencing system. The geo-fence device can be authenticated prior to successfully registering the geo-fence device with the geofencing system. In some cases, the geo-fence device is registered once with the geofencing subsystem. Alternatively, registration may occur multiple times. A geo-fence device may be identified and/or authenticated each time it is powered on. In some cases, the geo-fencing device may remain powered on during use. In some cases, the geo-fencing device may be powered down. When the geo-fencing device is powered off and then powered back on, it may undergo an identification and/or authentication process to set up in the system. In some cases, only currently powered geofence devices are tracked by the system. Data relating to geofence devices that were once set up but are not currently powered can be stored by the system. When a device is powered down, it does not need to be tracked.
The geofencing system can examine and determine the effective spatial range, duration, and/or level of restriction of the geofencing device. For example, the location of the geo-fencing device can be tracked. In some cases, the geo-fencing devices may report their location themselves. In some cases, the geo-fencing device may have a location tracker, such as a GPS unit or one or more sensors. The geo-fencing device can transmit information regarding the location of the geo-fencing device to the geo-fencing system. The location may include coordinates of the geo-fencing device, such as global coordinates or local coordinates.
The geofencing system can record geofence boundaries for each of the geofencing devices. The geo-fencing devices may have the same type of boundary or may have different types of boundaries. For example, the boundaries may be different from device to device. The geofencing system can record the boundary type and location of the geofencing device. Thus, the geofencing system may be able to determine the location of the boundary of the geofencing device. The effective spatial extent of the geo-fencing device may be known by the system.
The duration of the geofence device boundary may be known. In some implementations, the geofence boundary can remain static over time. The geofence boundary can remain in effect for as long as the geofence device is powered on. In other cases, the geofence boundary may change over time. Even when the geofence device is powered on, the geofence boundary may have the same range, but may or may not be in effect. For example, from 2 pm to 5 pm each day, a geofence boundary may be provided, while during the remaining hours, the geofence boundary is not in effect. The shape and/or size of the geofence boundary may change over time. The change in geofence boundaries may be based on a time of day, a day of week, a day of month, a week of month, a quarter, a season, a year, or any other time-related factor. The change may be regular or periodic. Alternatively, the change may be irregular. In some cases, a schedule may be provided that changes to the geofence boundaries may follow. Further examples and illustrations of changing geofence boundaries are provided in more detail elsewhere herein.
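One simple way to model a time-varying boundary of the kind described above is to attach a schedule of active time windows to it and evaluate the schedule when the boundary is queried; the sketch below is an illustrative assumption rather than the disclosed implementation.

from datetime import datetime, time

def boundary_active(schedule, at=None):
    """Check whether a geofence boundary is in effect at a given moment.

    schedule: list of (start, end) time-of-day windows during which the
    boundary applies; an empty list means the boundary is always in effect.
    """
    moment = (at or datetime.now()).time()
    if not schedule:
        return True
    return any(start <= moment <= end for start, end in schedule)

# A boundary that is in effect only from 2 pm to 5 pm each day.
afternoon_only = [(time(14, 0), time(17, 0))]
print(boundary_active(afternoon_only, at=datetime(2020, 10, 13, 15, 30)))  # True
print(boundary_active(afternoon_only, at=datetime(2020, 10, 13, 9, 0)))    # False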
The level of the geofencing device can be known by the geofencing subsystem. The earlier description of the levels of various flight controls can be applied to the levels of geofencing devices. For example, if multiple geo-fencing devices have overlapping spatial ranges, the overlapping ranges may be treated according to rank. For example, flight regulations pertaining to geofence devices having higher levels may be applicable to the overlapping areas. Alternatively, more restrictive flight controls may be used in the overlap region.
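The overlap handling described above could be sketched as follows; the regulation fields and the use of a maximum-altitude value as a stand-in for restrictiveness are illustrative assumptions.

def resolve_overlap(regulations):
    """Resolve overlapping geofence regulations for a shared region.

    Each regulation has a 'level' (higher-level devices take precedence) and a
    'max_altitude_m' used here as a stand-in for how restrictive it is.
    """
    # Strategy 1: apply the regulation of the highest-level geofence device.
    by_level = max(regulations, key=lambda r: r["level"])
    # Strategy 2 (alternative): apply the most restrictive regulation.
    most_restrictive = min(regulations, key=lambda r: r["max_altitude_m"])
    return by_level, most_restrictive

overlapping = [
    {"device": "GF-A", "level": 1, "max_altitude_m": 60.0},
    {"device": "GF-B", "level": 3, "max_altitude_m": 200.0},
]
chosen, strictest = resolve_overlap(overlapping)
print(chosen["device"], strictest["device"])  # GF-B GF-A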
The geofencing system can determine how the geofencing device is announced. In some cases, the geo-fencing device may emit a signal. The signal can be used to detect the geo-fencing device. The unmanned aerial vehicle may be capable of detecting a signal from the geofencing device in order to detect the geofencing device. Alternatively, the unmanned aerial vehicle may not be able to directly detect the geo-fence device, but the geo-fence system may be able to detect the geo-fence device. A recorder, such as a recorder described elsewhere herein, may be capable of detecting the geo-fencing device. The air traffic control system may be capable of detecting the geo-fencing device. The geo-fencing device can be announced in any manner. For example, a geo-fencing device may be announced using an electromagnetic signal or an audible or visual signal. The signal from the geo-fence device may be detected by means of a visual sensor, an infrared sensor, an ultraviolet sensor, a sound sensor, a magnetometer, a radio receiver, a WiFi receiver, or any other type of sensor or receiver. The geofencing system can track which geofencing devices use which type of signal. The geofencing system may inform one or more other devices or systems (e.g., unmanned aerial vehicles) as to what type of signal the geofencing device provides so that the correct sensor may be used to detect the geofencing device. The geofencing system may also track information such as the frequency range, bandwidth, and/or protocols used in transmitting the signal.
The geofencing system can manage a pool of resources for unmanned aerial vehicle flight based on information from the geofencing devices. The geofencing device may impose one or more regulations on the operation of the UAV. For example, unmanned aerial vehicle flight can be restricted based on the geo-fencing device. An example of a resource may be the available airspace. The available airspace may be limited based on the location and/or boundary of the geo-fencing device. The available airspace information may be used by the air traffic control system in allocating resources for the unmanned aerial vehicle. The available airspace may be updated in real time. For example, the geo-fencing device can be turned on or off, can be added or removed, can be moved, or the boundaries of the geo-fencing device can change over time. Thus, the available airspace may change over time. The available airspace may be continuously or periodically updated. The available airspace can be updated at regular or irregular intervals or according to a schedule. The available airspace may be updated in response to an event, such as a request for a resource. In some cases, the available airspace may be predicted over time. For example, if the geo-fence device schedule is known in advance, some changes in airspace may be predictable. Thus, when a user requests resources such as airspace in the future, the predicted available airspace may be evaluated. In some embodiments, different levels may be provided. For example, different levels of operation by the user may be provided. Different resources may be available to the user based on the user's operational level. For example, some geofence restrictions may apply only to certain users and not to other users. The user type may affect the available resources. Another example of a level may include an unmanned aerial vehicle type. The type of unmanned aerial vehicle may affect the available resources. For example, some geofence restrictions may only apply to certain UAV models and not to other UAV models.
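As an illustrative sketch of managing available airspace as a resource pool, the code below marks grid cells of airspace as blocked or available based on active geofence regions and a user level; the data layout and field names are assumptions for illustration.

import math

def available_airspace(grid_cells, geofence_regions, user_level=0):
    """Mark airspace grid cells as available or blocked.

    grid_cells: iterable of (x, y) cell centers describing the managed airspace.
    geofence_regions: list of dicts with 'center', 'radius', and 'min_user_level';
    a restriction is ignored for users at or above its 'min_user_level'.
    """
    result = []
    for cx, cy in grid_cells:
        blocked = False
        for region in geofence_regions:
            if user_level >= region["min_user_level"]:
                continue  # this restriction does not apply to this user class
            gx, gy = region["center"]
            if math.hypot(cx - gx, cy - gy) <= region["radius"]:
                blocked = True
                break
        result.append(((cx, cy), "blocked" if blocked else "available"))
    return result

cells = [(0, 0), (100, 0), (200, 0)]
regions = [{"center": (0, 0), "radius": 150, "min_user_level": 2}]
print(available_airspace(cells, regions, user_level=0))
# [((0, 0), 'blocked'), ((100, 0), 'blocked'), ((200, 0), 'available')]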
When a user wishes to operate the unmanned aerial vehicle, a request for one or more resources may be made. In some cases, a resource may include certain spaces over a period of time. The resources may include devices, such as those described elsewhere herein. Based on the available resources, the flight plan may be accepted or rejected. In some cases, some modifications may be provided to the flight plan to comply with the available resources. Geofence device information can be useful in determining resource availability. Geofence device information may be useful in determining whether to accept, reject, or modify a proposed flight plan.
A user can interact with the geo-fencing system. The user can query the geo-fencing system regarding the assignment of resources. For example, a user may request the status of available airspace or other resource assignments. The user may request the status of available airspace assignments corresponding to the user's level (e.g., operation level, user type). The user may request the status of available airspace assignments corresponding to the type of unmanned aerial vehicle or other characteristics. In some cases, the user may retrieve information about the resource assignment from the geo-fencing system. In some cases, the information can be presented in a graphical format. For example, a map may be provided showing the available airspace. The map may show the available airspace at the current point in time when the user makes a query, or may predict the available airspace at a future point in time specified by the user's query. The map may show the location and/or boundaries of the geo-fencing device. Further description of the user interface, which can show the geo-fencing devices and/or available resources, is provided in more detail elsewhere herein (e.g., fig. 35).
In some embodiments, a violation countermeasure system may be provided. The violation countermeasure system may be a subsystem of the air traffic control system. The air traffic control system may include a violation countermeasure module that may perform one or more of the actions described herein. Any description herein of the violation countermeasure system may apply to the violation countermeasure module, which may be part of the air traffic control system. The violation countermeasure module can be part of the authentication system. Alternatively, the violation countermeasure system can be separate from and/or independent of the authentication system or the air traffic control system.
The violation countermeasure system may track unmanned aerial vehicle activity. For example, the position of the unmanned aerial vehicle may be tracked. The position of the unmanned aerial vehicle may include an orientation of the unmanned aerial vehicle. Tracking the position of the unmanned aerial vehicle may also include tracking movement (e.g., translational velocity, translational acceleration, angular velocity, angular acceleration) of the unmanned aerial vehicle. Other operations of the UAV may be tracked, such as operation of the payload, positioning of the payload, operation of the carrier, operation of one or more UAV sensors, operation of the communication unit, operation of the navigation unit, power loss, or any other activity of the UAV. The violation countermeasure system may detect when the unmanned aerial vehicle is behaving anomalously. The violation countermeasure system may detect when the unmanned aerial vehicle is performing an action that does not comply with a set of flight controls. In determining whether the UAV is compliant or non-compliant with the set of flight regulations, a user identity and/or a UAV identity may be considered. Geofence data may be considered in determining whether the UAV is compliant or non-compliant with the set of flight regulations. For example, the violation countermeasure system may detect when an unauthorized unmanned aerial vehicle is present in restricted airspace. The restricted airspace may be provided within the boundaries of the geofencing device. The user and/or the unmanned aerial vehicle may not be authorized to enter the restricted airspace. However, the presence of an unmanned aerial vehicle approaching or entering the restricted airspace may be detected. Unmanned aerial vehicle activity can be tracked in real time. Unmanned aerial vehicle activity may be continuously tracked, periodically tracked, tracked according to a schedule, or tracked in response to detected events or conditions.
The violation countermeasure system may send an alert when the unmanned aerial vehicle is engaging in, or is about to engage in, an activity that does not comply with a set of flight controls for the unmanned aerial vehicle. For example, if an unauthorized unmanned aerial vehicle is about to enter restricted airspace, an alert may be provided. The alert can be provided in any manner. In some cases, the alert may be an electromagnetic alert or an audible and visual alert. A warning may be provided to a user of the unmanned aerial vehicle. The alert may be provided via a user terminal such as a remote control. An alert may be provided to the air traffic control system and/or the UAV. A user may be provided with an opportunity to alter the behavior of the unmanned aerial vehicle to cause the unmanned aerial vehicle to comply with the flight regulations. For example, if the unmanned aerial vehicle is approaching restricted airspace, the user may have some time to alter the path of the unmanned aerial vehicle to avoid the restricted airspace. Alternatively, the user may not be provided with the opportunity to change the behavior of the unmanned aerial vehicle.
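An illustrative sketch of the alert logic described above, with a warning margin outside the restricted airspace so the user has a chance to alter the flight path before a violation occurs (the geometry, margin, and return values are assumptions):

import math

def check_and_alert(uav_pos, restricted_center, restricted_radius_m, warn_margin_m=100.0):
    """Illustrative violation check: warn before entry, flag a violation after.

    Returns 'clear', 'warning' (approaching the restricted airspace, so the user
    still has a chance to alter the path) or 'violation' (inside the airspace).
    """
    d = math.hypot(uav_pos[0] - restricted_center[0],
                   uav_pos[1] - restricted_center[1])
    if d <= restricted_radius_m:
        return "violation"
    if d <= restricted_radius_m + warn_margin_m:
        return "warning"
    return "clear"

print(check_and_alert((350.0, 0.0), (0.0, 0.0), 300.0))  # warning
print(check_and_alert((100.0, 0.0), (0.0, 0.0), 300.0))  # violation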
The violation countermeasure system may enable flight response measures to be implemented by the UAV. The flight response measures may be effective to cause the UAV to comply with the set of flight regulations. For example, if an unmanned aerial vehicle has entered a restricted area, the flight path of the unmanned aerial vehicle may be altered to allow the unmanned aerial vehicle to immediately leave the restricted area, or to allow the unmanned aerial vehicle to land. The flight response action may be a mandatory action that may override one or more user inputs. The flight response measure may be a mechanical, electromagnetic, or audible/visual measure, or may take over control of the unmanned aerial vehicle. If the alert is not heeded, the measure may cause the UAV to be driven away, captured, or even destroyed. For example, the measure may automatically result in a modification to the flight path of the UAV. The measure can make the unmanned aerial vehicle automatically land. The measure can power off the unmanned aerial vehicle or cause it to self-destruct. Any other flight response measures may be employed, such as those described elsewhere herein.
The violation countermeasure system may record and track information about unmanned aerial vehicle activity. Various types of information about the UAV may be recorded and/or stored. In some embodiments, the information may be stored in a memory storage system. All information about the unmanned aerial vehicle activity may be stored. Alternatively, a subset of the information about the unmanned aerial vehicle activity may be stored. In some cases, the recorded information may facilitate retrospective tracing. The recorded information may be used for jurisdictional purposes. In some cases, the recorded information may be used for penalty actions. For example, an event may occur. The recorded information relating to the event can be traced back. The information may be used to determine details of how or why the event occurred. If the event is an accident, the information may be used to determine the cause of the accident. The information may be used to assign fault for the accident. For example, if a party is responsible for the accident, the information may be used to determine that the party is at fault. If the party is at fault, a penalty action can be implemented. In some cases, multiple parties may share different degrees of fault. Penalty actions may be assigned according to the recorded information. In another example, the event may be an action by the UAV that does not comply with a set of flight regulations. For example, an unmanned aerial vehicle may fly through areas where photography is not permitted. However, the UAV may have captured images using a camera. After issuing the alert, the UAV may continue to capture images in some manner. The information may be analyzed to determine how long the UAV captured images or the types of images captured. The event may be an abnormal behavior exhibited by the unmanned aerial vehicle. If the unmanned aerial vehicle exhibits abnormal behavior, the information may be analyzed to determine a cause of the abnormal behavior. For example, if an unmanned aerial vehicle performs an action that does not match a command issued by a user remote control, the information may be analyzed to determine how or why the unmanned aerial vehicle performed the action.
In some embodiments, the recorded information may be unalterable. Alternatively, a private user may not be able to alter the recorded information. In some cases, only an operator or administrator of the memory storage system and/or the violation countermeasure system may be able to access the recorded information.
Type of communication
The unmanned aerial vehicle and the geo-fencing system may interact in the unmanned aerial vehicle system. The geo-fence device may provide one or more geo-fence boundaries that may affect the available airspace for the unmanned aerial vehicle and/or activities that the unmanned aerial vehicle may or may not perform while in the airspace.
Fig. 39 illustrates different types of communications between an unmanned aerial vehicle and a geofencing device, in accordance with an embodiment of the present invention. The geo-fencing device may be online 3910 or may be offline 3920. The geo-fence device may receive signals 3930 from the unmanned aerial vehicle only, may transmit signals 3940 to the unmanned aerial vehicle only, or may both transmit and receive signals 3950 from the unmanned aerial vehicle.
The geo-fence device may be online 3910 when connected to (e.g., in communication with) the authentication center. The geo-fence device may be online when connected to (e.g., in communication with) any part of the authentication system. The geo-fence device can be online when the geo-fence device is connected to (e.g., in communication with) the air traffic control system or a module thereof (e.g., the geo-fencing module or the violation countermeasure module). The geo-fence device may be online when connected to a network. The geo-fence device may be online when it is directly connected to another device. The geo-fencing device may be online when the geo-fencing device is capable of communicating with another device or system.
The geo-fence device may be offline 3920 when the geo-fence device is not connected to (e.g., not in communication with) the authentication center. The geo-fence device may be offline when the geo-fence device is not connected to (e.g., not in communication with) any part of the authentication system. The geo-fence device may be offline when the geo-fence device is not connected to (e.g., not in communication with) the air traffic control system or a module thereof (e.g., the geo-fencing module or the violation countermeasure module). The geo-fence device may be offline when the geo-fence device is not connected to a network. The geo-fencing device may be offline when the geo-fencing device is not directly connected to another device. The geo-fence device may be offline when the geo-fence device is unable to communicate with another device or system.
The geofencing device may be in communication with the UAV. Communication between the geofencing device and the UAV can occur in a variety of ways. For example, communication may occur via a channel, a signaling scheme, a multiple access mode, a signal format, or a signaling format. The communication between the geo-fence device and the unmanned aerial vehicle may be direct or may be indirect. In some cases, only direct communication may be employed, only indirect communication may be employed, or both direct and indirect communication may be employed. Further examples and details regarding direct and indirect communications are provided elsewhere herein.
Indirect communication may be used when the geo-fencing device receives signal 3930 only from the UAV. When the geo-fencing device is online, the indirect communication can include a signal to the unmanned aerial vehicle. For example, a network may be employed to transmit signals from the geo-fencing device to the unmanned aerial vehicle. When the geo-fencing device is offline, the indirect communication can include the recorded presence of the UAV. The geo-fence device may be capable of detecting the presence of the unmanned aerial vehicle or receiving an indirect communication of the presence of the unmanned aerial vehicle.
Direct communication may be used when the geo-fencing device only sends signal 3940 to the UAV. Direct communication can be used regardless of whether the geo-fencing device is online or offline. The geo-fence device may be capable of communicating directly with the unmanned aerial vehicle even if the geo-fence device is not in communication with the authentication system or a component thereof. The geofencing device may send a direct communication to the UAV. The geo-fencing device may provide direct communication via wireless signals. The direct communication may be an electromagnetic signal, an audible or visual signal, or any other type of signal.
When the geo-fencing device is both sending and receiving signals 3950 (e.g., in two-way communication) with the unmanned aerial vehicle, direct communication or indirect communication may be used. In some cases, both direct and indirect communication may be used. The geofencing device and the UAV may switch between using direct communication and indirect communication. Direct communication or indirect communication can be used regardless of whether the geo-fencing device is online or offline. In some implementations, direct communication may be used for a portion of the two-way communication from the geo-fence device to the unmanned aerial vehicle, while indirect communication may be used for a portion of the two-way communication from the unmanned aerial vehicle to the geo-fence device. For a portion of the two-way communication from the UAV to the geofencing device, the indirect communication may include a signal to the UAV while the geofencing device is online, and may include a recorded presence of the UAV while the geofencing device is offline. Alternatively, direct communication and indirect communication may be used interchangeably regardless of direction.
In some embodiments, communication rules may be stored in memory on the geo-fencing device. Optionally, one or more rules regarding one or more sets of flight controls can be stored on the geo-fencing device. The geo-fencing device may or may not be capable of connecting to a network, such as the internet, any other WAN, LAN, telecommunications network, or data network. The geo-fencing device need not store the rules in memory if the geo-fencing device can connect to a network. For example, the communication rules need not be stored on the geo-fencing device. Alternatively, one or more rules regarding one or more sets of flight controls need not be stored on the geo-fencing device. The geo-fencing device may access rules stored on a separate device or memory over a network.
The geo-fence device memory may store geo-fence identification and/or authentication information. For example, the geo-fence device memory can store a geo-fence device identifier. The memory may store a geo-fence device key. The associated algorithm may be stored. The geo-fence device identifier and/or key may not be changed. Alternatively, the geo-fence device identifier and/or the key may be externally unreadable. The geo-fence device identifier and/or the key can be stored in a module, which can be non-separable from the geo-fence device. The module cannot be removed from the geo-fencing device without compromising the functionality of the geo-fencing device. In some cases, the geo-fence identification and/or authentication information may be stored on the geo-fence device regardless of whether the geo-fence device may access the network.
The geo-fencing device may include a communication unit and one or more processors. The one or more processors can individually or collectively perform any step or function of the geo-fencing device. The communication unit may allow direct communication, indirect communication, or both. The communications unit and the one or more processors can be provided on the geo-fenced device regardless of whether the geo-fenced device can access the network.
In some embodiments, the unmanned aerial vehicle may be offline or online. The unmanned aerial vehicle may be offline (e.g., not connected to the authentication system). When offline, the unmanned aerial vehicle may not communicate with any component of the authentication system, such as the authentication center, the air traffic control system, or a module of the air traffic control system (e.g., the geo-fencing module or the violation countermeasure module). The unmanned aerial vehicle may be offline when the unmanned aerial vehicle is not connected to the network. The unmanned aerial vehicle may be offline when the unmanned aerial vehicle is not directly connected to another device. The UAV may be offline when the UAV is unable to communicate with another device or system.
When the UAV may be offline, a digital signature method may be used in the communication. Certificate issuance and usage can be used for communication. Such an approach may provide some measure of security for communication with the UAV. Such security may be provided without requiring the unmanned aerial vehicle to communicate with the authentication system.
The unmanned aerial vehicle may be online when connected to (e.g., in communication with) any component of the authentication system, such as the authentication center, the air traffic control system, or a module thereof (e.g., the geo-fencing module or the violation countermeasure module). The unmanned aerial vehicle may be online when the unmanned aerial vehicle is connected to a network. The unmanned aerial vehicle may be online when the unmanned aerial vehicle is directly connected to another device. The unmanned aerial vehicle may be online when the unmanned aerial vehicle is capable of communicating with another device or system.
When the unmanned aerial vehicle is online, various communication methods or techniques may be employed. For example, the unmanned aerial vehicle and/or the user can receive the geo-fence signal, and authentication can be performed at an authentication center of the authentication system. The authentication may confirm that the geo-fence device is authentic and authorized. In some cases, the geofencing device may be confirmed to comply with legal standards. In some cases, the authenticated geofencing device may inform the unmanned aerial vehicle and/or the user about one or more sets of flight controls. The air traffic control system can inform the unmanned aerial vehicle and/or the user about one or more sets of flight restrictions imposed in response to the authenticated geofencing device.
Fig. 20 illustrates a system in which a geo-fencing device transmits information directly to an unmanned aerial vehicle, according to an embodiment of the present invention. Geofence device 2010 may transmit signal 2015, which may be received by unmanned aerial vehicle 2030. The geo-fencing device can have a geofence boundary 2020. The geofencing device may include a communication unit 2040, a memory unit 2042, a detector 2044, and one or more processors 2046. The communication unit may be used to transmit signals. The detector may be used to detect the presence 2050 of the UAV.
Geofence device 2010 may broadcast wireless signal 2015. The broadcast may occur continuously. The broadcast may occur independent of any detected condition. The broadcast mode may advantageously be simple. Alternatively, the broadcast of the signal may occur when an approaching unmanned aerial vehicle 2030 is detected. At other times, the broadcast need not occur. This may advantageously save radio resources. The geofence device may remain hidden until the unmanned aerial vehicle is detected.
Aspects of the invention may relate to a geo-fencing device 2010 including: a communication module 2040 configured to transmit information within a predetermined geographic range of the geofencing device; and one or more storage units 2042 configured to store or receive one or more sets of flight restrictions for the predetermined geographic range of the geo-fence device, wherein when an unmanned aerial vehicle enters the predetermined geographic range of the geo-fence device, the communications module is configured to transmit a set of flight restrictions from the one or more sets of flight restrictions to the unmanned aerial vehicle. A method of providing a set of flight controls to an unmanned aerial vehicle may be provided, the method comprising: storing or receiving, in one or more memory units of a geofence device, one or more sets of flight restrictions for a predetermined geographic range of the geofence device; and transmitting a set of flight restrictions from the one or more sets of flight restrictions to the unmanned aerial vehicle when the unmanned aerial vehicle enters the predetermined geographic range of the geofence device via a communications module configured to transmit information within the predetermined geographic range of the geofence device.
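A toy model of the behavior described above, with all class and parameter names invented for illustration, might look as follows: the device stores one or more sets of flight restrictions and transmits a set only when a detected unmanned aerial vehicle is within the predetermined range.

import math

class GeofenceDeviceSketch:
    """Toy model of the behavior described above: store sets of flight
    restrictions and transmit one set when a UAV enters the predetermined range."""

    def __init__(self, location, predetermined_range_m, flight_restriction_sets):
        self.location = location
        self.range_m = predetermined_range_m
        self.flight_restriction_sets = flight_restriction_sets  # stored or received sets

    def uav_in_range(self, uav_location):
        dx = uav_location[0] - self.location[0]
        dy = uav_location[1] - self.location[1]
        return math.hypot(dx, dy) <= self.range_m

    def on_uav_detected(self, uav_location, transmit):
        """Invoke the supplied transmit function only when the UAV is in range."""
        if self.uav_in_range(uav_location):
            transmit(self.flight_restriction_sets[0])

device = GeofenceDeviceSketch((0.0, 0.0), 500.0, [{"no_fly": True, "max_altitude_m": 0.0}])
device.on_uav_detected((120.0, 40.0), transmit=print)  # prints the restriction set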
Geofence device 2010 may detect the presence of unmanned aerial vehicle 2030. The detector 2044 of the geo-fence device may be used to detect the presence of the unmanned aerial vehicle.
In some implementations, the geo-fencing device may detect the unmanned aerial vehicle by identifying the unmanned aerial vehicle via visual information. For example, the geo-fencing device may visually detect and/or identify the presence of an unmanned aerial vehicle. In some cases, a camera or other form of visual sensor may be provided as a detector of the unmanned aerial vehicle. The camera may be capable of detecting the unmanned aerial vehicle when the unmanned aerial vehicle enters within a predetermined range of the geo-fencing device. In some cases, the detector of the geo-fencing device may include multiple cameras or visual sensors. Multiple cameras or vision sensors may have different fields of view. The camera may capture an image of the unmanned aerial vehicle. The image may be analyzed to detect the unmanned aerial vehicle. In some cases, the image may be analyzed to detect the presence or absence of the unmanned aerial vehicle. The image may be analyzed to determine an estimated distance of the unmanned aerial vehicle from the geofencing device. The images may be analyzed to detect the type of unmanned aerial vehicle. For example, different models of unmanned aerial vehicles may be distinguished.
In identifying unmanned aerial vehicles, information from any portion of the electromagnetic spectrum may be employed. For example, in addition to the visible spectrum, other spectra from the unmanned aerial vehicle may be analyzed to detect and/or identify the presence of the unmanned aerial vehicle. In some cases, the detector may be an infrared detector, an ultraviolet detector, a microwave detector, a radar, or any other type of device that can detect an electromagnetic signal. The detector may be capable of detecting the unmanned aerial vehicle when the unmanned aerial vehicle enters within a predetermined range of the geofence device. In some cases, multiple sensors may be provided. Multiple sensors may have different fields of view. In some cases, an electromagnetic image or signature of the unmanned aerial vehicle may be detected. The image or signature may be analyzed to detect the presence or absence of the unmanned aerial vehicle. The image or signature can be analyzed to estimate the distance of the UAV from the geofencing device. The image or signature may be analyzed to detect the type of unmanned aerial vehicle. For example, different models of unmanned aerial vehicles may be distinguished. In one example, a first UAV model type may have a different thermal signature (heat signature) than a second UAV model type.
The geofencing device may detect the unmanned aerial vehicle by identifying the unmanned aerial vehicle via acoustic information (e.g., sound). For example, the geo-fencing device may acoustically detect and/or identify the presence of an unmanned aerial vehicle. In some cases, the detector may include a microphone, sonar, ultrasonic sensor, vibration sensor, and/or any other type of acoustic sensor. The detector may be capable of detecting the unmanned aerial vehicle when the unmanned aerial vehicle comes within a predetermined range of the geofence device. The detector may comprise a plurality of sensors. Multiple sensors may have different fields of view. The sensors may capture acoustic signatures of the unmanned aerial vehicle. The acoustic signature may be analyzed to detect the UAV. The acoustic signature may be analyzed to detect the presence or absence of the UAV. The acoustic features can be analyzed to determine an estimated distance of the UAV from the geofencing device. The acoustic signature may be analyzed to detect the type of UAV. For example, different models of unmanned aerial vehicles may be distinguished. In one example, the first UAV model type may have acoustic characteristics that are different from the second UAV model type.
The geofencing device may identify an approaching UAV by monitoring one or more wireless signals from the UAV. The unmanned aerial vehicle can optionally broadcast a wireless signal that is detectable by the geofencing device when the unmanned aerial vehicle enters within range. The detector of the geofencing device may be a receiver of wireless signals from the unmanned aerial vehicle. The detector may optionally be a communication unit of the geofencing device. The same communication unit may be used to transmit signals and detect wireless communications from the UAV. Alternatively, a different communication unit may be used to transmit the signal and detect the wireless communication from the unmanned aerial vehicle. The wireless data captured by the detector may be analyzed to detect the presence or absence of the unmanned aerial vehicle. The wireless data can be analyzed to estimate the distance of the UAV from the geofencing device. For example, the time difference or signal strength may be analyzed to estimate the distance of the unmanned aerial vehicle from the geofencing device. The wireless data may be analyzed to detect the type of UAV. In some cases, the wireless data may include data identifying the unmanned aerial vehicle, such as an unmanned aerial vehicle identifier and/or an unmanned aerial vehicle type.
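Where signal strength is used to estimate distance, one common approach (assumed here for illustration; the reference power and path-loss exponent are not taken from the disclosure) is the log-distance path-loss model:

def distance_from_rssi(rssi_dbm, rssi_at_1m_dbm=-40.0, path_loss_exponent=2.7):
    """Estimate distance in meters from received signal strength using the
    log-distance path-loss model: RSSI(d) = RSSI(1 m) - 10 * n * log10(d)."""
    return 10 ** ((rssi_at_1m_dbm - rssi_dbm) / (10.0 * path_loss_exponent))

for rssi in (-40.0, -67.0, -80.0):
    print(f"RSSI {rssi:6.1f} dBm -> roughly {distance_from_rssi(rssi):6.1f} m")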
In some cases, the geofencing device may detect the unmanned aerial vehicle based on information from the air traffic control system or any other component of the authentication system. For example, when the air traffic control system detects that the unmanned aerial vehicle is near the geofencing device, the air traffic control system may track the location of the unmanned aerial vehicle and may send a signal to the geofencing device. In other cases, the air traffic control system may send location information about the unmanned aerial vehicle to the geo-fencing device, and the geo-fencing device may make the determination that the unmanned aerial vehicle is proximate to the geo-fencing device. In some embodiments, the detector may be a communication unit that may receive data from the air traffic control system.
Unmanned aerial vehicle 2030 may or may not emit any information about itself. In some cases, the unmanned aerial vehicle may emit wireless communications. The wireless communication may be detected by a detector on the geo-fencing device. The wireless communication may include information broadcast by the UAV. The information broadcast by the unmanned aerial vehicle may announce the presence of the unmanned aerial vehicle. Additional information regarding the identity of the UAV may or may not be provided. In some embodiments, the information regarding the identity of the unmanned aerial vehicle may include an unmanned aerial vehicle identifier. The information may include information about the type of unmanned aerial vehicle. The information may include position information of the unmanned aerial vehicle. For example, the unmanned aerial vehicle may broadcast its current global coordinates. The UAV may broadcast any other attributes, such as a parameter of the UAV or a type of the UAV.
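An illustrative sketch of the kind of announcement a UAV might broadcast, with invented field names and an assumed JSON encoding:

import json
from dataclasses import dataclass, asdict

@dataclass
class UavAnnouncement:
    """Fields a UAV might broadcast to announce itself to nearby geofence devices."""
    uav_id: str        # unmanned aerial vehicle identifier
    uav_type: str      # model / type of unmanned aerial vehicle
    lat: float         # current global coordinates
    lon: float
    altitude_m: float

def encode(announcement: UavAnnouncement) -> bytes:
    """Serialize the announcement for wireless broadcast."""
    return json.dumps(asdict(announcement)).encode("utf-8")

message = UavAnnouncement("UAV-000123", "quadrotor-model-x", 22.5460, 114.0590, 85.0)
print(encode(message))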
In some implementations, the unmanned aerial vehicle can establish communication with the geofencing device and an exchange of information can occur. The communication may include one-way communication or two-way communication. The communication may include information about the identity of the unmanned aerial vehicle, the identity of the geofencing device, the type of unmanned aerial vehicle, the type of geofencing device, the location of the unmanned aerial vehicle, the location of the geofencing device, the geofence boundary, flight restrictions, or any other type of information.
The geofencing device may become aware of the presence of the unmanned aerial vehicle through a detector on the geofencing device. The geofencing device may also become aware of the presence of the UAV through information from other devices. For example, the air traffic control system (e.g., its geofence module or violation countermeasure module), an authentication center, another geofencing device, or another unmanned aerial vehicle can provide information about the unmanned aerial vehicle's presence to the geofencing device.
The detector of the geofencing device may be configured to detect the presence of the unmanned aerial vehicle within a predetermined range of the geofencing device. In some implementations, the detector may also detect the presence of the UAV outside of the predetermined range. The detector may have a very high probability of detecting the presence of the unmanned aerial vehicle when the unmanned aerial vehicle is within the predetermined range. For example, the detector may have a probability of detecting the unmanned aerial vehicle greater than 80%, 90%, 95%, 97%, 99%, 99.5%, 99.7%, 99.9%, or 99.99% when the unmanned aerial vehicle is within the predetermined range of the geofencing device. The predetermined range of the geofencing device may be defined as the region within a predetermined distance from the geofencing device. The predetermined range can have a circular, cylindrical, hemispherical, or spherical shape with respect to the geofencing device, or alternatively any other shape relative to the geofencing device. The geofencing device may be provided at the center of the predetermined range, or may be offset from the center of the predetermined range.
The predetermined range of the geo-fencing device can comprise any magnitude of distance. For example, the predetermined range of the geo-fence device may be within 1 meter, 3 meters, 5 meters, 10 meters, 15 meters, 20 meters, 25 meters, 30 meters, 40 meters, 50 meters, 70 meters, 100 meters, 120 meters, 150 meters, 200 meters, 300 meters, 500 meters, 750 meters, 1000 meters, 1500 meters, 2000 meters, 2500 meters, 3000 meters, 4000 meters, 5000 meters, 7000 meters, or 10000 meters.
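To illustrate how a predetermined range of a given shape and magnitude might be tested, the sketch below checks a UAV position (in local metres relative to the geofencing device) against cylindrical, spherical, and hemispherical ranges; the `Range` class and shape names are illustrative assumptions.

```python
# Minimal sketch: test whether a UAV position falls inside a geofencing
# device's predetermined range for a few of the shapes mentioned above.
# Coordinates are local metres relative to the device.
from dataclasses import dataclass
import math

@dataclass
class Range:
    shape: str                        # "cylinder", "sphere", or "hemisphere"
    radius_m: float
    height_m: float = float("inf")    # only used for the cylinder

def uav_in_range(dx: float, dy: float, dz: float, rng: Range) -> bool:
    horizontal = math.hypot(dx, dy)
    if rng.shape == "cylinder":
        return horizontal <= rng.radius_m and 0.0 <= dz <= rng.height_m
    if rng.shape == "sphere":
        return math.sqrt(dx * dx + dy * dy + dz * dz) <= rng.radius_m
    if rng.shape == "hemisphere":
        return dz >= 0.0 and math.sqrt(dx * dx + dy * dy + dz * dz) <= rng.radius_m
    raise ValueError(f"unknown shape: {rng.shape}")

# Example: a UAV 50 m out and 10 m up is inside a 100 m x 120 m cylinder.
print(uav_in_range(30.0, 40.0, 10.0, Range("cylinder", radius_m=100.0, height_m=120.0)))
```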
The communication unit of the geofencing device may be configured to transmit information to the unmanned aerial vehicle within a predetermined range of the geofencing device. The communication unit may be configured to continuously transmit information, periodically transmit information, transmit information according to a schedule, or transmit information upon detection of an event or condition. The transmitted information may be broadcast so that it may be received by the UAV. Other devices may also receive information if they are within a predetermined range. Alternatively, only selected devices may receive information even when they are within range. In some implementations, the communication unit may transmit information to the UAV outside of a predetermined range. When the unmanned aerial vehicle is within the predetermined range, the communication unit may have a very high probability of communication reaching the unmanned aerial vehicle. The communication unit may have a likelihood of successfully transmitting information to the unmanned aerial vehicle greater than 80%, 90%, 95%, 97%, 99%, 99.5%, 99.7%, 99.9%, or 99.99% when the unmanned aerial vehicle is within the predetermined range of the geofencing device.
When the presence of the unmanned aerial vehicle is detected, the communication unit of the geo-fence device may be configured to transmit information within a predetermined range of the geo-fence device. Detecting the presence of the UAV may be an event or condition that may initiate the transmission of information from the geo-fencing device. The information may be transmitted once or continuously after the presence of the unmanned aerial vehicle is detected. In some cases, information may be continuously or periodically transmitted to the unmanned aerial vehicle while the unmanned aerial vehicle remains within the predetermined range of the geo-fencing device.
In some embodiments, the information transmitted to the UAV may include a set of flight restrictions. The set of flight restrictions can be generated at the geofencing device. The set may be generated by selecting from a plurality of sets of flight restrictions, or it can be generated from scratch at the geofencing device. The set may also be generated via user input, or may combine features from multiple sets of flight restrictions.
The set of flight restrictions may be generated based on information about the unmanned aerial vehicle. For example, a set of flight restrictions may be generated based on the type of unmanned aerial vehicle, or selected from a plurality of sets of flight restrictions based on the type of unmanned aerial vehicle. A set of flight restrictions may likewise be generated or selected based on the unmanned aerial vehicle identifier. A set of flight restrictions may also be generated based on information about the user. For example, a set of flight restrictions may be generated or selected based on the user type, or generated or selected based on the user identifier. Any other technique for generating flight restrictions may be utilized.
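A minimal sketch of selecting a set of flight restrictions from multiple stored sets based on UAV type and user type might look as follows; the keys, restriction fields, and default behavior are illustrative assumptions.

```python
# Minimal sketch: select one set of flight restrictions from several stored
# sets using the UAV type and user type, falling back to a conservative default.

RESTRICTION_SETS = {
    ("consumer", "recreational"): {"max_altitude_m": 60, "entry_allowed": False},
    ("consumer", "certified"):    {"max_altitude_m": 120, "entry_allowed": True},
    ("industrial", "certified"):  {"max_altitude_m": 150, "entry_allowed": True},
}
DEFAULT_SET = {"max_altitude_m": 0, "entry_allowed": False}   # most conservative

def select_flight_restrictions(uav_type: str, user_type: str) -> dict:
    """Pick the stored set matching this UAV/user combination, else the default."""
    return RESTRICTION_SETS.get((uav_type, user_type), DEFAULT_SET)

print(select_flight_restrictions("consumer", "recreational"))
```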
The geo-fence device may be configured to receive an unmanned aerial vehicle identifier and/or a user identifier. The UAV identifier may uniquely identify the UAV from other UAVs. The user identifier may uniquely identify the user from among other users. The identity of the unmanned aerial vehicle and/or the identity of the user may have been authenticated. The communications module of the geo-fence device can receive the unmanned aerial vehicle identifier and/or the user identifier.
The communication module may be capable of changing a communication mode when the unmanned aerial vehicle enters the predetermined range of the geofencing device. The communication module can be a communication module of the geofencing device, or alternatively a communication module of the unmanned aerial vehicle. The communication module may operate in a first communication mode before the unmanned aerial vehicle enters the predetermined range of the geofencing device, and may switch to a second communication mode when the unmanned aerial vehicle enters the predetermined range. In some embodiments, the first communication mode is an indirect communication mode and the second communication mode is a direct communication mode. For example, when the unmanned aerial vehicle is within the predetermined range of the geofencing device, the unmanned aerial vehicle may communicate with the geofencing device via a direct communication mode, and when it is outside the predetermined range, via an indirect communication mode. In some embodiments, two-way communication may be established between the unmanned aerial vehicle and the geofencing device; alternatively, two-way communication may be established only when the unmanned aerial vehicle is within the predetermined range of the geofencing device. The communication module may transmit information when the unmanned aerial vehicle is within the predetermined range of the geofencing device, and optionally not when the unmanned aerial vehicle is outside the predetermined range. Similarly, the communication module may receive information when the unmanned aerial vehicle is within the predetermined range of the geofencing device and, optionally, not when the unmanned aerial vehicle is outside the predetermined range.
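The mode switch described above could be sketched as follows, with an indirect mode used outside the predetermined range and a direct mode inside it; the class name, mode labels, and threshold handling are illustrative assumptions.

```python
# Minimal sketch: switch the communication module between an indirect mode
# (relayed, e.g. via the air traffic control system) and a direct mode when
# the UAV crosses the device's predetermined range.

class CommunicationModule:
    def __init__(self, predetermined_range_m: float):
        self.predetermined_range_m = predetermined_range_m
        self.mode = "indirect"           # first communication mode

    def update(self, distance_to_device_m: float) -> str:
        """Re-evaluate the mode from the current UAV-to-device distance."""
        in_range = distance_to_device_m <= self.predetermined_range_m
        self.mode = "direct" if in_range else "indirect"
        return self.mode

comms = CommunicationModule(predetermined_range_m=500.0)
print(comms.update(800.0))   # indirect: still outside the predetermined range
print(comms.update(300.0))   # direct: UAV has entered the predetermined range
```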
The one or more processors 2046 of the geofencing device can be individually or collectively configured to generate a set of flight restrictions. The set of flight restrictions can be generated using information about a plurality of sets of flight restrictions that can be stored on the geofencing device. The set may be generated by a processor, which may select a set of flight restrictions from the plurality of sets stored in one or more memory units 2042, or may generate the set by combining flight restrictions from the plurality of stored sets. Alternatively, the processor may use information regarding a plurality of sets of flight restrictions stored outside of the geofencing device to generate the set. In some cases, such information can be pulled from outside the geofencing device and received at the geofencing device. The geofencing device may store the pulled information permanently or temporarily. The pulled information may be stored in short-term memory; in some cases, it may be temporarily stored by buffering the received information.
The geofencing device may or may not be detectable by the UAV. In some cases, the geofencing device may include an indicator detectable by the unmanned aerial vehicle. The indicator may be a visual marker, an infrared marker, an ultraviolet marker, an acoustic marker, a wireless signal, or any other type of marker that may be detected by the UAV. Detection of the geofencing device by the UAV is described in greater detail elsewhere herein. The unmanned aerial vehicle may be capable of receiving a set of flight restrictions without detecting the geofencing device; a geofencing device may detect an unmanned aerial vehicle and push a set of flight restrictions or any other geofencing data to the unmanned aerial vehicle without the unmanned aerial vehicle detecting the geofencing device.
The set of flight restrictions can include a set of one or more geofence boundaries. A geofence boundary may be used to contain the unmanned aerial vehicle or to prevent its entry. For example, the set of flight restrictions may include one or more boundaries within which the UAV is permitted to fly, and optionally outside of which the UAV is not allowed to fly. Alternatively, the set of flight restrictions may include a set of one or more boundaries within which the UAV is not permitted to fly. The set of flight restrictions may or may not impose altitude limits. In some embodiments, the set of flight restrictions may include an upper altitude limit above which the UAV is not allowed to fly, and/or a lower altitude limit below which the UAV is not permitted to fly.
The set of flight restrictions may include conditions under which the UAV is not allowed to operate a payload of the UAV. For example, the payload of the UAV may be an image capture device, and the flight restrictions may include conditions under which the UAV is not allowed to capture images. The condition may be based on whether the unmanned aerial vehicle is located within or outside of a geofence boundary. The set of flight restrictions may also include conditions under which the UAV is not allowed to communicate under one or more wireless conditions. The wireless conditions may include one or more selected frequencies, bandwidths, or protocols. Again, the condition may be based on whether the unmanned aerial vehicle is located within or outside of a geofence boundary.
The set of flight restrictions may include one or more restrictions on items carried by the UAV. For example, a limit may be imposed on the number of items, the size of the items, the weight of the items, or the type of items. The condition may be based on whether the unmanned aerial vehicle is located within or outside of a geofence boundary.
The set of flight restrictions may include a minimum remaining battery capacity required for operation of the UAV. The battery capacity may be expressed as a state of charge, a remaining flight time, a remaining flight distance, energy efficiency, or any other factor. The condition may be based on whether the unmanned aerial vehicle is located within or outside of a geofence boundary.
The set of flight restrictions may include one or more restrictions on landing the UAV. The restrictions may specify a landing procedure to be implemented by the unmanned aerial vehicle, or whether the unmanned aerial vehicle may land at all. The condition may be based on whether the unmanned aerial vehicle is located within or outside of a geofence boundary.
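Taken together, the restriction types described above could be carried in a single data structure such as the following sketch; the field names and example values are illustrative assumptions, not a required format of this disclosure.

```python
# Minimal sketch: a container for the kinds of restrictions described above
# (boundary, altitude, payload, wireless, carried items, battery, landing).
from dataclasses import dataclass
from typing import Optional

@dataclass
class FlightRestrictionSet:
    boundary_radius_m: float                  # geofence boundary around the device
    entry_allowed: bool                       # may the UAV fly inside the boundary?
    max_altitude_m: Optional[float] = None    # upper altitude limit, if any
    min_altitude_m: Optional[float] = None    # lower altitude limit, if any
    image_capture_allowed: bool = True        # payload (camera) operation
    blocked_frequencies_mhz: tuple = ()       # wireless conditions the UAV must not use
    max_payload_weight_kg: Optional[float] = None   # restriction on carried items
    min_remaining_battery_pct: float = 0.0    # minimum battery capacity for operation
    landing_allowed: bool = True              # landing restriction

# Example: a keep-out zone that also forbids image capture and requires 20% battery.
no_fly_park = FlightRestrictionSet(
    boundary_radius_m=300.0, entry_allowed=False,
    image_capture_allowed=False, min_remaining_battery_pct=20.0)
print(no_fly_park)
```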
Any other type of flight restriction may be provided, as described in more detail elsewhere herein. One or more sets of flight restrictions can be associated with the predetermined range of the geofencing device. One or more sets of flight restrictions may also be associated with one or more geofence boundaries within the predetermined range of the geofencing device. In some embodiments, the predetermined range may indicate a range for detecting and/or communicating with the unmanned aerial vehicle, while the geofence boundary may delineate where different operations are or are not permitted for the unmanned aerial vehicle. Different rules may apply inside and outside the geofence boundary. In some cases, the predetermined range itself is not used to delineate operations. The geofence boundary may coincide with the predetermined range, or it may differ from the predetermined range; for example, the geofence boundary can fall within the predetermined range. In some cases, a buffer may be provided between the predetermined range and the geofence boundary. The buffer may help ensure that the UAV receives the set of flight restrictions before reaching the boundary.
In one example, the unmanned aerial vehicle can receive a set of flight restrictions when the unmanned aerial vehicle is within the predetermined range of the geofence device. From the set of flight restrictions, the unmanned aerial vehicle can determine that a geofence boundary of the device lies ahead and that the unmanned aerial vehicle is not permitted to enter within the geofence boundary. The unmanned aerial vehicle can make this determination before it reaches the geofence boundary, while it crosses the geofence boundary, or shortly after it crosses the geofence boundary. The set of flight restrictions sent to the UAV may include instructions to prevent the UAV from entering the one or more geofence boundaries. The unmanned aerial vehicle may then take a flight response measure. For example, the unmanned aerial vehicle flight path can be automatically controlled to avoid the one or more geofence boundaries. The unmanned aerial vehicle can be automatically forced to land when it enters the one or more geofence boundaries. Alternatively, when the unmanned aerial vehicle enters the one or more geofence boundaries, its flight path can be automatically controlled to cause it to exit the area enclosed by the one or more geofence boundaries.
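A minimal sketch of choosing a flight response measure from the determinations above might look like this; the measure names, the 50 m threshold, and the priority order are illustrative assumptions.

```python
# Minimal sketch: pick a flight response measure once the UAV determines it is
# approaching or has crossed a boundary it may not enter.

def flight_response(distance_to_boundary_m: float, inside_boundary: bool) -> str:
    """Return the response the flight controller should apply."""
    if inside_boundary:
        # Already over the line: steer back out of the enclosed area
        # (forced landing would be an alternative measure).
        return "exit_enclosed_area"
    if distance_to_boundary_m < 50.0:
        # Close to the boundary: adjust the flight path to avoid it.
        return "avoid_boundary"
    return "continue"

print(flight_response(distance_to_boundary_m=30.0, inside_boundary=False))  # avoid_boundary
print(flight_response(distance_to_boundary_m=0.0, inside_boundary=True))    # exit_enclosed_area
```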
Fig. 21 illustrates a system in which an air traffic control system can communicate with a geofencing device and/or an unmanned aerial vehicle. A geofencing device 2110 and an unmanned aerial vehicle 2120 may be provided within the system. An air traffic control system 2130 may also be provided. The geofencing device may provide a spatial reference for one or more geofence boundaries 2115. The geofencing device may include a communication unit 2140 and a detector 2142. The air traffic control system may include one or more processors 2150 and a memory unit 2152.
The unmanned aerial vehicle 2120 may be detected by the detector 2142 of the geofence device. The unmanned aerial vehicle may be detected using any of the techniques as described elsewhere herein. The unmanned aerial vehicle may be detected when the unmanned aerial vehicle enters a predetermined range. When the unmanned aerial vehicle enters the predetermined range, there may be a high probability that the unmanned aerial vehicle is detected. In some cases, the unmanned aerial vehicle may be detected before entering the predetermined range. When an unmanned aerial vehicle enters a predetermined range, a set of flight controls may be provided to the unmanned aerial vehicle.
A set of flight restrictions may be generated at the air traffic control system 2130. The set of flight restrictions may be generated by selecting a set of flight restrictions for the unmanned aerial vehicle from a plurality of available sets of flight restrictions. The multiple sets of flight restrictions may be stored in memory 2152. The one or more processors 2150 of the air traffic control system may select the set of flight restrictions from the plurality of sets. The air traffic control system may employ any other flight restriction generation technique, including those described elsewhere herein.
In some cases, geofencing device 2110 can transmit a signal to the air traffic control system 2130 that can trigger generation of a set of flight restrictions at the air traffic control system. The signal may be transmitted by means of the communication unit 2140 of the geofencing device. The geofencing device can transmit the signal to the air traffic control system when the geofencing device detects that the unmanned aerial vehicle is within the predetermined range. Detection of the unmanned aerial vehicle crossing into the predetermined range may trigger transmission of instructions from the geofencing device to the air traffic control system to generate a set of flight restrictions for the unmanned aerial vehicle.
In some implementations, the geofencing device may be capable of detecting information about the unmanned aerial vehicle and/or the user. The geofencing device may be capable of detecting an unmanned aerial vehicle identifier and/or a user identifier, and of determining an unmanned aerial vehicle type or a user type. The geofencing device can transmit information about the UAV and/or the user to the air traffic control system. For example, the geofencing device may transmit information about the type of unmanned aerial vehicle or the type of user, or the unmanned aerial vehicle identifier and/or the user identifier, to the air traffic control system. The geofencing device can also transmit additional information to the air traffic control system, such as environmental conditions or any other data captured or received by the geofencing device.
The air traffic control system can optionally use the information from the geofencing device to generate a set of flight restrictions. For example, the air traffic control system may generate a set of flight restrictions based on the unmanned aerial vehicle or user information, such as the type of UAV, the type of user, the UAV identifier, or the user identifier. In generating the set of flight restrictions, the air traffic control system can use additional data from the geofencing device, such as environmental conditions. For example, a set of flight restrictions can be generated based on a set of environmental conditions detected or communicated by the geofencing device. The geofencing device can have one or more sensors mounted thereon that allow it to detect one or more environmental conditions. In some cases, the air traffic control system may receive data from an additional data source other than the geofencing device, or from a plurality of geofencing devices. The air traffic control system can receive information about environmental conditions from a plurality of data sources, such as multiple geofencing devices, external sensors, or third-party data sources. Any data from the geofencing device or other data sources may be used in generating a set of flight restrictions for the UAV. A set of flight restrictions may be generated based on one or more of: user information, unmanned aerial vehicle information, additional data from geofencing devices, or additional information from other data sources.
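The following sketch illustrates how the air traffic control system might combine the UAV/user information forwarded by the geofencing device with environmental reports from several data sources; the thresholds, field names, and rules are illustrative assumptions.

```python
# Minimal sketch: assemble a restriction set from UAV/user information plus
# environmental readings gathered from several data sources (geofencing
# devices, external sensors, third-party feeds).

def generate_restrictions(uav_type: str, user_type: str,
                          environment_reports: list[dict]) -> dict:
    restrictions = {"entry_allowed": user_type == "certified",
                    "max_altitude_m": 120 if uav_type == "consumer" else 150}
    # Merge environmental conditions reported by the various data sources.
    max_wind = max((r.get("wind_mps", 0.0) for r in environment_reports), default=0.0)
    raining = any(r.get("rain", False) for r in environment_reports)
    if max_wind > 10.0 or raining:
        restrictions["max_altitude_m"] = min(restrictions["max_altitude_m"], 50)
        restrictions["entry_allowed"] = False
    return restrictions

reports = [{"wind_mps": 12.0}, {"rain": False, "wind_mps": 4.0}]
print(generate_restrictions("consumer", "certified", reports))
```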
The unmanned aerial vehicle may or may not send communications directly to the air traffic control system. In some implementations, the unmanned aerial vehicle can transmit the unmanned aerial vehicle and/or user data to the air traffic control system. Alternatively, the UAV does not transmit UAV and/or user data to the air traffic control system.
When the air traffic control system has generated a set of flight restrictions for the unmanned aerial vehicle, it may communicate the set of flight restrictions to the unmanned aerial vehicle. In providing the set of flight restrictions, the air traffic control system may communicate directly or indirectly with the UAV. The air traffic control system can communicate the set of flight restrictions to the unmanned aerial vehicle via the geofencing device. For example, the air traffic control system may generate and communicate a set of flight restrictions to the geofencing device, which may in turn transmit the set of flight restrictions to the UAV.
The UAV may receive the set of flight restrictions quickly. The set of flight restrictions may be received by the unmanned aerial vehicle prior to, concurrently with, or after entering the predetermined range of the geofencing device. The set of flight restrictions may be received by the unmanned aerial vehicle before, concurrently with, or after passing through a geofence boundary of the geofencing device. In some embodiments, the unmanned aerial vehicle can receive the set of flight restrictions within less than about 10 minutes, 5 minutes, 3 minutes, 1 minute, 30 seconds, 15 seconds, 10 seconds, 5 seconds, 3 seconds, 2 seconds, 1 second, 0.5 seconds, or 0.1 seconds of being detected by the detector of the geofencing device. The unmanned aerial vehicle can likewise receive the set of flight restrictions within less than about 10 minutes, 5 minutes, 3 minutes, 1 minute, 30 seconds, 15 seconds, 10 seconds, 5 seconds, 3 seconds, 2 seconds, 1 second, 0.5 seconds, or 0.1 seconds of entering the predetermined range of the geofencing device.
Fig. 22 illustrates a system in which an unmanned aerial vehicle detects a geofencing device, according to an embodiment of the present invention. Geo-fence device 2210 may be detectable by unmanned aerial vehicle 2220. The unmanned aerial vehicle may have a memory unit 2230, a communication unit 2232, a flight controller 2234, and/or one or more sensors 2236.
Geo-fence device 2210 may include an indicator. The indicator of the geo-fence device may be detectable by the unmanned aerial vehicle 2220. The indicator may be a marker that is discernable by one or more sensors 2236 on the UAV. The indicator may be detectable by the UAV when the UAV is in flight. The indicator may be detectable by one or more sensors on the UAV while the UAV is in flight. The indicator may be detectable by the unmanned aerial vehicle before or while the unmanned aerial vehicle enters within the predetermined range of the geofencing device. The indicator can be detectable by the unmanned aerial vehicle prior to the unmanned aerial vehicle entering a geofence boundary of the geofence device. The indicator may be detectable by the unmanned aerial vehicle when the unmanned aerial vehicle enters a geofence boundary of the geofence device.
The indicator may be a wireless signal. The geofencing device may continuously broadcast the wireless signal, or may broadcast it periodically (e.g., at regular or irregular intervals). For example, the geofencing device can broadcast the wireless signal with a period of less than or equal to about 0.01 seconds, 0.05 seconds, 0.1 seconds, 0.5 seconds, 2 seconds, 3 seconds, 5 seconds, 10 seconds, 15 seconds, 30 seconds, 1 minute, 3 minutes, 5 minutes, 10 minutes, or 15 minutes. The geofencing device may broadcast the wireless signal according to a schedule, or in response to an event or condition. For example, the geofencing device may broadcast a wireless signal in response to the detected presence of the unmanned aerial vehicle, or in response to detecting that the unmanned aerial vehicle has crossed within the predetermined range of the geofencing device. The geofencing device may broadcast a wireless signal in response to detecting the UAV before the UAV has crossed into the predetermined range, while the UAV is crossing into the predetermined range, or after the UAV has crossed into the predetermined range. The geofencing device can broadcast a wireless signal to the unmanned aerial vehicle before the unmanned aerial vehicle crosses a geofence boundary of the geofencing device.
The indicator may provide any type of wireless signal. For example, the wireless signal may be a radio signal, a Bluetooth signal, an infrared signal, a UV signal, a visible or optical signal, a WiFi or WiMax signal, or any other type of wireless signal. The wireless signal may be broadcast so that any device in the area may receive and/or detect it. In some cases, the wireless signal may be directed only to the unmanned aerial vehicle. The wireless signal may be detectable by a wireless receiver or sensor on the UAV. In some embodiments, the wireless receiver or sensor may be a communication unit. The same communication unit may be used both to detect the indicator and to provide communication between the UAV and other devices, such as a user terminal. Alternatively, a different communication unit may be used to detect the indicator than to provide communication between the UAV and other devices, such as a user terminal.
The indicator may be a visual marker. The visual marker may be detectable by one or more vision sensors of the unmanned aerial vehicle, and may be visually depicted in an image captured by a camera. The visual marker may comprise an image. The visual marker may be static and may include a still image that does not change over time. The still image may include letters, numbers, icons, shapes, symbols, pictures, 1D, 2D, or 3D barcodes, Quick Response (QR) codes, or any other type of image. The visual marker may instead be dynamic and may include an image that changes over time; it may change continuously, periodically, according to a schedule, or in response to a detected event or condition. The visual marker may be displayed on a screen that may remain static or may change the displayed marker over time. The visual marker can be a sticker provided on a surface of the geofencing device. The visual marker may include one or more lights; the spatial arrangement of the lights and/or the blinking pattern of the lights may be used as part of the marker. The visual marker may have a color. Further description of dynamic markers is provided elsewhere herein. The visual marker may be visually discernible from a distance. The unmanned aerial vehicle may be capable of visually discerning the marker when it enters the predetermined range of the geofencing device, or before it enters the predetermined range. The unmanned aerial vehicle may be able to visually discern the marker prior to entering a geofence boundary of the geofencing device.
The indicator may be an acoustic marker. The acoustic marker may emit a sound, vibration, or other discernible acoustic effect. The acoustic marker may be detected by an acoustic sensor (such as a microphone or other type of acoustic detector) on the UAV. The acoustic marker may emit different tones, pitches, frequencies, harmonics, volumes, or sound or vibration patterns. The acoustic marker may or may not be detectable by the naked human ear, and may or may not be detectable by a typical mammalian ear.
The indicator can indicate the presence of the geo-fencing device. When the unmanned aerial vehicle detects the indicator, the unmanned aerial vehicle may know that the geofencing device may be present. In some cases, the indicator may uniquely identify the geo-fencing device. For example, each geo-fence device may have a different indicator, which may be detectable by the unmanned aerial vehicle to distinguish the geo-fence device from other geo-fence devices. In some cases, a set of flight controls can be generated based on the uniquely identified geo-fencing device. A set of flight controls can be associated with the uniquely identified geo-fencing device.
In some cases, the indicator may indicate a geofence device type. The indicator need not be unique to a particular geofencing device, but can be unique to a particular geofencing device type. Different types of geofencing devices may have different physical characteristics (e.g., model, shape, size, power output, range, battery life, sensors, performance) or may be used to perform different geofencing functions (e.g., keeping the unmanned aerial vehicle out of an area, affecting flight of the unmanned aerial vehicle, affecting payload operation of the unmanned aerial vehicle, affecting communication of the unmanned aerial vehicle, affecting sensors on board the unmanned aerial vehicle, affecting navigation of the unmanned aerial vehicle, affecting power usage of the unmanned aerial vehicle). Different types of geofencing devices may have different security levels or priorities. For example, the rules imposed by a first level of geofencing device may take precedence over the rules imposed by a second level of geofencing device. The geofence device types may include types created by the same manufacturer or designer, or by different manufacturers or designers. In some cases, a set of flight restrictions may be generated based on the identified geofencing device type, and a set of flight restrictions can be associated with the identified geofencing device type.
The indicator can be permanently affixed to the geo-fencing device. The indicator can be integrated into the geo-fencing device. In some cases, the indicator cannot be removed from the geo-fencing device without damaging the geo-fencing device. Alternatively, the indicator can be removable from the geo-fencing device. The indicator can be removed from the geo-fencing device without damaging the geo-fencing device. In some embodiments, the indicator can be the body or housing of the geo-fencing device itself. The fuselage or housing of the geofence device may be identifiable by the unmanned aerial vehicle. For example, the unmanned aerial vehicle can include a camera that can capture an image of the geo-fencing device and can identify the geo-fencing device from its fuselage or housing.
The unmanned aerial vehicle 2220 may be capable of sensing geofence devices. The unmanned aerial vehicle can sense an indicator of the geofencing device. The unmanned aerial vehicle may include: a sensor configured to detect an indicator of a geo-fencing device; and a flight control module configured to generate one or more signals that cause the UAV to operate in accordance with a set of flight controls generated based on the detected indicators of the geofence device. A method of operating an unmanned aerial vehicle may include: detecting, by means of a sensor on the UAV, an indicator of a geofencing device; generating, using a flight control module, one or more signals that cause the UAV to operate in accordance with a set of flight regulations generated based on the detected indicators of the geofencing device.
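A minimal UAV-side sketch of the detect-then-comply flow described above is given below; the `FlightControlModule` class, the hard-coded restriction lookup, and the signal fields are illustrative assumptions.

```python
# Minimal sketch of the UAV-side flow: a sensor reports a detected indicator,
# a restriction set is generated (or fetched) from it, and the flight control
# module produces control signals that respect the set.

class FlightControlModule:
    def signals_for(self, restrictions: dict, planned_altitude_m: float) -> dict:
        ceiling = restrictions.get("max_altitude_m")
        altitude = min(planned_altitude_m, ceiling) if ceiling is not None else planned_altitude_m
        return {"target_altitude_m": altitude,
                "enter_boundary": restrictions.get("entry_allowed", False)}

def on_indicator_detected(indicator_id: str, flight_ctrl: FlightControlModule) -> dict:
    # In this sketch the restriction set is simply associated with the detected
    # indicator; it could equally be requested from the geofencing device or
    # from the air traffic control system.
    restrictions = {"max_altitude_m": 60, "entry_allowed": False}
    return flight_ctrl.signals_for(restrictions, planned_altitude_m=100.0)

print(on_indicator_detected("geofence-042", FlightControlModule()))
```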
The unmanned aerial vehicle can detect the indicator by means of the sensor 2236. The unmanned aerial vehicle may carry one or more types of sensors. In some cases, the unmanned aerial vehicle may be capable of detecting different types of indicators. For example, an unmanned aerial vehicle may encounter a geofencing device having a visible marker and another geofencing device having a wireless signal as an indicator. The unmanned aerial vehicle may be capable of detecting both types of indicators. Alternatively, the unmanned aerial vehicle may seek a particular type of indicator (e.g., only identify the visual marker as an indicator of the geo-fencing device). The unmanned aerial vehicle may carry one or more types of sensors, such as those described elsewhere herein, and may have a communication unit 2232 that may also act as a sensor for an indicator.
When the unmanned aerial vehicle detects a geofencing device, the unmanned aerial vehicle can generate or receive a set of flight restrictions, and may then operate according to the set of flight restrictions. Various types of flight restrictions may be provided, such as those described in more detail elsewhere herein. The set of flight restrictions can include information about geofence boundaries and locations, and the types of unmanned aerial vehicle operations that are allowed or not allowed within or outside of the geofence boundaries. The set of flight restrictions may also include the timing at which rules or restrictions apply and/or any flight responses to be taken by the UAV to comply with the restrictions.
A set of flight controls may be generated on the UAV. The unmanned aerial vehicle can include one or more processors that can perform steps to generate a set of flight controls. The UAV may include a memory 2230 that may store information that may be used to generate a set of flight controls. In one example, the set of flight controls may be generated by selecting the set of flight controls from a plurality of sets of flight controls. The plurality of sets of flight controls may be stored in memory on the UAV.
The unmanned aerial vehicle can detect the presence of the geofencing device. A set of flight restrictions can be generated based on the detected presence of the geo-fencing device. In some cases, the presence of the geofencing device may be sufficient to generate a set of flight controls. The unmanned aerial vehicle and/or the user information may be provided at the unmanned aerial vehicle. In some cases, the unmanned aerial vehicle and/or user information may be used to help generate a set of flight controls. For example, a set of flight restrictions may be generated based on the UAV and/or user information (e.g., UAV identifier, UAV type, user identifier, and/or user type).
The unmanned aerial vehicle can receive other information about the geo-fence device, such as a type of geo-fence device or a unique identifier of the geo-fence device. The information about the geo-fence device can be determined based on an indicator of the geo-fence device. Alternatively, other channels may deliver information about the geo-fencing device. The geofence device information may be used to help generate a set of flight controls. For example, a set of flight restrictions can be generated based on the geofence information (e.g., geofence device identifier, geofence device type). For example, different types of geofencing devices may have different boundary sizes or shapes. Different types of geofencing devices may have different operating rules or restrictions imposed on the unmanned aerial vehicle. In some cases, different geofencing devices of the same type may have the same boundary shape or size and/or the same type of operational rules applied. Alternatively, different flight controls may be imposed even within the same geo-fencing device type.
The UAV may aggregate or receive other information that may be used to generate a set of flight controls. For example, an unmanned aerial vehicle may receive information about environmental conditions. The information regarding the environmental condition may be received from a geo-fencing device, an air traffic management system, one or more external sensors, one or more sensors on an unmanned aerial vehicle, or any other source. A set of flight controls may be generated based on other information, such as environmental conditions.
A set of flight restrictions may also be generated off-board the unmanned aerial vehicle. For example, a set of flight restrictions may be generated at an air traffic control system off-board the unmanned aerial vehicle. The air traffic control system may include one or more processors that may perform steps to generate the set of flight restrictions, and a memory that may store information used to generate it. In one example, the set of flight restrictions may be generated by selecting the set from a plurality of sets of flight restrictions stored in the memory of the air traffic control system.
The unmanned aerial vehicle can detect the presence of the geofencing device. In response to detection of the geofencing device, the unmanned aerial vehicle may send a request for a set of flight restrictions to the air traffic control system. Based on the detected presence of the geofencing device, a set of flight restrictions can be generated at the air traffic control system. In some cases, the presence of the geofencing device may be sufficient to generate a set of flight restrictions. The unmanned aerial vehicle and/or user information may be provided to the air traffic control system from the unmanned aerial vehicle or from any other component of the system. In some cases, the unmanned aerial vehicle and/or user information may be used to help generate the set of flight restrictions. For example, a set of flight restrictions may be generated based on the UAV and/or user information (e.g., UAV identifier, UAV type, user identifier, and/or user type).
The air traffic control system can receive other information about the geofencing device, such as the type of geofencing device or a unique identifier of the geofencing device. The air traffic control system may receive information from the unmanned aerial vehicle that may have been determined based on the indicator of the geofencing device. Alternatively, other channels may deliver information about the geofencing device; for example, the air traffic control system can receive information directly from the geofencing device. The air traffic control system may be capable of comparing the locations of the unmanned aerial vehicle and the geofencing devices to determine which geofencing device(s) the unmanned aerial vehicle may have detected. The geofencing device information may be used to help generate a set of flight restrictions. For example, a set of flight restrictions can be generated based on the geofencing information (e.g., geofencing device identifier, geofencing device type). Different types of geofencing devices may have different boundary sizes or shapes, and may have different operating rules or restrictions imposed on the unmanned aerial vehicle. In some cases, different geofencing devices of the same type may have the same boundary shape or size and/or the same type of operational rules applied. Alternatively, different flight restrictions may be imposed even within the same geofencing device type.
The air traffic control system may aggregate or receive other information that may be used to generate a set of flight restrictions. For example, the air traffic control system may receive information about environmental conditions. Information regarding the environmental conditions may be received from a geofencing device, one or more external sensors, one or more sensors on the unmanned aerial vehicle, or any other source. A set of flight restrictions may be generated based on such other information, for example the environmental conditions.
In another example, a set of flight restrictions can be generated at the geofencing device itself. The geofencing device can include one or more processors that can perform the steps to generate the set of flight restrictions, and a memory that can store information used to generate it. In one example, the set of flight restrictions may be generated by selecting a set from a plurality of sets of flight restrictions stored in the memory of the geofencing device.
The unmanned aerial vehicle can detect the presence of the geofencing device. In response to the detection of the geo-fencing device, the unmanned aerial vehicle can send a request for a set of flight restrictions to the geo-fencing device. In response to a request from an unmanned aerial vehicle, a set of flight restrictions can be generated at the geo-fencing device. The unmanned aerial vehicle and/or user information can be provided to the geo-fencing device from the unmanned aerial vehicle or from any other component of the system. In some cases, the unmanned aerial vehicle and/or user information may be used to help generate a set of flight controls. For example, a set of flight restrictions may be generated based on the UAV and/or user information (e.g., UAV identifier, UAV type, user identifier, and/or user type).
The geo-fencing device can use information about the geo-fencing device, such as the type of geo-fencing device or a unique identifier of the geo-fencing device. A geo-fencing device can store information thereon about the geo-fencing device. The geofence device information may be used to help generate a set of flight controls. For example, a set of flight restrictions can be generated based on the geofence information (e.g., geofence device identifier, geofence device type). For example, different types of geofencing devices may have different boundary sizes or shapes. Different types of geofencing devices may have different operating rules or restrictions imposed on the unmanned aerial vehicle. In some cases, different geofencing devices of the same type may have the same boundary shape or size and/or the same type of operational rules applied. Alternatively, different flight controls may be imposed even within the same geo-fencing device type.
The geofencing device can aggregate or receive other information that can be used to generate a set of flight controls. For example, the geo-fencing device may receive information about environmental conditions. Information regarding the environmental condition may be received from other geo-fencing devices, one or more external sensors, one or more sensors on the geo-fencing device, an unmanned aerial vehicle, or any other source. A set of flight controls may be generated based on other information, such as environmental conditions.
In some embodiments, a geo-fencing device can comprise: a receiver configured to receive data useful in determining a set of flight controls; one or more processors individually or collectively configured to: determining the set of flight controls based on data received by the receiver; and one or more transmitters configured to transmit signals that cause the unmanned aerial vehicle to fly in accordance with the set of flight controls. Aspects of the invention may relate to a method of controlling flight of an unmanned aerial vehicle, the method comprising: receiving, using a receiver of a geofencing device, data that facilitates determining a set of flight controls; determining, with the aid of one or more processors, the set of flight controls based on data received by the receiver; and transmitting, by way of one or more transmitters of the geofencing device, a signal that causes the UAV to fly in accordance with the set of flight regulations. The receiver may be an input element that collects data. The transmitter may be an output element that outputs a signal to the UAV.
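The receive/determine/transmit structure described above could be sketched as a simple pipeline, as follows; representing the receiver and transmitters as plain callables is an illustrative simplification, not the claimed apparatus.

```python
# Minimal sketch of the receive/determine/transmit pipeline: gather data,
# determine a set of flight restrictions from it, and send out a signal that
# the UAV can follow.

def run_geofencing_cycle(receive, determine_restrictions, transmit):
    """One pass: gather data, determine a restriction set, send it out."""
    data = receive()                          # sensed or communicated input data
    restrictions = determine_restrictions(data)
    transmit({"type": "flight_restrictions", "payload": restrictions})
    return restrictions

restrictions = run_geofencing_cycle(
    receive=lambda: {"uav_type": "consumer", "wind_mps": 3.0},
    determine_restrictions=lambda d: {"entry_allowed": d["wind_mps"] < 10.0},
    transmit=lambda signal: print("sending:", signal),
)
```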
In some embodiments, the receiver may be a sensor. The data received by the receiver may be sensed data indicative of one or more environmental conditions of the geofencing device. The geofencing device may include any type of sensor, such as a vision sensor, a GPS sensor, an IMU sensor, a magnetometer, an acoustic sensor, an infrared sensor, an ultrasonic sensor, or any other type of sensor described elsewhere herein, including sensors described within the context of being carried by an unmanned aerial vehicle. The one or more environmental conditions may include any type of environmental condition, such as those described elsewhere herein. The sensors may be capable of detecting data regarding ambient climate (e.g., temperature, wind, precipitation, insolation, humidity), environmental complexity, population density, or traffic flow (e.g., surface traffic flow or air traffic flow near the geofencing device). Environmental conditions may be considered when generating a set of flight restrictions. For example, the flight restrictions may differ when it is raining compared to when it is not, or when the sensors sense a great deal of movement (e.g., high traffic flow) around the geofencing device compared to little or no movement.
The data received by the receiver may be sensed data indicative of one or more wireless or communication conditions of the geo-fencing device. For example, the geo-fenced devices may be surrounded by different wireless networks or hotspots. When generating a set of flight controls, wireless conditions or communication conditions may be considered.
The receiver may be a detector configured to detect the presence of the unmanned aerial vehicle. The data received by the receiver may indicate the presence of the unmanned aerial vehicle. The detector may be capable of identifying the UAV as an UAV as compared to other objects that may be located within an environment. The detector may be capable of detecting the identity of the unmanned aerial vehicle and/or the type of unmanned aerial vehicle. In some cases, the data received by the receiver may indicate a type of the unmanned aerial vehicle and/or an identifier of the unmanned aerial vehicle. The detector may be capable of detecting a position of the unmanned aerial vehicle. The detector may be capable of detecting a distance of the unmanned aerial vehicle relative to the detector. The detector may be capable of detecting an orientation of the UAV relative to the detector. The detector may be capable of determining an orientation of the unmanned aerial vehicle. The detector may be capable of detecting a position of the unmanned aerial vehicle relative to the detector and/or relative to a global environment.
The receiver may be a communication module configured to receive wireless signals. The data received by the communication module may be user input. A user can manually input data directly into the communication module, or may interact with a remote user terminal that sends signals indicative of the user input to the communication module. The data may include information from one or more surrounding geofencing devices; the geofencing devices may communicate with each other and share information. The data may also include information from the air traffic control system or any other part of the authentication system.
A set of flight restrictions may be determined based on the data received by the receiver. The set of flight restrictions can include one or more geofence boundaries to which one or more flight restrictions apply. The geofence boundaries may be determined based on the data received by the receiver. Thus, different geofence boundaries may be provided under different circumstances for the same geofencing device. For example, the geofencing device may generate a first set of boundaries when a first type of unmanned aerial vehicle is detected, and a second set of boundaries when a second type of unmanned aerial vehicle is detected. The first type of unmanned aerial vehicle and the second type of unmanned aerial vehicle can be simultaneously within range of the geofencing device, and each can have a different boundary imposed on it. In another example, the geofencing device may generate a first set of boundaries when the environmental conditions indicate that the wind speed is high, and a second set of boundaries when the environmental conditions indicate that the wind speed is low. The geofence boundaries may be determined based on any data received by the receiver, including but not limited to unmanned aerial vehicle information, user information, environmental information, or shared information.
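As an illustration of such condition-dependent boundaries, the sketch below derives different boundary radii for different UAV types and wind speeds; the specific radii and the wind threshold are assumed values.

```python
# Minimal sketch: the same geofencing device produces different boundary radii
# depending on the detected UAV type and the current wind speed.

def geofence_boundary_radius_m(uav_type: str, wind_mps: float) -> float:
    base = 200.0 if uav_type == "consumer" else 100.0     # per-type boundary
    if wind_mps > 10.0:
        base *= 1.5          # widen the keep-out boundary in high wind
    return base

# Two UAV types simultaneously in range can be given different boundaries.
print(geofence_boundary_radius_m("consumer", wind_mps=12.0))    # 300.0
print(geofence_boundary_radius_m("industrial", wind_mps=12.0))  # 150.0
```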
A set of flight restrictions may include one or more types of flight restrictions. Flight restrictions may apply to regions within or outside of the geofence boundary. Flight restrictions may impose limits on one or more aspects of unmanned aerial vehicle operation (e.g., flight, takeoff, landing, payload operation, payload positioning, carrier operation, objects that may be carried by the unmanned aerial vehicle, communications, sensors, navigation, and/or electrical usage). In some cases, flight restrictions may be imposed only within the geofence boundary; alternatively, restrictions may be imposed only outside the geofence boundary. Some restrictions may be provided both inside and outside the boundary, and the full set of restrictions within the boundary may differ from the full set outside the boundary. The flight restrictions may be determined based on the data received by the receiver. Thus, different restrictions may be provided in different situations for the same geofencing device. For example, the geofencing device may generate a first set of flight restrictions when a first type of unmanned aerial vehicle is detected, and a second set of restrictions when a second type of unmanned aerial vehicle is detected. The first type of unmanned aerial vehicle and the second type of unmanned aerial vehicle can be simultaneously within range of the geofencing device, and each can have a different set of restrictions imposed on it. In another example, the geofencing device may generate a first set of flight restrictions when it detects a large number of wireless hotspots in an area, and a second set of flight restrictions when it does not.
The geofencing device may transmit a signal that causes the unmanned aerial vehicle to fly in accordance with the set of flight restrictions. The signal may include the set of flight restrictions itself. The geofencing device may determine the set of flight restrictions locally and then transmit the set to the UAV. The set of flight restrictions may be sent directly to the UAV, or may be sent to the air traffic control system, which may relay the set to the UAV. The signal may be sent directly to the unmanned aerial vehicle or to another external device.
In some embodiments, the signal may include a trigger that causes an external device to send a set of flight restrictions to the UAV. An example of an external device may be another geofencing device, the air traffic control system, or any other external device. In some cases, the external device may store a plurality of sets of possible flight restrictions, or may store components that may be used to generate a set of flight restrictions. The geofencing device can transmit a signal that causes the external device to generate a set of flight restrictions based on the determination of the geofencing device. The external device may then deliver the generated set of flight restrictions to the UAV. The signal may be sent directly to the external device.
The signal may include an identifier that causes the UAV to select the determined set of flight restrictions from a memory of the UAV. The unmanned aerial vehicle can generate a set of flight restrictions based on the signal from the geofencing device. For example, a UAV may store one or more sets of flight restrictions, or components of flight restrictions, in a memory of the UAV. The geofencing device can transmit a signal that causes the unmanned aerial vehicle to generate a set of flight restrictions in accordance with the determination of the geofencing device. In one example, the signal may include an identifier that indicates the determined set of flight restrictions; the identifier may be unique to a particular set of flight restrictions. The unmanned aerial vehicle may then operate in compliance with the generated set of flight restrictions.
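A minimal sketch of this identifier-based approach, in which the UAV resolves the identifier against restriction sets stored in its own memory and fails safe when the identifier is unknown, might look as follows; the identifiers and stored sets are illustrative assumptions.

```python
# Minimal sketch: the geofencing device sends only an identifier, and the UAV
# looks the corresponding restriction set up in its own memory.

STORED_SETS = {   # restriction sets pre-loaded in UAV memory
    "RS-001": {"entry_allowed": False, "max_altitude_m": 0},
    "RS-002": {"entry_allowed": True, "max_altitude_m": 60},
}
MOST_RESTRICTIVE = STORED_SETS["RS-001"]

def restrictions_from_signal(signal: dict) -> dict:
    """Resolve the set named in the device's signal; fail safe if unknown."""
    return STORED_SETS.get(signal.get("restriction_set_id"), MOST_RESTRICTIVE)

print(restrictions_from_signal({"restriction_set_id": "RS-002"}))
```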
Fig. 23 shows an example of an unmanned aerial vehicle system in which the unmanned aerial vehicle and the geofencing device need not communicate directly with each other. In some cases, the unmanned aerial vehicle may detect the presence of the geofencing device, or vice versa.
The unmanned aerial vehicle system can include a geo-fencing device 2310, an unmanned aerial vehicle 2320, and/or an external device 2340. Unmanned aerial vehicle may include memory unit 2330, sensors 2332, flight controllers 2334, and/or communication unit 2336.
The external device 2340 may be an air traffic control system, an authentication center, or any other part of an authentication system. The external device may be another unmanned aerial vehicle or another geofencing device, or a device separate from the other types of devices mentioned herein. In some cases, the external device may include one or more physical devices; multiple physical devices may communicate with each other. The external device may have a distributed architecture, a cloud computing infrastructure, or a peer-to-peer (P2P) architecture. The external device may communicate with the unmanned aerial vehicle 2320 and may independently communicate with the geofencing device 2310.
In one implementation, the geo-fencing device 2310 may be capable of detecting the presence of an unmanned aerial vehicle. The geofencing device may include a detector that may detect the presence of the unmanned aerial vehicle when the unmanned aerial vehicle is within a predetermined geographic range of the geofencing device. The detector may be any type of detector, such as those described elsewhere herein. For example, the detector may use a visual sensor, radar, or any other detection mechanism as described elsewhere herein. The detector may be configured to detect the unmanned aerial vehicle by means of one or more external devices. For example, additional sensors may be provided separately in the environment. A separate sensor may detect the unmanned aerial vehicle or may assist the detector of the geofencing apparatus in detecting the unmanned aerial vehicle or collecting information about the unmanned aerial vehicle. For example, the individual sensors can include visual sensors, acoustic sensors, infrared sensors, or wireless receivers that can be dispersed throughout the environment occupied by the geo-fencing device. As described elsewhere herein, the detector may detect the unmanned aerial vehicle before the unmanned aerial vehicle enters the predetermined range, or may have a very high likelihood of detecting the unmanned aerial vehicle when the unmanned aerial vehicle is within the predetermined range. The detector may detect the unmanned aerial vehicle before the unmanned aerial vehicle reaches the geofence device boundary.
The detector may be capable of detecting any information about the UAV and/or the user. The detector may detect the unmanned aerial vehicle or the user type. The detector may be capable of determining an unmanned aerial vehicle identifier and/or a user identifier. In some embodiments, the detector may be a communication module. The communication module may receive a communication from the UAV or an external device indicating UAV and/or user information. The unmanned aerial vehicle may broadcast information that may be received by the detector to detect the presence of the unmanned aerial vehicle. The broadcasted information may include information about the UAV, such as the identity of the UAV, the UAV type, the location of the UAV, or an attribute of the UAV.
The geofencing device may include a communication module that may be configured to send a signal that triggers sending a set of flight restrictions to the unmanned aerial vehicle. The communication module may transmit a signal when the unmanned aerial vehicle enters a predetermined geographic range of the geofencing device. The communication module may transmit a signal when the unmanned aerial vehicle is detected by the geo-fencing device. The communication module may transmit a signal before the unmanned aerial vehicle enters the geofence device boundary.
Aspects of the invention may include a geo-fencing apparatus comprising: a detector configured to detect the presence of an unmanned aerial vehicle within a predetermined geographic range of the geofencing device; and a communication module configured to send a signal when the UAV enters the predetermined geographic range of the geofence device, the signal triggering a set of flight restrictions to be sent to the UAV. A further aspect may relate to a method of providing a set of flight restrictions to an unmanned aerial vehicle, the method comprising: detecting, by means of a detector of a geo-fencing device, a presence of an unmanned aerial vehicle within a predetermined geographic range of the geo-fencing device; and transmitting, by means of a communication module of the geo-fencing device, a signal when the UAV enters the predetermined geographic range of the geo-fencing device, the signal triggering a set of flight restrictions to be sent to the UAV.
The signal from the communication module can be sent to any device other than the geo-fencing device. For example, the signal may be sent to the external device 2340. As previously described, the signal may be sent to an air traffic control system, an authentication center, other geo-fencing devices, or other unmanned aerial vehicles. In some cases, the signal may be sent to the unmanned aerial vehicle itself. The external device may receive the signal and may send a set of flight restrictions to the UAV 2320. When the external device receives the signal, the external device may generate a set of flight restrictions. After the set of flight restrictions has been generated, the external device may transmit the set of flight restrictions to the unmanned aerial vehicle.
In some alternative embodiments, the communication module can provide a signal to a component onboard the geo-fencing device. For example, a signal can be provided to one or more processors of the geo-fencing device, which can retrieve a set of flight restrictions from a memory of the geo-fencing device and transmit the set of flight restrictions to the unmanned aerial vehicle. Two-way communication may be provided between the geo-fencing device and the unmanned aerial vehicle.
The communication module can be configured to transmit information within a predetermined geographic range of the geo-fencing device. The communication module can be configured to at least contact devices within the predetermined geographic range of the geo-fencing device. The communication module may be capable of contacting devices outside of the predetermined geographic range of the geo-fencing device. The communication module may issue direct communications that may have a limited range. In some cases, the communication module may not communicate directly and may instead utilize one or more intermediate devices or networks.
The communication module may be configured for continuous transmission of information. The communication module may be configured to transmit information periodically (e.g., at regular or irregular intervals), according to a schedule, or upon detection of an event or condition. For example, the communication module may transmit information upon detecting the presence of the unmanned aerial vehicle. The communication module may transmit information upon detecting that the unmanned aerial vehicle is within a predetermined range of the geofencing device. The communication module may transmit information upon detecting that the UAV is approaching a geofence boundary. The communication module may transmit information upon detection of any condition, such as those described elsewhere herein.
The geo-fencing device 2310 can include a locator configured to provide a location of the geo-fencing device. In some cases, the locator may be a GPS unit. The locator can provide global coordinates of the geo-fencing device. The locator can use one or more sensors to determine the geo-fencing device location. The locator can provide local coordinates of the geo-fencing device. The location of the geofencing device can be included in the transmitted signal that triggers a set of flight controls. In one example, an external device may receive a geo-fencing device location. The external device may be capable of tracking the location of the geo-fencing device and/or any other geo-fencing device in the area.
In some embodiments, a set of flight restrictions may be generated at an external device. A set of flight restrictions may be generated based on the UAV and/or the user information. For example, a set of flight restrictions may be generated using an identity of the unmanned aerial vehicle, a type of the unmanned aerial vehicle, an identity of the user, and/or a type of the user. The information regarding the geo-fencing device can be used to generate the set of flight restrictions. For example, a geo-fencing device identity, type, or location may be used. A set of flight restrictions may be selected from a plurality of sets of flight restrictions based on any type of information described herein.
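As a rough illustration of selecting one set of flight restrictions from a plurality of stored sets, the sketch below keys the selection on the UAV type, user type, and geo-fencing device type described above. The type names and restriction values are hypothetical.

FLIGHT_RESTRICTION_SETS = {
    # (uav_type, user_type, geofence_type) -> restrictions
    ("consumer", "recreational", "airport"): {"entry": "forbidden"},
    ("consumer", "recreational", "school"):  {"max_altitude_m": 60, "camera": "off"},
    ("industrial", "licensed", "airport"):   {"max_altitude_m": 120},
}
DEFAULT_SET = {"max_altitude_m": 120, "camera": "on"}

def select_restrictions(uav_type, user_type, geofence_type):
    """Pick the stored set matching this combination, falling back to a default set."""
    return FLIGHT_RESTRICTION_SETS.get((uav_type, user_type, geofence_type), DEFAULT_SET)

print(select_restrictions("consumer", "recreational", "school"))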
A set of flight restrictions may include any of the characteristics described elsewhere herein. A set of flight restrictions can include geofence boundaries. A set of flight restrictions may include one or more limits on the operation of the UAV.
A set of flight restrictions may be transmitted from an external device to the UAV. In some cases, a set of flight restrictions may be transmitted from the geo-fencing device to the unmanned aerial vehicle. A set of flight restrictions can be transmitted to the unmanned aerial vehicle from the geo-fencing device directly or via an external device.
In another implementation, the unmanned aerial vehicle can receive information regarding a location of the geofencing device. The unmanned aerial vehicle can include a communication unit configured to receive the location of the geo-fencing device; and a flight control module configured to generate one or more signals that cause the UAV to operate in accordance with a set of flight restrictions generated based on the location of the geofence device. Aspects of the invention may include a method of operating an unmanned aerial vehicle, the method comprising: receiving, by means of a communication unit of the UAV, a location of a geofencing device; and generating, by means of a flight control module, one or more signals that cause the UAV to operate in accordance with a set of flight restrictions, the set of flight restrictions generated based on the location of the geofencing device.
The unmanned aerial vehicle can receive information regarding the location of the geofencing device while the unmanned aerial vehicle is in flight. A geo-fencing device can have a locator onboard the geo-fencing device. For example, the locator can be a GPS unit configured to provide global coordinates of the geo-fencing device. Any other locator as described elsewhere herein may be provided on the geo-fencing device.
The unmanned aerial vehicle can receive the information regarding the location of the geofencing device at a communication unit of the unmanned aerial vehicle. The communication unit may be configured to receive the location of the geo-fencing device directly from the geo-fencing device. The communication unit may be configured to receive the location of the geo-fencing device from one or more intermediate devices (i.e., external devices). The one or more intermediate devices may be an air traffic control system off-board the unmanned aerial vehicle.
A set of flight controls may be generated on the UAV. The UAV may store information that may be used to generate a set of flight controls in a memory onboard the UAV. For example, the onboard memory of the UAV may store multiple sets of flight controls. The unmanned aerial vehicle can use the received information about the geo-fence device to generate a set of flight controls. The detection of the geo-fencing device may trigger the generation of a set of flight controls. The information of the geofencing device may or may not affect the generation of a set of flight controls. The unmanned aerial vehicle may then operate according to the generated set of flight controls.
A set of flight restrictions may be generated at an air traffic control system or other external device off-board the unmanned aerial vehicle. The external device may store information that may be used to generate a set of flight restrictions in memory onboard the external device. For example, the onboard memory of the external device may store multiple sets of flight restrictions. The external device can use the received information about the geo-fencing device to generate a set of flight restrictions. The detection of the geo-fencing device may trigger the generation of a set of flight restrictions. The information of the geofencing device may or may not affect the generation of the set of flight restrictions. In some cases, a set of flight restrictions can be generated based on the location of the geo-fencing device. The external device may transmit the generated set of flight restrictions to the unmanned aerial vehicle. The set of flight restrictions may be transmitted directly or indirectly. The unmanned aerial vehicle may have a communication unit that may receive the set of flight restrictions from the external device.
An external device, such as an air traffic control system or other portion of an authentication system, may assist in managing the interaction between the unmanned aerial vehicle and the geofencing device. Any description herein may be applicable to any type of external device. The air traffic control system can receive information from a plurality of unmanned aerial vehicles and a plurality of geo-fencing devices. The air traffic control system may collect information from multiple sources and help manage the flight of the unmanned aerial vehicle. The air traffic control system may push dynamic route information to each unmanned aerial vehicle. The air traffic control system may accept, reject, or alter the proposed route of the unmanned aerial vehicle based on information about other unmanned aerial vehicles and/or the geo-fencing devices. The air traffic control system can use the location information of the various geo-fencing devices to accept, reject, or modify the proposed routes of the unmanned aerial vehicle. The air traffic control system may provide navigation services. The air traffic control system may assist the unmanned aerial vehicle in navigating through the environment. The air traffic control system can assist the unmanned aerial vehicle in navigating in an environment in which one or more geofencing devices can be present.
Figure 41 illustrates an example of an unmanned aerial vehicle system in which an air traffic control system interacts with a plurality of unmanned aerial vehicles and a plurality of geofencing devices in accordance with an embodiment of the present invention. The air traffic control system 4110 can communicate with one or more geo-fencing devices 4120a, 4120b, 4120c, 4120d. The air traffic control system may be in communication with one or more unmanned aerial vehicles 4130a, 4130b, 4130c, 4130d.
In some embodiments, the air traffic control system can be aware of the location of the geofencing device. A geo-fencing device can have a locator that can determine a location of the geo-fencing device. Information from the locator may be transmitted to the air traffic control system. If the location of the geo-fencing device changes, the air traffic control system may be updated.
The air traffic control system may be aware of the location of the UAV. The unmanned aerial vehicle may have a locator that determines the position of the unmanned aerial vehicle. For example, the unmanned aerial vehicle may have a GPS unit, or other sensors may be used to determine the location of the unmanned aerial vehicle. Information regarding the position of the UAV may be communicated to the air traffic control system. If the position of the unmanned aerial vehicle changes, the air traffic control system may be updated.
The air traffic control system can know the locations of the geofencing devices and the UAVs. The locations of the geofencing devices and the unmanned aerial vehicles can be tracked in real time or at a high frequency. The air traffic control system may advantageously collect information from a plurality of geo-fencing devices and a plurality of unmanned aerial vehicles. Thus, the air traffic control system may be able to have a good understanding of the devices within the area. The air traffic control system can know the location of the geofencing device and the UAV without the geofencing device detecting the UAV or vice versa. In some cases, detection between the geo-fencing device and the unmanned aerial vehicle may occur. The air traffic control system may be capable of detecting whether the unmanned aerial vehicle is entering a predetermined range of the geofencing device. The air traffic control system may be capable of detecting whether the unmanned aerial vehicle is approaching a geofence boundary of the geofence device. The air traffic control system may be capable of alerting the unmanned aerial vehicle and/or the geofencing device that the UAV is approaching the geofence device. Alternatively, no warning may be provided.
When the air traffic control system detects that an unmanned aerial vehicle is approaching a geofencing device, the air traffic control system may generate a set of flight restrictions for the unmanned aerial vehicle and transmit it to the unmanned aerial vehicle. The air traffic control system may detect that the UAV is approaching the geofence device by comparing the location data of the UAV to the location data of the geofence device. The real-time location of the unmanned aerial vehicle can be compared to the real-time location of the geofencing device. The coordinates of the unmanned aerial vehicle can be compared to the coordinates of the geofencing device. The positions of the unmanned aerial vehicle and the geofencing device can be compared without detection of the unmanned aerial vehicle by the geofencing device or vice versa. The set of flight restrictions may be customized for or generated based on the geofence device that the UAV is approaching. The UAV may receive the set of flight restrictions and operate in accordance with the set of flight restrictions.
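A minimal sketch of this comparison is given below: an air traffic control system compares each reported UAV position against each reported geo-fencing device position and pushes a set of flight restrictions when the distance falls below an assumed approach threshold. The threshold, function names, and the planar distance stand-in in the example are assumptions.

APPROACH_THRESHOLD_M = 1000.0

def monitor(uav_positions, geofence_positions, distance_fn, send_restrictions):
    """uav_positions / geofence_positions: dicts of id -> (lat, lon)."""
    for uav_id, uav_pos in uav_positions.items():
        for gf_id, gf_pos in geofence_positions.items():
            if distance_fn(uav_pos, gf_pos) <= APPROACH_THRESHOLD_M:
                # The pushed set could be customized for the particular device being approached.
                send_restrictions(uav_id, {"geofence": gf_id, "entry": "forbidden"})

# Example with a planar distance stand-in and a print-based sender:
monitor(
    {"UAV-1": (0.0, 0.0)},
    {"GF-1": (0.0, 0.005)},
    distance_fn=lambda a, b: ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5 * 111000,
    send_restrictions=lambda uav, r: print(uav, "receives", r),
)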
In an alternative implementation, an unmanned aerial vehicle that detects the geofencing device can provide an indication to the air traffic control system that the unmanned aerial vehicle is approaching the geofencing device. The air traffic control system may generate a set of flight restrictions for the unmanned aerial vehicle and transmit it to the unmanned aerial vehicle. The geo-fencing device may be provided at a location to be detected by the UAV but need not have other functionality. Alternatively, the geofencing device may detect that the unmanned aerial vehicle is approaching and may provide an indication to the air traffic control system that the unmanned aerial vehicle is approaching the geofencing device. The air traffic control system may then generate a set of flight restrictions for the unmanned aerial vehicle and transmit it to the unmanned aerial vehicle. In either case, the geo-fencing device may have a unique identifier or may be identifiable by a type, which may have some impact on the type of flight restrictions generated. The geofencing device itself need not perform any additional actions. Thus, in some implementations, the geo-fencing device may advantageously be a relatively simple or cost-effective device.
In a further alternative implementation, rather than generating a set of flight restrictions at the air traffic control system, the air traffic control system provides signals to the unmanned aerial vehicle that cause the unmanned aerial vehicle to generate the set of flight restrictions. The signals from the air traffic control system may determine which set of flight restrictions to generate. The signals from the air traffic control system may correspond to a single set of flight restrictions. When the unmanned aerial vehicle generates the set of flight restrictions, the unmanned aerial vehicle may then act in accordance with the set of flight restrictions. In another example, rather than generating a set of flight restrictions at the air traffic control system, the air traffic control system can provide signals to the geo-fencing device that can cause the geo-fencing device to generate the set of flight restrictions. The signals from the air traffic control system may determine which set of flight restrictions to generate and may correspond to a single set of flight restrictions. The geofencing device may transmit the generated set of flight restrictions to the UAV. When the unmanned aerial vehicle receives the set of flight restrictions, the unmanned aerial vehicle may then act in accordance with the set of flight restrictions. Similarly, the air traffic control system may provide any other type of accessory device with a signal that causes the accessory device to generate a set of flight restrictions. The accessory device may transmit the generated set of flight restrictions to the UAV, which may operate according to the set of flight restrictions.
As previously described, various types of interactions between components of an unmanned aerial vehicle system may allow flight restrictions to be generated and the unmanned aerial vehicle to operate in compliance with the flight restrictions. The flight restrictions can pertain to one or more regions defined by boundaries of the geo-fencing device. In some cases, various components, such as the unmanned aerial vehicle, a user terminal (e.g., remote control), a geo-fencing device, an external storage unit, and/or an air traffic control system (or other external device), may be in communication with or detectable by each other.
In some embodiments, push communication may be provided between any two of the components. Push communications may include communications sent from a first component to a second component, where the first component initiates the communications. A communication may be sent from a first component to a second component without any request for the communication from the second component. For example, the air traffic control system may push communications down to the unmanned aerial vehicle. The air traffic control system may send a set of flight restrictions to the UAV even if the UAV does not request the set of flight restrictions.
Alternatively, pull communication may be provided between any two components. A pull communication may include a communication sent from a first component to a second component, where the second component initiates the communication. The communication may be sent from the first component to the second component in response to a request for the communication from the second component. For example, when an unmanned aerial vehicle requests an update to a set of flight restrictions, the air traffic control system may send the set of flight restrictions to the unmanned aerial vehicle.
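The toy sketch below contrasts the push and pull patterns with an in-memory stand-in for the air traffic control system; the class and method names are assumptions, not an interface defined by the patent.

class AirTrafficControl:
    def __init__(self):
        self.current_restrictions = {"max_altitude_m": 120}

    def push_to(self, uav):           # push: the control system initiates
        uav.receive(self.current_restrictions)

    def handle_pull(self, uav):       # pull: the UAV initiates with a request
        uav.receive(self.current_restrictions)

class UAV:
    def __init__(self):
        self.restrictions = None

    def receive(self, restrictions):
        self.restrictions = restrictions

    def request_update(self, atc):
        atc.handle_pull(self)

atc, uav = AirTrafficControl(), UAV()
atc.push_to(uav)          # push communication: no request from the UAV
uav.request_update(atc)   # pull communication: the UAV requests the update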
The communication between any two components may be automatic. The communication may occur without any instruction or input from the user. The communication may occur automatically in response to a schedule or a detected event or condition. One or more processors may receive the data and may automatically generate instructions for the communication based on the received data. For example, the air traffic control system may automatically send updates to the local navigation map to the unmanned aerial vehicle.
One or more communications between any two components may be manual. The communication may occur upon detection of an instruction or input from a user. The user may initiate the communication. The user may control one or more aspects of the communication, such as content or delivery. For example, the user may command the unmanned aerial vehicle to request an update to the local navigation map from the air traffic control system.
The communication may occur continuously in real time, may occur on a routine basis (e.g., periodically at regular or irregular intervals, or according to a schedule), or may occur in a non-routine manner (e.g., in response to a detected event or condition). The first component can send communications to the second component in a persistent manner, in accordance with a routine, or in a non-routine manner. For example, the geo-fencing device may continuously send information about its location to the air traffic control system. The air traffic control system can know the location of the geo-fencing device in real time. Alternatively, the geo-fencing device can routinely transmit information regarding the location of the geo-fencing device. For example, the geo-fencing device may update the air traffic control system every minute with respect to its location. In another example, the geo-fencing device may send its location to the air traffic control system according to a schedule. The schedule may be updated. For example, on Monday, the geo-fencing device may send updates about its location every minute, while on Tuesday, the geo-fencing device may send updates about its location every 5 minutes. In another example, a geo-fencing device can send information regarding the location of the geo-fencing device in a non-routine manner. For example, the geofencing device may send information about the geofencing device location to the air traffic control system when the device detects that an unmanned aerial vehicle is approaching the device.
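A small sketch of such routine and non-routine reporting is shown below; the Monday/Tuesday intervals mirror the example above, while the default interval and function names are assumptions.

import datetime

UPDATE_INTERVAL_S = {"Monday": 60, "Tuesday": 300}   # seconds between routine updates
DEFAULT_INTERVAL_S = 120

def next_update_interval(now=None):
    """Choose the reporting interval from a per-day schedule."""
    now = now or datetime.datetime.now()
    return UPDATE_INTERVAL_S.get(now.strftime("%A"), DEFAULT_INTERVAL_S)

def should_report(seconds_since_last_report, uav_detected, now=None):
    # Non-routine: report at once if an approaching UAV is detected; otherwise follow the schedule.
    return uav_detected or seconds_since_last_report >= next_update_interval(now)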
Communication between any two components may be direct or indirect. Communication may be provided directly from the first component to the second component without any intermediary. For example, the remote control may send a radio signal from a transmitter that is received directly by a receiver of the unmanned aerial vehicle. Communication may be provided indirectly from the first component to the second component by relaying through an intermediary. An intermediary may be one or more intermediary devices or networks. For example, the remote control may send signals over a telecommunications network to control the operation of the unmanned aerial vehicle, which may include routing via one or more telecommunications towers. In another example, the flight control information may be sent directly to the unmanned aerial vehicle and then indirectly to the user terminal (e.g., via the unmanned aerial vehicle), or may be sent directly to the user terminal and then indirectly to the unmanned aerial vehicle (e.g., via the user terminal).
Any type of communication between any of the components can be provided in various ways, such as those described herein. The communication may include location information, identification information, information for authentication, information regarding environmental conditions, or information related to flight controls. For example, one or more sets of flight controls may be provided via communications, which may be push or pull communications, automatic or manual communications, communications that occur continuously in real-time, communications that occur according to routines or in a non-routine manner, or communications that may occur directly or indirectly.
Regulations determined by the geofencing device
The geo-fencing device may have a boundary. The boundary may define a region in which a set of flight restrictions may be applicable. The region may be located within the geofence boundary. The region may be located outside of the geofence boundary. The boundary may be determined as part of generating the set of flight restrictions. Alternatively, the boundary may be determined independently of the generation of the rest of the set of flight restrictions.
In some implementations, the geofencing device can generate a set of flight controls. The set of flight restrictions generated by the geofencing device can also include an indication of a boundary of the geofencing device. Alternatively, the geofencing device may determine a boundary for the geofencing device regardless of whether the geofencing device generated a set of flight restrictions or whether a different device generated a set of flight restrictions.
Alternatively, the air traffic control system may generate a set of flight restrictions. The set of flight restrictions generated by the air traffic control system can also include an indication of a boundary of the geofencing device. Alternatively, the air traffic control system can determine the boundary for the geo-fencing device regardless of whether the air traffic control system generates the set of flight restrictions or a different device generates the set of flight restrictions. Any description herein of a geofencing device determining flight restrictions or a boundary of the geofencing device may also apply to an air traffic control system determining flight restrictions or a boundary of the geofencing device.
In a further example, the unmanned aerial vehicle may generate a set of flight restrictions. The set of flight restrictions generated by the UAV may also include an indication of a boundary of the geofencing device. Alternatively, the unmanned aerial vehicle can determine a boundary for the geo-fencing device regardless of whether the unmanned aerial vehicle generates the set of flight restrictions or a different device generates the set of flight restrictions. Any description herein of a geofencing device determining flight restrictions or a boundary of the geofencing device may also apply to an unmanned aerial vehicle determining flight restrictions or a boundary of the geofencing device.
The geofencing device can self-determine a set of flight restrictions applicable to the geofencing device and/or a boundary of the geofencing device. A set of flight controls (and/or boundaries) may be based on information from an air traffic control system, user input, environmental condition information (e.g., ambient climate, environmental complexity, population density, traffic flow), airway status (e.g., air traffic flow status), information from surrounding geo-fencing devices, and/or information from one or more unmanned aerial vehicles. Any flight control may be modified or updated in response to any of the factors described or information from any of the sources described. The updating or changing may be done in real-time, may be done periodically (e.g., at regular or irregular intervals), according to a schedule, or in response to a detected event or condition.
Type of restriction
As previously described, any type of flight control may be imposed on the operation of the UAV. Any type of flight control as previously described may be imposed in response to the presence of the geo-fencing device. The geofence device can have one or more geofence boundaries that can be associated with the set of flight restrictions.
A set of flight restrictions may include limits on the behavior of the unmanned aerial vehicle. For example, an unmanned aerial vehicle may be restricted from entering an area defined by the geo-fencing device. Other examples of limitations may include, but are not limited to, limiting presence, allowing presence, altitude limits, linear velocity limits, angular velocity limits, linear acceleration limits, angular acceleration limits, time limits, payload usage limits, aerial photography limits, limits on sensor operation (e.g., turning a particular sensor on or off, not collecting data using a sensor, not recording data from a sensor, not transmitting data from a sensor), emission limits (e.g., limiting emissions within a specified electromagnetic spectrum, which may include visible, infrared, or ultraviolet light, or limiting sound or vibration), limits on changes in the appearance of the UAV (e.g., limiting deformation of the UAV), limits on wireless signals used for communication (e.g., frequency band, frequency, protocol) or on changes in the communications used, limits on items carried by the unmanned aerial vehicle (e.g., item type, item weight, item size, item material), limits on actions to be performed on or with the items (e.g., item drop or delivery, item pick up), limits on operation of the unmanned aerial vehicle carrier, limits on power usage or management (e.g., requiring sufficient remaining battery capacity), limits on landing, limits on takeoff, or limits on any other use of the unmanned aerial vehicle. Any of the examples given elsewhere herein with respect to flight restrictions may be applicable to unmanned aerial vehicles as possible limitations associated with the presence of geofencing devices.
The geo-fencing devices can be of different types. In some cases, different types of geofencing devices may impose different types of flight restrictions. Examples of types of flight restrictions may include any of the above-described flight restrictions or any of the flight controls described elsewhere herein. In some cases, different geofencing devices of different types may have different boundaries (e.g., different shapes, sizes, changing conditions) with respect to the geofencing device.
As previously described, different restrictions may be imposed based on the identity of the UAV, the identity of the user, and/or the identity of the geofencing device. Different restrictions may be imposed for different types of unmanned aerial vehicles, different types of users, and/or different types of geo-fencing devices. Different restrictions may be provided for unmanned aerial vehicles having different operating levels, users having different operating levels, and/or geo-fencing devices having different operating levels.
Flight response measures may occur when the UAV does not act in compliance with a set of flight restrictions. Control of the unmanned aerial vehicle may be taken over from the user. The control may be taken over by the unmanned aerial vehicle itself, which automatically performs the flight response measures according to instructions onboard the unmanned aerial vehicle; by an air traffic control system, which sends instructions that cause the unmanned aerial vehicle to perform the flight response measures according to instructions at the air traffic control system; or by another user having a higher operating level than the original user of the unmanned aerial vehicle. The instructions may be provided from a source remote from the unmanned aerial vehicle, the source having a higher privilege than the privilege of the original user. The takeover may be reported to the air traffic control system. The unmanned aerial vehicle may enact flight response measures. Flight response measures may be provided in accordance with the flight restrictions that are not complied with by the unmanned aerial vehicle.
For example, if an unmanned aerial vehicle is located in an area where its presence is not allowed, the unmanned aerial vehicle may be forced to leave the area, return to a starting or homing point, or land. The UAV may be given some time before takeover occurs for the user to take the UAV out of the area. If an unmanned aerial vehicle leaves the only region where the unmanned aerial vehicle is permitted to fly, the unmanned aerial vehicle may be forced to return to the region or land. The UAV may be given some time for the user to return the UAV to the area before control is taken over. The payload may be automatically turned off if the UAV is operating a payload in an area where the UAV is not permitted to operate the payload. If the UAV is collecting information using sensors in an area where it is not permitted to collect or transmit such data, the sensors may be turned off, may stop recording the data being collected, or may be prevented from transmitting the collected data. In another example, if wireless communication is not allowed within the region, the communication unit of the unmanned aerial vehicle may be turned off to prevent wireless communication. If the unmanned aerial vehicle is not allowed to land in the area and the user provides a landing instruction, the unmanned aerial vehicle may not land but hover. If the UAV must maintain a certain battery charge level within the region and the charge level drops below the desired level, the UAV may be automatically directed to a battery charging station, may be navigated outside the region, or may be forced to land. In any case, the user may be given some time to comply, or the flight response measures may take effect immediately. Any other type of flight response measure may be provided that may allow the unmanned aerial vehicle to comply with the flight restrictions it has not met.
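The sketch below illustrates one possible mapping from a violated restriction to a flight response measure, with a grace period before takeover; the measure names and the 30-second grace period are assumptions rather than requirements of the patent.

GRACE_PERIOD_S = 30

RESPONSE_MEASURES = {
    "presence_forbidden":  "leave_area_or_land",
    "must_stay_in_area":   "return_to_area",
    "payload_forbidden":   "disable_payload",
    "no_wireless":         "disable_communication_unit",
    "no_landing":          "hover_instead_of_landing",
    "min_battery":         "route_to_charging_station",
}

def respond_to_violation(restriction, seconds_in_violation):
    """Return the measure to enact, or None while the user still has time to comply."""
    if seconds_in_violation < GRACE_PERIOD_S:
        return None   # give the user some time to comply before control is taken over
    return RESPONSE_MEASURES.get(restriction, "return_to_home")

print(respond_to_violation("payload_forbidden", 45))  # -> disable_payload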
In some cases, the geofencing devices may be used to define a flight restriction zone, where one or more sets of flight controls may be applicable. The geofence boundary may be a perimeter of the flight-restriction region. In some cases, the same set of flight restrictions may apply within or outside of the flight restriction region for a particular UAV in a particular mission.
Optionally, a geo-fencing device may be used to define a plurality of flight restriction zones. FIG. 24 illustrates an example of a geo-fencing device that may have multiple flight restriction zones. The geo-fencing device 2410 may be used to define a first flight restriction zone (e.g., zone A 2420a) and a second flight restriction zone (e.g., zone B 2420b). Optionally, a third flight restriction zone (e.g., zone C 2420c) may be provided. Any number of flight restriction zones can be defined by the geo-fencing device. For example, one or more, two or more, three or more, four or more, five or more, six or more, seven or more, eight or more, nine or more, ten or more, eleven or more, twelve or more, 15 or more, 20 or more, 25 or more, 30 or more, 50 or more, or 100 or more flight restriction zones may be defined by the geo-fencing device.
The zones may or may not overlap. In one case, zone A, zone B, and zone C may be separate, non-overlapping zones. For example, zone A may have a circular shape. Zone B may be ring-shaped (e.g., the zone may be inside the outer boundary of zone B and outside the outer boundary of zone A). Zone C may be a rectangle with a hole (e.g., the zone may be inside the outer boundary of zone C and outside the outer boundary of zone B). The outer boundaries of the regions may be concentric such that the boundaries do not intersect one another. Alternatively, the outer boundaries of the regions may intersect one another. In some cases, the regions may overlap. In some cases, one zone may be located entirely within another zone. For example, zone A may be a circle. Zone B may be a circle without a hole. Zone A may be located entirely within zone B. In some cases, an inner zone may have all the limitations of an outer zone if one zone is located within another zone.
The zones may be of any size or shape. The zone may be defined by a two-dimensional boundary. The space above or below the two-dimensional boundary may be part of the zone. For example, a region may have two-dimensional boundaries in the shape of a circle, oval, triangle, square, rectangle, any quadrilateral, bar, pentagon, hexagon, octagon, crescent, donut, star, or any other regular or irregular shape. The shape may or may not include one or more holes therein. The zone may be defined by a three-dimensional boundary. The space enclosed within the three-dimensional boundary may be a portion of the region. For example, the zones may have a spherical shape, a cylindrical shape, a prismatic shape (having any shape in cross-section), a hemispherical shape, a bowl shape, a donut shape, a bell shape, a walled shape, a conical shape, or any regular or irregular shape. The different zones defined by the geofence may have the same shape or may have different shapes. The different zones defined by the geo-fencing device can have different sizes.
In some cases, the geo-fencing device can be located within the outer boundaries of all zones. Alternatively, the geo-fencing device can be located outside the outer boundaries of one or more zones. The zones may all be spaced apart and not intersecting. However, all zones of the geo-fencing device can be located with reference to the geo-fencing device. If the geo-fencing device is to move within the environment, the zones of the geo-fencing device can move with the device. For example, if a geofencing device moves approximately 10 meters eastward, then the zones of the geofencing device may correspondingly move 10 meters eastward. In some embodiments, the zones may be radially symmetric. A zone may remain unchanged regardless of how the geo-fencing device is rotated. Alternatively, the zones may not be radially symmetric. For example, if the geo-fencing device is rotated, the zone may be rotated accordingly. The zone can rotate about the geo-fencing device. For example, if the geo-fencing device is rotated 90 degrees clockwise, the zone may be rotated 90 degrees clockwise about a point at the geo-fencing device. In some cases, rotation of the geo-fencing device may not affect the location of the zone.
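One way to realize zones that move and rotate with the geo-fencing device is to store each zone boundary as local offsets from the device and transform them into world coordinates, as in the sketch below (the coordinate conventions and names are assumptions).

import math

def zone_vertices_world(device_xy, device_heading_deg, local_vertices):
    """Translate/rotate a zone's local boundary vertices (meters) into world coordinates."""
    h = math.radians(device_heading_deg)
    cos_h, sin_h = math.cos(h), math.sin(h)
    world = []
    for x, y in local_vertices:
        wx = device_xy[0] + x * cos_h - y * sin_h
        wy = device_xy[1] + x * sin_h + y * cos_h
        world.append((wx, wy))
    return world

# If the device moves 10 m east, every zone vertex moves 10 m east with it;
# if the device rotates 90 degrees, a non-symmetric zone rotates about it.
square = [(-50, -50), (50, -50), (50, 50), (-50, 50)]
print(zone_vertices_world((10.0, 0.0), 90.0, square))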
In some cases, each flight restriction zone may have its own restrictions. A set of flight restrictions may be associated with the different flight restriction zone boundaries and the corresponding restrictions. For example, a set of flight restrictions may include a boundary for zone A, a boundary for zone B, a boundary for zone C, a limit for zone A, a limit for zone B, and/or a limit for zone C. Different zones may have different restrictions. In one example, zone A may restrict flight such that no unmanned aerial vehicle may enter zone A. Zone B may allow flight but may prevent the unmanned aerial vehicle from operating the camera within zone B. Zone C may allow flight and camera use, but may not allow the unmanned aerial vehicle to fly below a lower altitude limit. The instructions for these different restrictions may be provided in a set of flight restrictions. Different zones may have limitations on different aspects of unmanned aerial vehicle operation. Different zones may have limitations on the same aspect of unmanned aerial vehicle operation, but at different levels. For example, zone A, zone B, and zone C may all limit the flight of the unmanned aerial vehicle. However, in different zones, the flight of the unmanned aerial vehicle may be restricted in different ways. For example, in zone A, no unmanned aerial vehicle may be permitted to enter at all. In zone B, the unmanned aerial vehicle may have to fly above a lower altitude limit, where the altitude of the lower altitude limit increases with increasing distance from zone A. In zone C, the unmanned aerial vehicle may not fly below a lower altitude limit that may be maintained at a substantially level altitude, where the lower altitude limit of zone C matches the highest point of the lower altitude limit of zone B.
Similarly, each zone may have its own set of boundaries, which may be the same as or different from the boundaries of other zones. Each set of boundaries may correspond to a different set of flight restrictions. In some cases, the type of flight restrictions for each set of boundaries may be the same. Alternatively, the type of flight restrictions for each set of boundaries may be different.
In some embodiments, the zone closest to the geo-fencing device may have the strictest limits. In some embodiments, a zone closer to the geo-fencing device may have stricter restrictions than a zone farther from the geo-fencing device. Zones farther away from the geo-fencing device may have fewer or less stringent restrictions than zones closer to the geo-fencing device. For example, in zone A, no unmanned aerial vehicle may be allowed to enter. In zone B, the unmanned aerial vehicle may be allowed to fly above a first lower altitude limit. In zone C, the unmanned aerial vehicle may be allowed to fly above a second lower altitude limit that is lower than the first lower altitude limit. In some embodiments, all restrictions in zones farther away from the geo-fencing device may apply to all zones closer to the geo-fencing device. Thus, a zone closer to the geo-fencing device may have additional restrictions on top of those of the zones farther out. For example, zone C may have a set of limits. Zone B may have all of the zone C limits plus additional zone B limits. Zone A may have all of the zone C limits, the additional zone B limits, and additional limits from zone A. For example, zone C may allow an unmanned aerial vehicle to fly anywhere within the zone and operate the payload but not land; zone B may likewise not allow the unmanned aerial vehicle to land, and may also prevent the payload from operating on the unmanned aerial vehicle, while still allowing the unmanned aerial vehicle to fly at any point; and zone A may not allow the unmanned aerial vehicle to land, may prevent operation of the payload, and may additionally require the unmanned aerial vehicle to fly above a lower altitude limit.
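A minimal sketch of this nesting, in which an inner zone inherits the restrictions of every zone outside it and adds its own, is shown below; the restriction values are illustrative only.

ZONE_RESTRICTIONS = [
    # (zone name, restrictions added by this zone), outermost first
    ("C", {"landing": "forbidden"}),
    ("B", {"payload": "forbidden"}),
    ("A", {"min_altitude_m": 100}),
]

def restrictions_for(zone_name):
    """Accumulate restrictions from the outermost zone inward to zone_name."""
    combined = {}
    for name, added in ZONE_RESTRICTIONS:
        combined.update(added)
        if name == zone_name:
            return combined
    raise ValueError(f"unknown zone {zone_name!r}")

print(restrictions_for("A"))
# {'landing': 'forbidden', 'payload': 'forbidden', 'min_altitude_m': 100}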
In other embodiments, the zones may have restrictions that are independent of one another. The area closer to the geo-fencing device need not be more restrictive than other areas. For example, zone a may prevent the unmanned aerial vehicle from operating the payload but may allow the unmanned aerial vehicle to fly anywhere, zone B may allow the unmanned aerial vehicle to operate the payload but may prevent the unmanned aerial vehicle from flying above an upper altitude limit, and zone C may prevent wireless communications from the unmanned aerial vehicle while the unmanned aerial vehicle is capable of flying anywhere and operating the payload.
Unmanned aerial vehicle navigation
One or more unmanned aerial vehicles may navigate through the region. The unmanned aerial vehicle may travel along a flight path. The flight path may be predetermined, semi-predetermined, or may be created in real time.
For example, the entire flight path may be predetermined. Each position along the flight path may be predetermined. In some cases, the flight path may include a location of the unmanned aerial vehicle within the region. In some cases, the flight path may also include an orientation of the UAV at the location. In one example, the predetermined flight path may predetermine a position and orientation of the unmanned aerial vehicle. Alternatively, only the position of the unmanned aerial vehicle may be predetermined, and the orientation of the unmanned aerial vehicle may not be predetermined and may be variable. Other functions of the UAV may or may not be predetermined as part of the predetermined flight path. For example, payload usage may be predetermined as part of the flight path. For example, an unmanned aerial vehicle may carry an image capture device. The position at which the image capture device is turned on or off, the zoom, mode, or other operating characteristics of the image capture device at various positions along the path may be predetermined. In some cases, the positioning (e.g., orientation) of the image capture device relative to the UAV may also be predetermined as part of the flight path. For example, the image capture device may have a first orientation relative to the unmanned aerial vehicle at a first location, and then may switch to a second orientation relative to the unmanned aerial vehicle at a second location. In another example, the wireless communication may be predetermined as part of the flight path. For example, it may be predetermined that the unmanned aerial vehicle will use a certain communication frequency at a first portion of the flight path and then switch to a different communication frequency at a second portion of the flight path. Any other operational function of the unmanned aerial vehicle may be predetermined as part of the predetermined flight path. In some embodiments, aspects of unmanned aerial vehicle operation that are not predetermined as part of the flight path may be variable. The user may be able to provide input that may control one or more variable features of the operation of the unmanned aerial vehicle as the unmanned aerial vehicle traverses the flight path. The user may or may not be able to alter the predetermined portion of the predetermined flight path.
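A possible data structure for such a predetermined flight path, in which position, orientation, payload behavior, and communication frequency can be fixed or left variable per waypoint, is sketched below; all field names and values are assumptions for illustration.

from dataclasses import dataclass
from typing import Optional

@dataclass
class Waypoint:
    position: tuple                        # (latitude, longitude, altitude_m)
    heading_deg: Optional[float] = None    # None -> orientation left variable / user-controllable
    camera_on: bool = False
    camera_orientation_deg: float = 0.0    # payload orientation relative to the UAV
    comm_frequency_mhz: float = 2400.0

flight_path = [
    Waypoint((22.54, 113.95, 60), heading_deg=90, camera_on=True,
             camera_orientation_deg=0, comm_frequency_mhz=2400.0),
    Waypoint((22.55, 113.96, 80), heading_deg=None, camera_on=True,
             camera_orientation_deg=45, comm_frequency_mhz=5800.0),
    Waypoint((22.56, 113.97, 60), camera_on=False),
]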
In another example, the flight path may be semi-predetermined. Some parts or checkpoints may be provided for the flight path, which may be predetermined. The non-predetermined portion may be variable and/or user controllable. For example, a series of waypoints may be predetermined for the flight path. The unmanned aerial vehicle flight path between waypoints may be variable. However, even though the path between waypoints may vary, the unmanned aerial vehicle may be directed to each waypoint. In some cases, the final destination may be predetermined. The entire path to the final destination may be variable and/or user controllable.
In another example, the path may be created in real-time. The entire flight may be controlled by the user. The user may manually control the unmanned aerial vehicle without any schedule or predetermined path or goal. The user may freely maneuver the unmanned aerial vehicle within the environment. A flight path of the UAV may be created as the UAV traverses the environment.
Geofence devices within an area may affect the flight path of the unmanned aerial vehicle. In some cases, geofencing devices within the environment may be considered and one or more sets of flight restrictions may be imposed on the unmanned aerial vehicle operating within the environment. The unmanned aerial vehicle behavior may or may not be altered by the geofencing device. If the actions of the UAV do not comply with a set of flight regulations, the UAV behavior may be altered. If the actions of the UAV comply with a set of flight regulations, the UAV behavior may optionally not be altered. The geofence device may be considered when the unmanned aerial vehicle is traversing a predetermined flight path, a semi-predetermined flight path, or a real-time flight path.
Fig. 42 illustrates an example of an environment with an unmanned aerial vehicle that can be traversing a flight path within the environment and one or more geo-fencing devices. One or more unmanned aerial vehicles (e.g., unmanned aerial vehicle A 4210a, unmanned aerial vehicle B 4210b) may traverse the environment along a flight path. The flight paths (e.g., path A, path B, path C) may be predetermined, may be semi-predetermined, or may be determined in real time. The unmanned aerial vehicle may optionally fly to a destination 4220. The destination may be a predetermined destination or may be a destination determined in real time. The destination may be the final destination or may be a waypoint along the path. The destination may be any location that may be a target of the unmanned aerial vehicle. One or more geo-fencing devices (e.g., GF1 4230a, GF2 4230b, GF3 4230c, GF4 4230d, or GF5 4230e) may be provided within the environment. The geo-fencing devices can have geo-fencing device boundaries.
The unmanned aerial vehicle can operate according to a set of flight restrictions that can be associated with one or more geo-fencing devices. As described elsewhere herein, any interaction may be provided between the unmanned aerial vehicle and the geo-fencing device. As described elsewhere herein, any interaction may be provided between the air traffic control system (or other external device or system) and the unmanned aerial vehicle and/or the geo-fencing device. The interaction may result in the generation of a set of flight restrictions that may be provided to or generated onboard the UAV. The set of flight restrictions may include one or more restrictions imposed by geofencing devices within the area.
In one example, unmanned aerial vehicle 4210a may proceed toward destination 4220. One or more geo-fence devices 4230a, 4230b may be provided between the unmanned aerial vehicle and the destination. The geo-fencing device may have a boundary that may fall between the unmanned aerial vehicle and the destination. In some cases, the geofence device boundary may obstruct the path of the unmanned aerial vehicle toward its destination. The unmanned aerial vehicle may optionally have a flight trajectory. In some cases, if the unmanned aerial vehicle is to continue along the flight trajectory, the unmanned aerial vehicle may encounter a boundary of the geofencing device en route to the destination. The trajectory may be such that the unmanned aerial vehicle follows a predetermined flight path, a semi-predetermined flight path, or a real-time flight path.
In one example, the originally planned flight path may intersect a restricted area within the boundary of the geofencing device. If this is the case, the path may be changed to another path (e.g., path A) that may avoid the geofencing device and keep the unmanned aerial vehicle outside of the restricted area. Path A may be calculated to enable the unmanned aerial vehicle to reach the destination while avoiding the restricted area. In some cases, path A may be selected to enable the unmanned aerial vehicle to reach the destination with a relatively small amount of deviation from the predetermined path. The minimum amount of deviation possible can be calculated. Alternatively, the amount of deviation may be within 50% or less, 40% or less, 30% or less, 20% or less, 10% or less, 5% or less, or 1% or less of the smallest amount of deviation possible. For example, the GF1 and GF2 boundaries may be determined to overlap such that the unmanned aerial vehicle may not pass between the geo-fencing devices. The unmanned aerial vehicle may choose to take a path around the GF2 side or the GF1 side. The GF2 path may be shorter or deviate less from the original path. Thus, path A may be selected to go around the GF2 side. In some cases, path A may be selected taking into account environmental conditions. If the wind is blowing strongly, the unmanned aerial vehicle can take a wider path to avoid the boundary, providing greater assurance that the unmanned aerial vehicle will not be inadvertently blown into the restricted area. Other metrics, such as energy efficiency, may be considered. The unmanned aerial vehicle may be directed to a path having a relatively high level of energy efficiency. The predetermined path may thus be altered to avoid the geofence device while the UAV is in flight.
In other cases, a predetermined path may be calculated or generated to avoid the geofencing device in advance. For example, the user may enter a suggested path, waypoint, or destination. The user may indicate that the user wishes the UAV to reach a particular destination (which may be a waypoint). The air traffic control system or any other system described elsewhere herein can collect data about the geo-fencing devices in the area. The air traffic control system can detect the location and/or boundary of the geo-fencing device. The user may optionally suggest a flight path to the destination. The air traffic control system may accept, reject, or alter the path. In some cases, the air traffic control system may suggest a path (e.g., path A) that may allow the unmanned aerial vehicle to reach the destination while avoiding the restricted area. The suggested path may be selected to have a relatively small amount of deviation from the path originally suggested by the user. The user may choose to accept or reject the suggested path. Alternatively, the path proposed by the air traffic control system may be implemented automatically. Thus, the predetermined flight path of the unmanned aerial vehicle may already take the geofencing devices into account and plot the path past each geofencing device to get the unmanned aerial vehicle to the destination. In some implementations, the user need not suggest the entire path but may suggest one or more destinations. The air traffic control system may take into account the geofence device information and may generate a predetermined flight path that allows the unmanned aerial vehicle to reach the destination without entering the restricted area. In some cases, multiple possible paths to avoid the restricted area may be considered and a single path may be selected from the multiple paths.
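The sketch below illustrates one way of choosing between candidate detours (for example, around the GF1 side or the GF2 side) by penalizing deviation from the original path and preferring wider clearance in strong wind; the scoring weights are assumptions.

def path_length(path):
    return sum(((x2 - x1) ** 2 + (y2 - y1) ** 2) ** 0.5
               for (x1, y1), (x2, y2) in zip(path, path[1:]))

def choose_detour(original, candidates, wind_speed_mps=0.0):
    """Pick the candidate whose length deviates least from the original path,
    preferring wider clearances (clearance_m per candidate) in strong wind."""
    base = path_length(original)
    def score(candidate):
        path, clearance_m = candidate
        deviation = abs(path_length(path) - base)
        wind_penalty = max(0.0, wind_speed_mps - 8.0) * max(0.0, 50.0 - clearance_m)
        return deviation + wind_penalty
    return min(candidates, key=score)[0]

original = [(0, 0), (100, 0)]
around_gf2 = ([(0, 0), (50, 20), (100, 0)], 30.0)   # shorter, tighter clearance
around_gf1 = ([(0, 0), (50, 60), (100, 0)], 80.0)   # longer, wider clearance
print(choose_detour(original, [around_gf2, around_gf1], wind_speed_mps=12.0))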
In another example, the semi-predetermined flight path may position the unmanned aerial vehicle on a trajectory that intersects a restricted area within the boundaries of the geofencing device. For example, a destination may be entered and the unmanned aerial vehicle may travel toward the destination. If this is the case, the path may be changed to another path (e.g., path A) that may avoid the geofencing device and keep the unmanned aerial vehicle outside of the restricted area. The path a may be calculated to enable the unmanned aerial vehicle to reach the destination while avoiding the restricted area. In some cases, path a may be selected to cause the unmanned aerial vehicle to reach the destination with a relatively small amount of deviation based on the previous trajectory. A new path may be selected that takes into account environmental conditions or other conditions. The semi-predetermined path may thus be altered while the unmanned aerial vehicle is in flight to avoid the geofence device, yet still allow the unmanned aerial vehicle to reach the destination.
In some cases, a semi-predetermined path may be calculated or generated to avoid the geofencing device in advance. For example, the user may enter a suggested destination (e.g., final destination, waypoint). The air traffic control system or any other system described elsewhere herein can collect data about the geo-fencing devices in the area. The air traffic control system can detect the location and/or boundary of the geo-fencing device. The air traffic control system can determine whether the suggested destination is located within a restricted area, which can be located within the boundaries of the geo-fencing device. If the destination is not located within the restricted area, the suggested destination may be accepted. The air traffic control system may reject the destination if the destination is located in a restricted area that the unmanned aerial vehicle will not be allowed to enter. In some cases, the air traffic control system may suggest another destination that may be outside the restricted area but near where the original destination was located. In generating the new proposed destination, one or more factors may be considered, such as distance from the originally proposed destination, ease of the flight path approaching the new destination, or environmental conditions.
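A minimal sketch of such destination checking is given below: a proposed destination is accepted if it lies outside every restricted area, and otherwise a nearby point just outside the offending boundary is suggested. Circular zones and the margin value are simplifying assumptions.

import math

def validate_destination(dest, restricted_circles, margin_m=10.0):
    """restricted_circles: list of (center_xy, radius_m). Returns (accepted, suggestion)."""
    x, y = dest
    for (cx, cy), r in restricted_circles:
        d = math.hypot(x - cx, y - cy)
        if d < r:
            if d == 0:
                return False, (cx + r + margin_m, cy)
            # Push the destination radially outward, just past the boundary.
            scale = (r + margin_m) / d
            return False, (cx + (x - cx) * scale, cy + (y - cy) * scale)
    return True, dest

print(validate_destination((5.0, 0.0), [((0.0, 0.0), 100.0)]))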
In addition, the user can manually control the unmanned aerial vehicle in real time. The unmanned aerial vehicle can have a trajectory that can indicate that the unmanned aerial vehicle is about to enter a restricted area within the geofence device boundary. If this is the case, the path may be changed to another path (e.g., path A) that may avoid the geofencing device and keep the unmanned aerial vehicle outside of the restricted area. In some cases, the user may be alerted that the user is approaching a boundary and optionally given some time to correct course. If the user does not correct course within the allotted time, control may be taken over from the user. Alternatively, the path may be automatically changed without giving the user time to make the correction. The takeover may cause the UAV to fly along a modified path (e.g., path A). Path A may be calculated to enable the unmanned aerial vehicle to reach the planned destination while avoiding the restricted area. When formulating path A, one or more factors may be considered, such as deviation from the original trajectory, energy efficiency, or environmental conditions. Thus, the real-time path can be altered to avoid the geofence device while the UAV is in flight.
In some embodiments, a local navigation map may be provided for the unmanned aerial vehicle. The unmanned aerial vehicle may receive the local navigation map from an air traffic control system or other external device. The unmanned aerial vehicle can receive the local navigation map from the geofencing device. The local navigation map can include the locations of one or more geo-fencing devices. The local navigation map may include the boundaries of one or more geo-fencing devices. The local navigation map may include information about restrictions that may be imposed on the unmanned aerial vehicle in various regions. For example, if the unmanned aerial vehicle is not allowed to fly within the boundaries of a geofence device, the local map may indicate on the map the restriction on flight in that region. In another example, if the unmanned aerial vehicle is not permitted to operate the payload within the boundaries of another geo-fencing device, the local map may indicate on the map the limits on payload operation in that region. One or more sets of flight restrictions for the unmanned aerial vehicle may be reflected in the local navigation map.
The unmanned aerial vehicle may use a local navigation map to navigate through the region. In some cases, the local navigational map may include a predetermined path of the unmanned aerial vehicle drawn therein. The predetermined path may have considered a geo-fencing device. The unmanned aerial vehicle may then be able to follow the predetermined path. If the unmanned aerial vehicle deviates from the predetermined path, adjustments may be made if necessary. The current position of the UAV may be compared to where the UAV should be on a map. If the predetermined path does not take into account the geo-fencing device and is seen entering the restricted area, the predetermined path can be updated and the map information can be updated to reflect the information.
In some implementations, the local navigation map can include one or more destinations for the unmanned aerial vehicle as part of the semi-predetermined path. The destinations may include waypoints for the unmanned aerial vehicle. The destinations may already take into account the geo-fencing device. For example, a destination may be selected at a location that is not within the restricted area. The unmanned aerial vehicle may be capable of traveling from destination to destination. If one or more restricted areas are encountered, adjustments may be made as necessary. If the destinations do not take into account a geo-fencing device and one or more destinations are located within the restricted area, the destinations can be updated to locations outside the restricted area and the map information can be updated to reflect this.
Alternatively, the unmanned aerial vehicle may be operating along a real-time path. The local navigation map may track the position of the unmanned aerial vehicle relative to the one or more geo-fencing devices. If the UAV is seen approaching a restricted area of flight, adjustments may be made to alter the UAV path if necessary.
In some cases, as the UAV traverses the environment, the local navigation map may be updated to reflect information about the environment in the vicinity of the UAV. For example, as the UAV approaches a new portion of the environment, the local navigation map for the UAV may reflect information about the new portion of the environment. In some implementations, if the UAV leaves a previous portion of the environment, the local navigation map may no longer reflect information about the previous portion of the environment. In some cases, the local navigation map may be updated by an air traffic control system. In other cases, one or more geo-fencing devices may update the map. When an unmanned aerial vehicle approaches a geofencing device, the geofencing device can provide the unmanned aerial vehicle with information that can be used to update the local map. As the unmanned aerial vehicle encounters different geo-fence devices while performing its mission, its map can be updated according to the local information held by each of those geo-fence devices.
The geofencing device may impose limitations on unmanned aerial vehicle operation. As previously mentioned, one example may be a restriction on the flight of the unmanned aerial vehicle. For example, the unmanned aerial vehicle may not be allowed to fly within the boundaries of the geofencing device (i.e., the restricted-flight area). Other examples of flight restrictions imposed by the geofencing device may include payload operation restrictions, payload positioning restrictions, carrier restrictions, carried object restrictions, sensor restrictions, communication restrictions, navigation restrictions, power usage restrictions, or any other type of restriction. The unmanned aerial vehicle may engage in activities that may or may not comply with the restrictions. The flight path of an unmanned aerial vehicle may be affected by different types of restrictions. Different methods of dealing with different types of flight restrictions may be provided.
An unmanned aerial vehicle (e.g., unmanned aerial vehicle B 4210b) may be heading toward destination 4220. If the unmanned aerial vehicle were to follow the most direct path (e.g., path B), the unmanned aerial vehicle may enter an area within the boundaries of a geo-fence device (e.g., GF4 4230d). The restrictions within the region may relate to factors other than the presence of the unmanned aerial vehicle. For example, the limit may be a lower flight limit. The unmanned aerial vehicle may be required to fly above a particular altitude. If the UAV is able to reach the altitude, the UAV may be allowed to travel along path B to the destination. However, if the UAV is unable to reach the altitude, or if reaching the altitude would result in a greater deviation in the UAV flight path than bypassing the region (e.g., along path C), the UAV may be commanded to travel around the region (e.g., along path C). In another example, the limitation may be on an operation of the payload (e.g., capture of an image). If the UAV is able to turn off its camera or refrain from capturing images using its camera, the UAV may travel along path B (with its camera off while located within the region) and then be able to turn its camera back on when it leaves the region. However, if the UAV is unable to turn its camera off or stop capturing images, or if turning the camera off is not desirable, the UAV may be routed to bypass the region along path C. Thus, depending on the restrictions within the region, the unmanned aerial vehicle may be able to follow the original path or direction, or may be routed to bypass the region. If the UAV is unable to comply with the restrictions while located within the region, or if complying with the restrictions is less desirable than bypassing the region, the UAV may be routed to bypass the region. One or more factors may be considered in determining whether compliance with the restrictions within the region is undesirable. Factors such as environmental conditions, navigation needs, energy efficiency, safety, or mission objectives may be considered in determining whether the unmanned aerial vehicle should be routed to bypass the region or whether the unmanned aerial vehicle should comply with the regional restrictions.
This may be applicable when the unmanned aerial vehicle is flying on a predetermined path, a semi-predetermined path, or a real-time path. For example, when an unmanned aerial vehicle is flying on a predetermined path and the predetermined path traverses the region, the same determination may be made whether to maintain the predetermined path or change the path. When the unmanned aerial vehicle is flying on a semi-predetermined path toward a destination and the most direct path or trajectory traverses the region, the same determination may be made whether to remain on that direct path or trajectory or to change the path. When an unmanned aerial vehicle is flying according to real-time manual instructions from a user and the user is directing the unmanned aerial vehicle toward the region, the same determination may be made as to whether to follow user commands and allow the user to direct the unmanned aerial vehicle into the region or to take over control and alter the path.
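A minimal Python sketch of the comply-or-bypass determination described above is shown below; the restriction types, cost terms, and capability fields are illustrative assumptions rather than a definitive implementation.

```python
# Hypothetical decision sketch: keep the direct path while complying
# with an in-region restriction, or reroute around the region.
def choose_route(restriction, uav, direct_cost, detour_cost):
    """Return "comply" to keep the direct path under the restriction,
    or "bypass" to route around the restricted region."""
    if restriction["type"] == "min_altitude":
        climb_ok = uav["max_altitude"] >= restriction["value"]
        # Bypass if the UAV cannot climb, or if climbing costs more
        # deviation than simply flying around the region.
        if not climb_ok or restriction.get("climb_cost", 0) > detour_cost - direct_cost:
            return "bypass"
        return "comply"
    if restriction["type"] == "no_camera":
        return "comply" if uav["camera_can_disable"] else "bypass"
    # Unknown restriction types are treated conservatively.
    return "bypass"

uav = {"max_altitude": 100.0, "camera_can_disable": True}
print(choose_route({"type": "min_altitude", "value": 150.0}, uav, 10.0, 14.0))  # bypass
print(choose_route({"type": "no_camera"}, uav, 10.0, 14.0))                     # comply
```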
Geo-fence identification
The geo-fence device can be uniquely identifiable. In some implementations, the geo-fencing device can have its own unique geo-fencing device identifier. The geo-fence identifier can uniquely identify the geo-fence device from among other geo-fence devices. A geo-fence device can be distinguished from other geo-fence devices by its geo-fence identifier.
Fig. 40 shows an example of a system having multiple geo-fence devices, each device having a corresponding geo-fence identifier. The first geo-fence device 4010a can have a first geo-fence identifier (e.g., geo-fence ID1), the second geo-fence device 4010b can have a second geo-fence identifier (e.g., geo-fence ID2), and the third geo-fence device 4010c can have a third geo-fence identifier (e.g., geo-fence ID3). One or more unmanned aerial vehicles 4020a, 4020b may be located within an environment within which a geofencing device may be provided. In some embodiments, an air traffic control system 4030 or other external device may be provided that may provide multiple sets of flight restrictions. Any other architecture may be provided for the generation of flight restrictions, such as those described elsewhere herein. For example, flight restrictions may be generated or stored at an air traffic control system, one or more unmanned aerial vehicles, or one or more geo-fencing devices. The air traffic control system is provided by way of example only and not by way of limitation.
The geo-fence device can have a geo-fence identifier (e.g., geo-fence ID1, geo-fence ID2, geo-fence ID3...) that identifies the geo-fence device. The geo-fence identifier can be unique to the geo-fence device. Other geo-fence devices may have identifiers different from that of the geo-fence device. The geo-fence identifier can uniquely identify and/or distinguish the geo-fence device from other geo-fence devices. Each geo-fence device may be assigned only a single geo-fence identifier. Alternatively, the geo-fence device may be capable of registering multiple geo-fence identifiers. In some cases, a single geo-fence identifier may be assigned to only a single geo-fence device. Alternatively, a single geo-fence identifier may be shared by multiple geo-fence devices. In a preferred embodiment, a one-to-one correspondence may be provided between geo-fence devices and corresponding geo-fence identifiers.
Optionally, the geo-fence device can be authenticated as an authorized geo-fence device for the geo-fence identifier. The authentication process may include verification of the identity of the geo-fenced device. Examples of authentication processes are described in more detail elsewhere herein.
In some implementations, an ID registration database of the authentication system can hold identity information for the geo-fenced device. The ID registration database can assign a unique identifier to each geo-fence device. The unique identifier may optionally be a randomly generated alphanumeric string or any other type of identifier that can uniquely identify the geo-fence device from among other geo-fence devices. The unique identifier may be generated by an ID registration database or may be selected from a list of possible identifiers that have not yet been assigned. The identifier may be used to authenticate the geo-fence device. The ID registration database may or may not interact with one or more geo-fencing devices.
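For illustration only, the following Python sketch shows an ID registration database assigning a randomly generated alphanumeric identifier to each geo-fence device while enforcing a one-to-one assignment; the identifier format and class names are hypothetical.

```python
# Hypothetical ID registration database: each geo-fence device is
# assigned a unique, randomly generated alphanumeric identifier.
import secrets
import string

class IDRegistrationDatabase:
    def __init__(self):
        self._by_id = {}

    def register(self, device_info):
        """Assign and record a unique identifier for a geo-fence device."""
        alphabet = string.ascii_uppercase + string.digits
        while True:
            candidate = "GF-" + "".join(secrets.choice(alphabet) for _ in range(10))
            if candidate not in self._by_id:  # enforce one-to-one assignment
                self._by_id[candidate] = device_info
                return candidate

    def lookup(self, fence_id):
        return self._by_id.get(fence_id)

db = IDRegistrationDatabase()
gf_id = db.register({"model": "GF-Mark-I", "range_m": 500})
print(gf_id, db.lookup(gf_id))
```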
A set of flight restrictions related to a geo-fence device can be generated based on information about the geo-fence device. The information about the geo-fence device can include identification information about the geo-fence device. The identification information may include a geo-fence identifier or a geo-fence device type. In some implementations, the geo-fence identifier can indicate a geo-fence device type.
The geo-fencing device type can have any characteristic. For example, the geo-fence device type can indicate a model of the geo-fence device, a performance of the geo-fence device, a range of the geo-fence device (e.g., a predetermined range of the geo-fence device for detection or communication purposes), a boundary of the geo-fence device, a power capability (e.g., battery life) of the geo-fence device, a manufacturer of the geo-fence device, or a type of restriction imposed by the geo-fence device. The geofence identifier may uniquely identify the geofence device from other geofence devices. A geo-fence identifier can be received from a geo-fence device. In some cases, the geo-fencing device can have an identification module. The geo-fence identifier can be stored on the identification module. In some cases, the identification module may not be altered or removed from the geo-fencing device. The geo-fence device identifier may be tamper-resistant or tamper-proof.
One aspect of the invention can relate to a method of identifying a geo-fencing device, the method comprising: receiving a geo-fence identifier that uniquely identifies the geo-fence device from other geo-fence devices; generating a set of flight restrictions for the unmanned aerial vehicle based on the geofence identifier; and operating the UAV in accordance with the set of flight restrictions. A geo-fencing device identification system can be provided comprising: one or more processors operably configured to, individually or collectively: receive a geo-fence identifier that uniquely identifies the geo-fence device from other geo-fence devices; and generate a set of flight restrictions for the unmanned aerial vehicle based on the geofence identifier to allow operation of the unmanned aerial vehicle under the set of flight restrictions. The system may also include one or more communication modules, wherein the one or more processors are operably coupled to the one or more communication modules.
FIG. 25 illustrates a process for generating a set of flight restrictions according to an embodiment of the present invention. A geo-fence device identifier 2510 can be received. A set of flight restrictions 2520 may be provided based on the geo-fence device identifier.
The geofence device identifier may be received by a device or system that may generate a set of flight restrictions. For example, the geofence device identifier may be received by an air traffic control system. Alternatively, the geofence device identifier may be received by the unmanned aerial vehicle, one or more processors of the geofence device, a user terminal, a memory storage system, or any other component or system. The geo-fence device identifier can be received by one or more processors of any component or system. The same component or system may generate a set of flight restrictions. For example, one or more processors of the air traffic control system, the unmanned aerial vehicle, the geo-fencing device, the user terminal, the memory storage system, or any other component or system may generate a set of flight restrictions based on the geo-fencing device identifier.
A set of flight controls can be generated based on the identity of the geo-fencing device. A set of flight restrictions can be generated based on the geofence device type. Other factors such as unmanned aerial vehicle information, user information, environmental conditions, or timing may affect the generation of a set of flight controls.
Any type of flight restriction may be provided, such as those described elsewhere herein. Flight restrictions may be applicable to any aspect of unmanned aerial vehicle operation. The flight restrictions can be associated with a geofence device location and/or boundary. The set of flight restrictions can be applicable to the particular geofence device with which it is associated. Another geofence device may have a second set of flight restrictions, which may be applicable to it. In some examples, the set of flight restrictions requires that the unmanned aerial vehicle remain within at least a predetermined distance of the geofence device, or the set of flight restrictions requires that the unmanned aerial vehicle remain at least a predetermined distance away from the geofence device. The set of flight restrictions may include an upper flight limit above which the UAV cannot fly or a lower flight limit below which the UAV cannot fly when located at a predetermined position relative to the geofence device. The set of flight restrictions may include limitations on unmanned aerial vehicle payload usage based on a positioning of the unmanned aerial vehicle relative to a location of the geo-fence device, or the set of flight restrictions may include limitations on unmanned aerial vehicle communication unit usage based on a positioning of the unmanned aerial vehicle relative to a location of the geo-fence device.
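The following Python sketch suggests how a set of flight restrictions might be derived from a geo-fence identifier; the identifier-to-type lookup table and the rule values are hypothetical examples, not the actual generation logic.

```python
# Hypothetical sketch: derive a set of flight restrictions from a
# geo-fence identifier via a device-type lookup. Unknown or
# unauthenticated devices fall back to a conservative rule set.
FENCE_TYPES = {
    "GF-AIRPORT-01": "airport",
    "GF-PRIVATE-07": "private_property",
}

def generate_flight_restrictions(fence_id, uav_info=None, conditions=None):
    fence_type = FENCE_TYPES.get(fence_id, "unknown")
    if fence_type == "airport":
        # Keep the UAV at least a predetermined distance away.
        return {"min_distance_m": 5000, "max_altitude_m": 0, "payload_allowed": False}
    if fence_type == "private_property":
        # Flight permitted, but payload (e.g. camera) use restricted nearby.
        return {"min_distance_m": 0, "payload_allowed": False, "payload_radius_m": 200}
    # Conservative fallback for unknown device types.
    return {"min_distance_m": 1000, "payload_allowed": False}

print(generate_flight_restrictions("GF-AIRPORT-01"))
```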
A set of flight restrictions can be generated for each geofencing device. For example, the air traffic control system 4030 may generate and/or provide multiple sets of flight restrictions for various geofence devices. For example, unmanned aerial vehicle 4020a proximate to a first geo-fence device (e.g., geo-fence ID1 4010a) may be provided a set of flight restrictions for the first geo-fence device. The unmanned aerial vehicle can operate in compliance with the set of flight restrictions for the first geo-fence device. The unmanned aerial vehicle may communicate with the air traffic control system to receive the generated set of flight restrictions. In some cases, the unmanned aerial vehicle may inform the air traffic control system that the unmanned aerial vehicle is approaching the first geo-fence device. Alternatively, the geofencing device may inform the air traffic control system that the unmanned aerial vehicle is approaching the geofencing device.
A second unmanned aerial vehicle 4020b may be in proximity to another geofence device (e.g., geofence ID3 4010c). A second set of flight restrictions for the other geo-fencing device may be provided to the second unmanned aerial vehicle. The unmanned aerial vehicle can operate in compliance with the second set of flight restrictions for the other geo-fencing device. The unmanned aerial vehicle may communicate with the air traffic control system to receive the generated set of flight restrictions. In some cases, the unmanned aerial vehicle may inform the air traffic control system that the unmanned aerial vehicle is approaching the other geofencing device. Alternatively, the other geo-fence device may inform the air traffic control system that the unmanned aerial vehicle is approaching the other geo-fence device.
An unmanned aerial vehicle can receive a set of flight controls applicable to the unmanned aerial vehicle. For example, if an unmanned aerial vehicle is within range of a first geo-fence device but not within range of a second or third geo-fence device, the unmanned aerial vehicle may only receive a set of flight restrictions for the first geo-fence device. In some cases, only a set of flight restrictions for the first geo-fence device may be generated for the UAV. Similarly, the second UAV may only receive a set of flight restrictions for a third geo-fence device if the second UAV is within range of the third geo-fence device but not within range of either the first or second geo-fence devices. In some cases, only a set of flight restrictions for a third geo-fencing device may be generated for the second unmanned aerial vehicle.
Geo-fence authentication
The identity of the geo-fence device can be authenticated. The identity of the geo-fence device can be verified by undergoing an authentication process. The authentication process can confirm that the geo-fence device using the geo-fence identifier matches the geo-fence device to which the geo-fence identifier is registered.
One aspect of the invention relates to a method of authenticating a geo-fenced device, the method comprising: authenticating an identity of a geo-fence device, wherein the identity of the geo-fence device is uniquely distinguishable from other geo-fence devices; providing a set of flight controls for an unmanned aerial vehicle, wherein the flight controls relate to a location of a certified geofencing device; and operating the UAV in accordance with the set of flight controls. A geo-fencing device authentication system can comprise: one or more processors individually or collectively configured to: authenticating an identity of a geo-fence device, wherein the identity of the geo-fence device is uniquely distinguishable from other geo-fence devices; and generating a set of flight restrictions for the unmanned aerial vehicle, wherein the flight restrictions relate to a location of the certified geofencing device to allow operation of the unmanned aerial vehicle under the set of flight restrictions. The system may also include one or more communication modules, wherein the one or more processors are operably coupled to the one or more communication modules.
Fig. 26 illustrates a process for authenticating a geo-fence device according to an embodiment of the present invention. A geo-fence device identifier 2610 can be received. The identity of the geo-fence device can be authenticated 2620. A set of flight controls 2630 may be provided.
The geofence device identifier may be received by a device or system that may generate a set of flight restrictions. For example, the geofence device identifier may be received by an air traffic control system. Alternatively, the geofence device identifier may be received by the unmanned aerial vehicle, one or more processors of the geofence device, a user terminal, a memory storage system, or any other component or system. The geo-fence device identifier can be received by one or more processors of any component or system. The same component or system may generate a set of flight restrictions. For example, one or more processors of the air traffic control system, the unmanned aerial vehicle, the geo-fencing device, the user terminal, the memory storage system, or any other component or system may generate a set of flight restrictions based on the geo-fencing device identifier.
A set of flight restrictions can be generated after receiving the geo-fence device identifier. A set of flight restrictions can be generated after authenticating the identity of the geo-fencing device. A set of flight restrictions can be generated based on the identity of the geo-fencing device. A set of flight restrictions can also be generated without regard to the geofence device identity. Authentication of the geo-fencing device identity may be required prior to generating a set of flight restrictions. Alternatively, a set of flight restrictions can be generated even if the geo-fencing device is not authenticated. In some embodiments, a first set of flight restrictions may be provided for a geo-fence device if the geo-fence device is authenticated, and a second set of flight restrictions may be provided for the geo-fence device if the geo-fence device is not authenticated. The first set of flight restrictions and the second set of flight restrictions may be different. In some embodiments, the second set of flight restrictions may be more stringent or more restrictive than the first set. A set of flight restrictions can be generated based on the geofence device type. Other factors such as unmanned aerial vehicle information, user information, environmental conditions, or timing may affect the generation of a set of flight restrictions.
Any authentication process may be used to authenticate the geo-fence device. Any of the techniques described elsewhere herein for authenticating other devices may be applicable to authentication of geo-fence devices. For example, the process used in authenticating a user or an unmanned aerial vehicle may be applicable to a geo-fencing device. In one example, a geo-fence device can be authenticated by means of a key stored on the geo-fence device. The geofence key may be non-removable from the geofence device. Optionally, the key cannot be removed from the geo-fence device without damaging the geo-fence device. The key can be stored in an identification module of the geo-fence device. In some cases, the identification module cannot be removed from the geo-fencing device without damaging the geo-fencing device. The identification module may have any of the characteristics of any other type of identification module (e.g., unmanned aerial vehicle identification module) as described elsewhere herein.
Geo-fencing via authentication center
The certified unmanned aerial vehicle may broadcast its location and IMSI via a wireless link, with the broadcast information carrying a signature. In addition, the unmanned aerial vehicle and the authentication center may have negotiated and generated a reliable set of keys CK1 and IK1, referred to as SCS1 (secure communication set).
The geo-fence device is similarly authenticated by the authentication center. Specifically, similar to the certification of unmanned aerial vehicles, the geo-fence device and the authentication center may negotiate and produce reliable keys CK2 and IK2, referred to as SCS2.
The wireless channel for communication between the geo-fencing device and the UAV may be arranged for multiple access by way of channel multiplexing, such as time division, frequency division, or code division. The wireless information emitted by the unmanned aerial vehicle or the geo-fencing device can be transmitted in a signature-authenticated form, such as the form depicted in Fig. 16. Thus, when a message (MSG) is to be sent, the format of the transmitted information is as follows:
Equation 1: MSG1 || ((HASH(MSG1) || SCR()) ⊕ SCR(IK)) || IMSI

where MSG1 = MSG || RAND || TIMESTAMP || GPS
In Equation 1 above, SCR() may be a common cipher generator, and SCR(IK) may be an IK-derived data mask. In addition, in this specification, MSG is the original message, HASH() is a hash function, RAND is a random number, TIMESTAMP is the current timestamp, and GPS is the current location, included to avoid replay attacks.
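A minimal Python sketch of the broadcast format in Equation 1 is given below, assuming SCR() produces a random keystream when called without a seed and an IK-derived keystream when seeded; SHA-256 stands in for HASH() and for the keystream derivation purely for illustration, and all values are made-up examples.

```python
# Illustrative construction of Equation 1:
#   MSG1 || ((HASH(MSG1) || SCR()) XOR SCR(IK)) || IMSI
# where MSG1 = MSG || RAND || TIMESTAMP || GPS.
import hashlib
import os
import struct
import time

def scr(seed=None, length=32):
    """Keystream: random if no seed, otherwise derived from the seed (IK)."""
    if seed is None:
        return os.urandom(length)
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(seed + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:length]

def build_broadcast(msg, ik, imsi, gps):
    rand = os.urandom(8)
    timestamp = struct.pack(">Q", int(time.time()))
    msg1 = msg + rand + timestamp + gps                 # MSG1 = MSG || RAND || TIMESTAMP || GPS
    inner = hashlib.sha256(msg1).digest() + scr()       # HASH(MSG1) || SCR()
    mask = scr(seed=ik, length=len(inner))              # SCR(IK)
    masked = bytes(a ^ b for a, b in zip(inner, mask))  # XOR masking
    return msg1 + masked + imsi                         # MSG1 || (...) || IMSI

packet = build_broadcast(b"position report", ik=b"IK1-example",
                         imsi=b"460001234567890", gps=b"22.54N,114.06E")
print(len(packet), "bytes")
```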
Upon receiving the information from the unmanned aerial vehicle, the geo-fencing device may set up a network link with the authentication center and report the IMSI of the unmanned aerial vehicle. The authentication center may query whether the unmanned aerial vehicle is allowed to be present in the region. If the query indicates that the unmanned aerial vehicle is prohibited from entering the region, or indicates that restrictive information needs to be transmitted to the unmanned aerial vehicle, the authentication center may notify the geo-fence device via the network, and the geo-fence device will transmit the restrictive information by way of signature authentication.
The information sent by the geo-fence device may not be forged and may be controlled by the authentication center. After receiving the information sent by the geo-fencing device, the unmanned aerial vehicle may continue to provide encrypted transmissions via a CK1-protected link established with the authentication center via the remote control and/or the public communication network. In this manner, the unmanned aerial vehicle can send the geofence information it received to the authentication center for authentication. After successful confirmation that the geofence information is authentic, the unmanned aerial vehicle can interpret the contents sent by the geofence device. At the same time, it may report to the user via a remote control. In some examples, flight path corrections may be made by a user or by the unmanned aerial vehicle itself.
During flight, the unmanned aerial vehicle may announce its location or its destination. After the authentication center learns such information and finds that the unmanned aerial vehicle cannot enter the corresponding region, it may send out prohibition information, asking the unmanned aerial vehicle to return home. The process described above can be implemented reliably and can withstand various attacks. For example, if an unmanned aerial vehicle continues to enter a restricted area, the authentication center may record the intrusion of the unmanned aerial vehicle and report the intrusion to an associated regulatory authority.
Without going through the authentication center
The certified unmanned aerial vehicle can wirelessly broadcast relevant information bearing a digital signature, such as its ID, its location, and its channel, among other such information. After the geofencing device receives the above information from the unmanned aerial vehicle, it can connect to the air traffic control system via the network and report the IMSI of the unmanned aerial vehicle. The air traffic control system may then determine whether the UAV is allowed to be present in the area. If it is determined that the UAV may not enter the region or that restrictive information is required to inform the UAV, the air traffic control system may inform the geofence device over a network, and the geofence device may issue the restrictive information by way of signature authentication. The information sent by the geo-fence device may not be forged and may be controlled by the authentication center. The following describes a secure information channel from the certification center to the unmanned aerial vehicle.
Certain public key algorithms may be employed in embodiments of the present invention. A key pair may be selected according to a public key algorithm. The key pair consists of a public key and a private key. Thus, two key pairs are provided. The geo-fence device and the certification authority each control their own private keys. The geofence device private key controlled by the geofence device is denoted KP, and the corresponding public key is denoted KO. The certification authority private key controlled by the certification authority is denoted CAP, and the corresponding certification authority public key is denoted CAO.
In an example, when registering a geo-fence device, the key pair KP (private key of the geo-fence device) and KO (public key of the geo-fence device) can be assigned and issued by the certification authority. The private key (KP) of the geofence device is neither readable nor duplicable. Additionally, the certification authority can encrypt the public key (KO) of the geo-fence device using the private key (CAP) of the certification authority to generate a certificate C. The air traffic control system obtains the certificate C from the certification authority and sends it to the geo-fencing device. In addition, the MSG to be sent to the unmanned aerial vehicle may first be subjected to a hash function HASH to generate a digest D. The digest D may then be encrypted using the private key (KP) of the geo-fence device to form a signature S. Further, the geo-fencing device may transmit C, S, and the MSG to the UAV over the wireless channel.
After the UAV receives C, S, and the MSG, it can decrypt C using the known certification authority's public key (CAO) to obtain the geofence device's public key (KO), and can also decrypt the signature S using KO to obtain the decrypted digest D1. In addition, the unmanned aerial vehicle may perform the hash function HASH on the MSG to obtain the digest D. The UAV may compare the decrypted digest D1 with the digest D. If the two are the same, the unmanned aerial vehicle can authenticate the MSG sent by the geo-fencing device. Therefore, with the above-described signature process, the unmanned aerial vehicle and the authentication center can securely communicate with each other.
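The signature flow described above can be sketched with the third-party Python `cryptography` package as follows; the "certificate" C is simplified here to the certification authority's signature over the geo-fence device's public key bytes, and the message content is a made-up example rather than an actual restriction format.

```python
# Minimal sketch of the signature flow described above. Names (KP, KO,
# CAP, CAO, C, S) follow the text; C is the CA's signature over the
# geo-fence device's public key bytes, a simplification for illustration.
from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric import padding, rsa

# Key pairs: geo-fence device (KP/KO) and certification authority (CAP/CAO).
KP = rsa.generate_private_key(public_exponent=65537, key_size=2048)
KO = KP.public_key()
CAP = rsa.generate_private_key(public_exponent=65537, key_size=2048)
CAO = CAP.public_key()

KO_bytes = KO.public_bytes(
    serialization.Encoding.DER, serialization.PublicFormat.SubjectPublicKeyInfo)

# Registration: the certification authority certifies the device public key (C).
C = CAP.sign(KO_bytes, padding.PKCS1v15(), hashes.SHA256())

# Geo-fence device signs the restriction message MSG (signature S over its digest).
MSG = b"example restriction: no flight below 120 m within 500 m of this device"
S = KP.sign(MSG, padding.PKCS1v15(), hashes.SHA256())

# UAV side: verify C with CAO, then verify S with KO; verify() raises on failure.
CAO.verify(C, KO_bytes, padding.PKCS1v15(), hashes.SHA256())
KO.verify(S, MSG, padding.PKCS1v15(), hashes.SHA256())
print("geo-fence message authenticated")
```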
During flight, the unmanned aerial vehicle may announce its location or target. Upon receiving the location of the unmanned aerial vehicle, the authentication center may send restrictive information and request the unmanned aerial vehicle to return home if it is determined that the unmanned aerial vehicle is prohibited from entering the corresponding airspace. The above process can be performed securely and can withstand various attacks. If the unmanned aerial vehicle continues to enter, the certification authority may record the illegal entry of the unmanned aerial vehicle and report it to the regulatory body.
Alternatively, the geo-fence device may continue to broadcast the restrictive information in one direction. The authentication of the information is the same as the above-described process. The restrictive information may indicate flight privileges for various types of unmanned aerial vehicles in the area. In addition, the wireless channel used for communication between the geo-fencing device and the UAV may be arranged for multiple access by way of channel multiplexing, such as time, frequency, or code division.
The air traffic control system can actively push information
Based on the locations and planned routes of unmanned aerial vehicles that are about to fly and unmanned aerial vehicles that are in flight, the air traffic control system can determine in real time which geofences may be affected. In an example, the air traffic control system may prepare a list of geofences to avoid for each unmanned aerial vehicle. The geofences to be avoided by each UAV may vary depending on the UAV, the user of the UAV, and the level of the flight mission. In addition, the air traffic control system may send the list to the UAV over an encrypted channel between the air traffic control system and the UAV, and the list may then be forwarded to the user.
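As one illustrative sketch, the following Python code compiles a per-UAV list of geofences to avoid by intersecting the planned route with circular geofence boundaries and comparing a hypothetical mission clearance level against each fence's minimum level; all field names and values are assumptions.

```python
# Illustrative sketch only: compile a per-UAV list of geofences to avoid.
# A geofence is avoided when the planned route crosses its boundary and
# the flight's mission level is below the fence's minimum clearance level.
# (The UAV and user identities could further adjust the clearance level;
# that refinement is omitted from this sketch.)
def geofences_to_avoid(mission_level, planned_route, all_fences):
    avoid = []
    for fence in all_fences:
        if mission_level >= fence["min_clearance_level"]:
            continue  # this flight is cleared for the fence
        if any(_inside(point, fence) for point in planned_route):
            avoid.append(fence["fence_id"])
    return avoid

def _inside(point, fence):
    (x, y), (cx, cy) = point, fence["center"]
    return (x - cx) ** 2 + (y - cy) ** 2 <= fence["radius"] ** 2

fences = [
    {"fence_id": "GF1", "center": (0.0, 0.0), "radius": 500.0, "min_clearance_level": 3},
    {"fence_id": "GF2", "center": (2000.0, 0.0), "radius": 300.0, "min_clearance_level": 1},
]
route = [(100.0, 0.0), (1000.0, 0.0), (2000.0, 100.0)]
print(geofences_to_avoid(mission_level=1, planned_route=route, all_fences=fences))  # ['GF1']
```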
Before and during flight, the unmanned aerial vehicle with the electronic map may receive the information pushed by the air traffic control system. The unmanned aerial vehicle can also receive information regarding, for example, the location, coverage area, and coverage period of geofences proximate to the unmanned aerial vehicle and its flight corridor. The unmanned aerial vehicle may also send an explicit acknowledgement of receipt back to the air traffic control system. The unmanned aerial vehicle can also actively acquire and update valid geofence information about its course and provide such information to the air traffic control system. When such information is actively provided, the unmanned aerial vehicle may not have to send a receipt confirmation back to the air traffic control system.
When the air traffic control system pushes information and when the unmanned aerial vehicle actively acquires information, the system can be used to ensure that the party issuing the information has not been counterfeited and that the information has not been manipulated. By pushing and retrieving information using a secure communication connection established during the authentication process, communication security may be ensured. For details on establishing the secure communication connection and on security control, refer to the preceding sections.
Data store with associated identifiers
Fig. 27 illustrates another example of device information that may be stored in a memory according to an embodiment of the present invention.
A memory storage system 2710 may be provided. Information may be provided from one or more users 2715a, 2715b, one or more user terminals 2720a, 2720b, one or more unmanned aerial vehicles 2730a, 2730b, and/or one or more geo-fencing devices 2750a, 2750 b. The information may include any type of data (e.g., one or more commands, environmental data), data sources (e.g., an identifier of a device from which the data was generated), and related device identifiers (e.g., an associated user identifier, an associated UAV identifier, an associated geofence identifier), and/or any other associated timing information or other associated information. One or more sets of information 2740 may be stored.
The memory storage system 2710 may include one or more memory storage units. The memory storage system may include one or more databases that may store the information described herein. The memory storage system may include a computer-readable medium. The memory storage system may have any of the characteristics of any of the other memory stores described herein (e.g., the memory storage system in fig. 11). The memory storage system may be provided at a single location or may be distributed over multiple locations. In some embodiments, a memory storage system may include a single memory storage unit or multiple memory storage units. A cloud computer infrastructure may be provided. In some cases, a peer-to-peer (P2P) memory storage system may be provided.
The memory storage system may be provided off-board the unmanned aerial vehicle. The memory storage system may be provided on a device external to the UAV. The memory storage system may be provided outside the remote control. The memory storage system may be provided on a device external to the remote control. The memory storage system may be provided separately from the UAV and the remote control. The memory storage system may be part of an authentication system. The memory storage system may be part of an air traffic control system. The memory storage system may include one or more memory units, which may be one or more memory units of an authentication system, such as an air traffic control system. Alternatively, the memory storage system may be separate from the authentication system. The memory storage system may be owned and/or operated by the same entity as the authentication system. Alternatively, the memory storage system may be owned and/or operated by a different entity than the authentication system.
The communication system may include one or more recorders. The one or more recorders may receive data from any device of the communication system. For example, one or more recorders may receive data from one or more unmanned aerial vehicles. One or more recorders may receive data from one or more users and/or remote controls. One or more memory storage units may be provided on one or more recorders. For example, one or more memory storage units may be provided on one or more recorders that receive one or more messages from the unmanned aerial vehicle, a user, and/or a remote control. The one or more recorders may or may not have a limited range to receive information. For example, the recorder may be configured to receive data from devices located within the same physical area as the recorder. For example, a first recorder may receive information from an unmanned aerial vehicle when the unmanned aerial vehicle is located in a first zone, and a second recorder may receive information from the unmanned aerial vehicle when the unmanned aerial vehicle is located in a second zone. Alternatively, the recorder does not have a limited range and can receive information from a device (e.g., an unmanned aerial vehicle, a remote control, a geo-fencing device) regardless of where the device is located. The recorder may be a memory storage unit and/or may communicate aggregated information to the memory storage unit.
Information from one or more data sources may be stored in memory. The data source may be any device or entity that may be the source of the recorded data. For example, for data 1, the data source may be a first unmanned aerial vehicle (e.g., unmanned aerial vehicle 1). For data 2, the data source may be a second unmanned aerial vehicle (e.g., unmanned aerial vehicle 2). The data source may be a user, a user terminal (e.g., remote control), an unmanned aerial vehicle, a geo-fencing device, a recorder, an external sensor, or any other type of device. The information may relate to a data source that may be stored in a memory.
For example, information about one or more users 2715a, 2715b may be stored in a memory storage system. The information may include user identification information. Examples of user identification information may include user identifiers (e.g., user ID1, user ID2, user ID3 … …). The user identifier may be unique to the user. In some cases, the information from the user may include information that facilitates identifying and/or authenticating the user. The information from one or more users may include information about the users. Information from one or more users may include data originating from the users. In one example, the data may include one or more commands from a user. The one or more commands may include commands to effect operation of the UAV. Any other type of information may be provided by one or more users and may be stored in a memory storage system.
In some embodiments, all user inputs may be stored as data in a memory storage system. Alternatively, only selected user inputs may be stored in the memory storage system. In some cases, only certain types of user input are stored in the memory storage system. For example, in some embodiments, only user identification input and/or command information is stored in the memory storage system.
The user may optionally provide information to the memory storage system by way of one or more user terminals 2720a, 2720 b. The user terminal may be a device capable of interacting with a user. The user terminal may be capable of interacting with the unmanned aerial vehicle. The user terminal may be a remote control configured to transmit one or more operation commands to the unmanned aerial vehicle. The user terminal may be a display device configured to show data based on information received from the unmanned aerial vehicle. The user terminal may be capable of both transmitting information to and receiving information from the unmanned aerial vehicle. In some embodiments, the user terminal may be a data source for data stored in a memory storage system. For example, the remote control 1 may be a source of data 4.
A user may provide information to the memory storage system by means of any other type of device. For example, one or more computers or other devices may be provided, which may be capable of receiving user input. The device may be capable of communicating user input to the memory storage device. The device does not need to interact with an unmanned aerial vehicle.
The user terminals 2720a, 2720b may provide data to a memory storage system. The user terminal may provide information related to the user, user commands, or any other type of information. The user terminal may provide information about the user terminal itself. For example, a user terminal identification may be provided. In some cases, a user identifier and/or a user terminal identifier may be provided. Optionally a user key and/or a user terminal key may be provided. In some examples, the user does not provide any input related to the user key, but the user key information may be stored on the user terminal or may be accessible by the user terminal. In some cases, the user key information may be stored on physical memory of the user terminal. Alternatively, the user key information may be stored outside the physical memory (e.g., on the cloud) and may be accessible by the user terminal. In some embodiments, the user terminal may transmit a user identifier and/or an associated command.
Unmanned aerial vehicles 2730a, 2730b may provide information to a memory storage system. An unmanned aerial vehicle may provide information about the unmanned aerial vehicle. For example, unmanned aircraft identification information may be provided. Examples of the unmanned aerial vehicle identification information may include an unmanned aerial vehicle identifier (e.g., unmanned aerial vehicle ID1, unmanned aerial vehicle ID2, unmanned aerial vehicle ID3 … …). The UAV identifier may be unique to the UAV. In some cases, the information from the unmanned aerial vehicle may include information that facilitates identification and/or authentication of the unmanned aerial vehicle. The information from one or more unmanned aerial vehicles may include information about the unmanned aerial vehicle. The information from one or more unmanned aerial vehicles may include any data (e.g., data 1, data 2, data 3 … …) received by the unmanned aerial vehicle. The data may include commands to effect operation of the unmanned aerial vehicle. Any other type of information may be provided by one or more unmanned aerial vehicles and may be stored in a memory storage system.
Geofence devices 2750a, 2750b can provide information to a memory storage system. A geo-fencing device can provide information related to the geo-fencing device. For example, geofence device identification information may be provided. Examples of the geo-fence device identification information may include a geo-fence device identifier (e.g., geo-fence device 1, geo-fence device 2...). The geo-fence device identifier can be unique to the geo-fence device. In some cases, the information from the geo-fence device may include information that facilitates identification and/or authentication of the geo-fence device. The information from the one or more geo-fence devices may include information about the geo-fence devices. The information from the one or more geo-fence devices may include any data (e.g., data 5, data 6...) received by the geo-fencing device. The data may include a location of the geo-fencing device, a detected condition or presence of a UAV, or information related to flight restrictions. Any other type of information can be provided by one or more geo-fencing devices and can be stored in a memory storage system.
Any of the device-related information described herein may be authenticated prior to storing the information in a memory storage system. For example, a user may be authenticated prior to storing information related to the user in a memory storage system. For example, the user may be authenticated prior to obtaining and/or storing the user identifier by the memory storage system. Thus, in some implementations, only authenticated user identifiers are stored in the memory storage system. Alternatively, the user need not be authenticated and the purported user identifier may be stored in the memory storage system prior to authentication. If authenticated, an indication may be made that the user identifier has been verified. If not, an indication may be made that the user identifier has been marked for suspicious activity, or that a failed authentication attempt was made using the user identifier.
Optionally, the UAV may be authenticated prior to storing information relating to the UAV in the memory storage system. For example, the UAV may be authenticated prior to obtaining and/or storing the UAV identifier by the memory storage system. Thus, in some implementations, only authenticated unmanned aerial vehicle identifiers are stored in the memory storage system. Alternatively, the UAV need not be authenticated and the purported UAV identifier may be stored in the memory storage system prior to authentication. If authenticated, an indication may be made that the UAV identifier has been verified. If not, an indication may be made that the UAV identifier has been marked as suspicious activity or that a failed authentication attempt was made using the UAV identifier.
Similarly, the geo-fence device can be authenticated prior to storing information related to the geo-fence device in the memory storage system. For example, the geo-fence device can be authenticated prior to the memory storage system obtaining and/or storing the geo-fence device identifier. Thus, in some implementations, only authenticated geo-fence device identifiers are stored in the memory storage system. Alternatively, the geo-fence device need not be authenticated and the purported geo-fence device identifier may be stored in the memory storage system prior to authentication. If authenticated, an indication can be made that the geo-fence device identifier has been verified. If not, an indication may be made that the geo-fence device identifier has been marked for suspicious activity, or that a failed authentication attempt was made using the geo-fence device identifier.
In addition to the data source, related device information for the corresponding data may be stored. For example, if a command is issued, the related devices may include the device that issues the command and/or the device that receives the command. The data source may be the device that issues the command. In another example, the data may be collected by sensors on the unmanned aerial vehicle. The data source may be the unmanned aerial vehicle. The unmanned aerial vehicle may communicate the sensed data to a plurality of devices, which may be included in the related device information. For example, data 3 may be sensed by UAV 1, and UAV 1 may send the data to a user (e.g., user ID3) and a geo-fencing device (e.g., geo-fencing device 2). The related device information may include a user, a user terminal (e.g., remote control), an unmanned aerial vehicle, a geo-fencing device, a recorder, an external sensor, or any other type of device.
The memory storage unit may store one or more sets of information 2740. The sets of information may include information from a user, a user terminal, an unmanned aerial vehicle, a geo-fencing device, a recorder, an external sensor, or any other type of device. The sets of information may include one or more sets of data, data sources, and information about the associated devices. In some cases, a single data item may be provided for a single set of information. Alternatively, multiple data items may be provided for a single set of information.
The memory storage system may store multiple sets of information relating to a particular interaction between two devices. For example, multiple commands may be issued during an interaction between two devices. An interaction may be the execution of a task. In some cases, the memory storage unit may store only information about a particular interaction. Alternatively, the memory storage system may store information about multiple interactions between two devices. The memory storage system may optionally store information based on an identifier of a particular device. Data associated with the same device (e.g., the same UAV) may be stored together. Alternatively, the memory storage unit may store information according to the device identifier. Data associated with a device or a particular combination of devices may be stored together.
Alternatively, the memory storage system may store multiple sets of information about interactions between multiple devices or groups of devices. The memory storage system may be a data repository that collects information from multiple users, user terminals, unmanned aerial vehicles, geo-fencing devices, or other devices. The memory storage system may store information from a plurality of tasks, which may include individual users, individual user terminals, individual unmanned aerial vehicles, individual geo-fencing devices, and/or various combinations thereof. In some cases, the set of information in the memory storage system may be searchable or indexable. The set of information may be discovered or indexed according to any parameter, such as user identity, user terminal identity, unmanned aerial vehicle identity, geo-fencing device identity, time, device combination, data type, location, or any other information. The information sets may be stored according to any parameter.
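The following Python sketch shows one way such searchable information sets could be structured, with each entry recording the data, its source, the related device identifiers, and a timestamp; the schema and query parameters are illustrative assumptions.

```python
# Hypothetical memory storage system: each information set records the
# data, its source, related device identifiers, and a timestamp, and can
# be queried by any of those parameters.
import time
from dataclasses import dataclass, field

@dataclass
class InformationSet:
    data: dict
    source: str            # e.g. "uav_1", "remote_control_1"
    related_devices: list  # e.g. ["user_id_3", "geofence_device_2"]
    timestamp: float = field(default_factory=time.time)

class MemoryStorageSystem:
    def __init__(self):
        self._sets = []

    def store(self, info_set):
        self._sets.append(info_set)

    def query(self, device_id=None, source=None):
        """Find information sets by related device identifier or by source."""
        return [s for s in self._sets
                if (device_id is None or device_id in s.related_devices)
                and (source is None or s.source == source)]

store = MemoryStorageSystem()
store.store(InformationSet({"command": "take_off"}, "remote_control_1", ["user_id_1", "uav_1"]))
store.store(InformationSet({"sensor": "altitude 80 m"}, "uav_1", ["user_id_3", "geofence_device_2"]))
print(len(store.query(device_id="uav_1")), len(store.query(source="uav_1")))
```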
In some cases, information in a memory storage system may be analyzed. The information sets may be analyzed to detect one or more behavioral patterns. The information sets may be analyzed to detect one or more characteristics that may be related to an accident or adverse condition. The information sets may be used to analyze air traffic flow. A statistical analysis may be performed on the sets of information in the memory storage unit. Such statistical analysis may help identify trends or related factors. For example, it may be noted that certain UAV models generally have a higher accident rate than other UAV models. Thus, the information in the memory storage system may be analyzed in aggregate to produce aggregated information about the operation of unmanned aerial vehicles. Such aggregate analysis need not be in response to a particular event or scenario.
The memory storage system may be updated in real time. For example, when data is sent or received, information about the data may be recorded in the memory storage system along with any other information from the set of information. This may occur in real time. The data and any related information in the set of information may be stored within less than 10 minutes, 5 minutes, 3 minutes, 2 minutes, 1 minute, 30 seconds, 15 seconds, 10 seconds, 5 seconds, 3 seconds, 1 second, 0.5 seconds, or 0.1 seconds of the data being sent or received.
In an alternative embodiment, the memory storage system may not need to be updated in real time. The memory storage system can be periodically updated at regular or irregular intervals. In some cases, an update schedule may be provided, which may include regular or irregular update times. The update schedule may be fixed or may be modifiable. The memory storage system may be updated in response to a detected event or condition.
The memory storage system may store the set of information for any period of time. In some cases, the information sets may be stored indefinitely until they are deleted. Deletion of the information set may or may not be allowed. In some cases, only the handler or administrator of the memory storage system may be allowed to interact with the data stored in the memory storage system. In some cases, only the operator of the authentication system (e.g., the air traffic system, the authentication center) may be allowed to interact with the data stored in the memory storage system.
Alternatively, the set of information may be deleted automatically after a period of time. The time period may be pre-established. For example, the set of information may be automatically deleted after more than a predetermined period of time. Examples of the predetermined period of time may include, but are not limited to, 20 years, 15 years, 12 years, 10 years, 7 years, 5 years, 4 years, 3 years, 2 years, 1 year, 9 months, 6 months, 3 months, 2 months, 1 month, 4 weeks, 3 weeks, 2 weeks, 1 week, 4 days, 3 days, 2 days, 1 day, 18 hours, 12 hours, 6 hours, 3 hours, 1 hour, 30 minutes, or 10 minutes. In some cases, the information set may be manually deleted only after a predetermined period of time has elapsed.
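Purely as an illustration, the following Python snippet sketches automatic deletion after a retention period (two years is used here as one of the example periods listed above); the storage format is a hypothetical simplification.

```python
# Hypothetical retention purge: drop information sets older than the
# retention period; assumes each stored set carries its record timestamp.
import time

RETENTION_SECONDS = 2 * 365 * 24 * 3600  # illustrative: 2 years

def purge_expired(information_sets, now=None):
    """Keep only information sets newer than the retention period."""
    now = time.time() if now is None else now
    return [s for s in information_sets if now - s["timestamp"] <= RETENTION_SECONDS]

sets = [
    {"data": "old flight log", "timestamp": time.time() - 3 * 365 * 24 * 3600},
    {"data": "recent command", "timestamp": time.time() - 60},
]
print([s["data"] for s in purge_expired(sets)])  # only the recent entry remains
```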
Identity-based geofence restrictions
One or more sets of flight restrictions within the environment can correspond to one or more geofence devices. Each set of flight restrictions can be associated with a geofencing device. In some embodiments, only a single set of flight restrictions is associated with the geofence device. For example, the same set of flight restrictions may apply regardless of the number or type of unmanned aerial vehicles that may interact with the geofencing device. Alternatively, multiple sets of flight restrictions can be associated with the geofence device. This may occur when multiple unmanned aerial vehicles interact with the geofencing device. Each of the multiple unmanned aerial vehicles may have its own set of flight restrictions. For example, a first UAV may have a first set of flight restrictions, and a second UAV may have a second set of flight restrictions. The limits provided by the first set of flight restrictions and the second set of flight restrictions may be different. In some cases, the difference may be due to an identity of the UAV (e.g., a difference between the first UAV identity and the second UAV identity). The difference may be due to the identity of the user (e.g., a difference between the identity of a first user operating the first UAV and the identity of a second user operating the second UAV). The difference may be due to any other factor, such as time, environmental conditions, or any other factor. A set of flight restrictions can be associated with a single geo-fencing device.
Fig. 28 illustrates a geo-fencing device 2810 that can provide different sets of flight restrictions in different scenarios. The different sets of flight restrictions may have the same boundaries or may have different boundaries 2820a, 2820b. Different unmanned aerial vehicles 2830a, 2830b, 2840a, 2840b may receive different sets of flight restrictions.
The geo-fencing device 2810 can be provided at a location within an environment. A set of flight controls may be provided to an unmanned aerial vehicle within an environment proximate the location. If multiple unmanned aerial vehicles are located within the environment proximate to the location, they may each receive a set of flight controls. Multiple sets of flight controls may be the same across multiple unmanned aerial vehicles. The sets of flight controls may be different between multiple unmanned aerial vehicles. In one example, the differences in the sets of flight controls may be based on the identity of the unmanned aerial vehicle. The differences in the sets of flight controls may be based on the type of UAV. For example, a first set of unmanned aerial vehicles 2830a, 2830b may have a first unmanned aerial vehicle type, while a second set of unmanned aerial vehicles 2840a, 2840b may have a second unmanned aerial vehicle type. The first type of unmanned aerial vehicle and the second type of unmanned aerial vehicle may be different. Any description herein of differences in sets of flight controls based on the type of unmanned aerial vehicle is provided by way of example only and may be applicable to any other type of factor that may result in different sets of flight controls. These factors may include user information (e.g., user identity, user type), environmental conditions, timing, other UAV information, or any other type of factor described elsewhere herein.
The first set of unmanned aerial vehicles may receive a first set of flight controls and the second set of unmanned aerial vehicles may receive a second set of flight controls. The first set of flight controls and the second set of flight controls may be different. For example, a first set of unmanned aerial vehicles 2830a, 2830b can receive a first set of flight restrictions, while a second set of unmanned aerial vehicles 2840a, 2840b can receive a second set of flight restrictions for geofence device 2810. The boundaries of the first set of flight controls and the second set of flight controls may be different. Thus, multiple sets of boundaries can be provided for the same geofencing device through multiple sets of flight controls. The first set of flight controls may have a first set of boundaries and the second set of flight controls may have a second set of boundaries. In some cases, a single set of flight controls may have multiple sets of boundaries. For example, even if a single UAV receives a set of flight restrictions, there may be multiple sets of boundaries, whether for multiple zones, different zones at different times, different zones for different detected conditions, or any other factor. A first set of boundaries 2820a may be provided in a first set of regulations for the first set of unmanned aerial vehicles, and a second set of boundaries 2820b may be provided in a second set of regulations for the second set of unmanned aerial vehicles. The boundaries may have different sizes or shapes. The boundaries may overlap.
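As a rough sketch of how different restriction sets (with different boundaries) might be keyed to different unmanned aerial vehicle types, consider the following Python fragment. The type names, radii, and restriction fields are hypothetical placeholders that simply mirror the two-type example of Fig. 28; this is one possible representation, not the disclosed implementation.

```python
# Hypothetical per-type restriction sets; boundaries are simplified here to circular
# radii around the geo-fencing device (standing in for boundaries 2820a/2820b).
RESTRICTIONS_BY_UAV_TYPE = {
    "type_1": {"boundary": "2820a", "boundary_radius_m": 500, "entry_allowed": False},
    "type_2": {"boundary": "2820b", "boundary_radius_m": 300, "entry_allowed": False},
}

def restriction_set_for(uav_type):
    """Return the flight-restriction set applicable to the given UAV type, if any.

    A type with no entry simply receives no restrictions from this device.
    """
    return RESTRICTIONS_BY_UAV_TYPE.get(uav_type)
```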
In the example provided, the boundary may limit the presence of the unmanned aerial vehicle. This is provided by way of example only, and any other type of limitation may apply to the boundary. The first set of flight restrictions may limit the presence of a first type of UAV 2830a, 2830b in the region with a first set of boundaries 2820a. Thus, the first type of unmanned aerial vehicle may not enter the first set of boundaries. The second type of unmanned aerial vehicle 2840a, 2840b may not receive the first set of flight controls and therefore may not be limited by limitations from the first set of flight controls. Thus, a second type of unmanned aerial vehicle may enter within the first set of boundaries (e.g., unmanned aerial vehicle 2840b has entered within first set of boundaries 2820a).
The second set of flight restrictions may limit the presence of a second type of UAVs 2840a, 2840b in the region with a second set of boundaries 2820b. Thus, the second type of unmanned aerial vehicle may not enter the second set of boundaries. The first type of unmanned aerial vehicle 2830a, 2830b may not receive the second set of flight controls and therefore may not be limited by limitations from the second set of flight controls. Thus, the first type of unmanned aerial vehicle may enter within the second set of boundaries (e.g., unmanned aerial vehicle 2830b has entered within second set of boundaries 2820b).
Thus, even a single geo-fencing device may have a high degree of flexibility. The geofencing device may be capable of providing reference points for different sets of flight regulations for different unmanned aerial vehicles that may be proximate to the geofencing device under different circumstances, which may provide a high degree of control over the type of activity in the vicinity of the geofencing device without requiring any modification or update to the geofencing device itself. In some embodiments, multiple sets of flight restrictions can be generated outside of the geofencing device, so any updates or particular rules can be handled outside of the geofencing device and may not require any changes to the geofencing device itself. In some cases, the geofencing device itself may generate multiple sets of flight restrictions on board, but may receive updates to parameters, algorithms, or data from the cloud for generating a set of flight restrictions. The geo-fencing device need not receive any manual input in performing its function. Alternatively, the user may choose to provide a personalized request or input for the geo-fencing device.
Geo-fencing devices change over time
As previously described, a set of flight controls may change over time. Unmanned aerial vehicles that encounter geofencing devices at different times may have different sets of flight restrictions. In some embodiments, the set of flight controls may be applicable to a particular encounter or a particular time. For example, if an unmanned aerial vehicle first approaches a geofence device, the unmanned aerial vehicle may receive a first set of flight restrictions. If the UAV flies elsewhere and then returns and encounters the geofencing device a second time, the UAV may receive a second set of flight restrictions. The first and second sets may be identical. Alternatively, they may be different. The set of flight controls may have changed based on time or other conditions, such as detected environmental conditions. Thus, different sets of flight restrictions may be delivered to the unmanned aerial vehicle depending on the conditions (e.g., time, environmental conditions). The set of flight restrictions need not itself include any conditions. In other embodiments, the set of flight controls may contain different conditions, including timing. For example, a set of flight restrictions provided for an unmanned aerial vehicle may indicate that a first set of boundaries and limits applies before 3:00 pm, a second set of boundaries and limits applies between 3:00 pm and 5:00 pm, and a third set of boundaries and limits applies after 5:00 pm. Thus, a set of flight restrictions may include different sets of boundaries and restrictions for the UAV based on different conditions (e.g., time, environmental conditions, etc.).
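The time-conditioned example above (different boundaries and limits before 3:00 pm, between 3:00 pm and 5:00 pm, and after 5:00 pm) could be represented roughly as follows. The boundary radii and altitude limits are invented for illustration; only the time windows come from the text.

```python
from datetime import time

# One restriction set whose boundaries and limits depend on the local time of day.
# The radii and altitude values are hypothetical.
TIME_CONDITIONED_RULES = [
    {"before": time(15, 0), "boundary_radius_m": 400, "max_altitude_m": 120},  # before 3:00 pm
    {"before": time(17, 0), "boundary_radius_m": 600, "max_altitude_m": 60},   # 3:00-5:00 pm
    {"before": time.max,    "boundary_radius_m": 300, "max_altitude_m": 100},  # after 5:00 pm
]

def rules_in_effect(local_time):
    """Return the boundary/limit entry that applies at the given local time."""
    for entry in TIME_CONDITIONED_RULES:
        if local_time < entry["before"]:
            return entry
    return TIME_CONDITIONED_RULES[-1]
```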
However a set of flight regulations is provided, unmanned aerial vehicles may be subject to different restrictions from the geofencing device depending on conditions such as time or environmental conditions.
FIG. 29 illustrates an example of a geo-fencing device with multiple sets of flight controls that can change over time. Regulation that changes over time is provided by way of example only; the description may be applicable to any other type of condition, such as environmental conditions. For example, if an example describes a change at a first time, a second time, a third time, etc., the example may apply to a first set of environmental conditions, a second set of environmental conditions, a third set of environmental conditions, or any other set of conditions (e.g., a first set of conditions, a second set of conditions, a third set of conditions).
The geo-fence device 2910 is illustrated at different times (time A, B, C, or D). Different conditions may be present at different times, or at different locations at different times. An unmanned aerial vehicle 2920 may be provided in proximity to the geo-fencing device. The unmanned aerial vehicles illustrated at different times may be the same unmanned aerial vehicle or may be different unmanned aerial vehicles. The geo-fence device can have a set of boundaries 2930a, 2930b, 2930c, 2930d.
The boundary may change over time. For example, a set of boundaries 2930a at a first time (e.g., time A) may be different from a set of boundaries 2930b at a second time (e.g., time B). The boundary may change in any manner. For example, the lateral portion of the boundary may change (e.g., boundary 2930a at time A versus boundary 2930c at time C) and/or the vertical portion of the boundary may change (e.g., boundary 2930a at time A versus boundary 2930b at time B). In some cases, the lateral and vertical portions of the boundary may change simultaneously (e.g., boundary 2930b at time B versus boundary 2930c at time C). The size and/or shape of the boundary may vary. In some cases, the boundary may remain unchanged at different times. For example, a set of boundaries 2930a at a first time (e.g., time A) may be the same as a set of boundaries 2930d at a second time (e.g., time D).
The type of restriction imposed with respect to the boundary may change over time while the boundary itself remains the same. For example, boundary 2930a at a first time (e.g., time A) may be the same as boundary 2930d at a second time (e.g., time D). However, a first set of limits may apply at the first time (e.g., at time A, unmanned aerial vehicle 2920 may not be allowed to enter within boundary 2930a), while a second set of limits may apply at the second time (e.g., at time D, unmanned aerial vehicle 2920 may be allowed to enter within boundary 2930d, although other limits may optionally be imposed, such as not allowing operation of the payload). Although the boundaries remain unchanged, the type of restriction may differ. Even if the boundaries remain the same and the type of restriction is the same, the level of restriction may differ (e.g., no wireless communication allowed at all versus wireless communication allowed only within a particular frequency range).
In some embodiments, the type of restriction imposed with respect to the boundary may remain unchanged while the boundary changes over time. For example, boundary 2930a at a first time (e.g., time A) may differ from boundary 2930b at a second time (e.g., time B). The first set of limits may apply at the first time and the second set of limits may apply at the second time. The first set of limits may be the same as the second set of limits. For example, at time A, unmanned aerial vehicle 2920 may not be allowed to enter within boundary 2930a. At time B, the unmanned aerial vehicle may likewise not be allowed to enter within boundary 2930b, but the boundary may have changed such that the unmanned aerial vehicle may enter an area that it was previously unable to enter and/or may no longer be able to enter an area that it was previously able to enter. The area that the unmanned aerial vehicle is able to enter may thus change as the boundaries change.
Alternatively, the first set of limits may be different from the second set of limits. For example, at time B, the unmanned aerial vehicle may not be allowed to enter within boundary 2930b. At time C, the unmanned aerial vehicle may be able to enter within boundary 2930c, but may not be able to issue any wireless communications while within the boundary. Thus, both the boundary and the set of limits may change. The set of restrictions may change such that the type of restriction changes. The set of restrictions may change such that the type of restriction remains the same but the level of restriction changes.
Unmanned aerial vehicles may encounter geofencing devices under various circumstances. The different situations may occur in sequence (e.g., the unmanned aerial vehicle encounters the geofencing device at multiple points in time or under multiple different conditions), or they may be alternatives (e.g., the unmanned aerial vehicle may theoretically first arrive at the geofencing device at different points in time or under different sets of conditions).
In some embodiments, changes to a set of flight controls can occur without requiring the geofencing device to have an indicator. For example, the air traffic control system may be aware of the geofencing device and the unmanned aerial vehicle location. The air traffic control system can detect when the unmanned aerial vehicle is within a predetermined range of the geofencing device. When the unmanned aerial vehicle approaches the geofencing device, the air traffic control system may be aware of a set of conditions (e.g., current time, environmental conditions, UAV identity or type, user identity or type, etc.). Based on these conditions, the air traffic control system may provide a set of flight controls to the UAV. The unmanned aerial vehicle need not detect the geofencing device, nor an indicator on the geofencing device, although the unmanned aerial vehicle may do so, as described elsewhere herein.
In another example, the geofence device may detect the presence of the unmanned aerial vehicle when the unmanned aerial vehicle is proximate to the geofence device. The geofence device may detect an unmanned aerial vehicle when the unmanned aerial vehicle comes within a predetermined range of the geofence device. When the UAV approaches the geofencing device, the geofencing device may be aware of a set of conditions (e.g., current time, environmental conditions, UAV identity or type, user identity or type, etc.). Based on the condition, the geofencing device may provide a set of flight controls for the UAV. The unmanned aerial vehicle need not detect the geo-fence device, nor the indicator of the geo-fence device, although the unmanned aerial vehicle can do so, as described elsewhere herein.
Additionally, the UAV may generate a set of flight controls on board the UAV. The unmanned aerial vehicle can detect the presence of the geofencing device. Alternatively, the unmanned aerial vehicle may be provided with information from an air traffic control system or a geofencing device regarding the distance of the unmanned aerial vehicle from the geofencing device. When the UAV approaches the geofencing device, the UAV may be aware of a set of conditions (e.g., current time, environmental conditions, UAV identity or type, user identity or type, etc.). Based on these conditions, the UAV may generate a set of flight controls on board. The unmanned aerial vehicle need not detect the geofencing device, nor an indicator on the geofencing device, although the unmanned aerial vehicle can do so, as described elsewhere herein.
In other embodiments, the geo-fence device 2910 may include an indicator 2940. The indicator may be any type of indicator as described elsewhere herein. Visual indicia are provided by way of example only; any other type of indicator may be used, such as a wireless signal, a thermal signal, an acoustic signal, or any other type of indicator. The unmanned aerial vehicle may be capable of detecting an indicator of the geo-fencing device. The unmanned aerial vehicle may be capable of detecting the indicator when the unmanned aerial vehicle approaches (e.g., comes within a predetermined range of) the geo-fencing device.
The indicator may change over time. The change in the indicator may reflect different conditions. The indicator may change periodically (e.g., at regular or irregular intervals), according to a preset schedule, or in response to a detected event or condition. The change of the indicator can be initiated by the geo-fencing device itself. The geo-fencing device can have a set of instructions specifying which indicators to provide under which circumstances. The instructions may be updated with information from an external device (e.g., a cloud, an air traffic control system, an unmanned aerial vehicle, another geo-fencing device, etc.). In some cases, changes to the indicator may incorporate data from one or more external devices. In one example, the air traffic control system can command the geo-fencing device to change the indicator. In another example, another external device, such as an unmanned aerial vehicle, another geo-fencing device, or a remote control of the geo-fencing device, may provide instructions to the geo-fencing device to change the indicator. The indicator may change one or more characteristics. The characteristic may be detectable by the unmanned aerial vehicle. For example, for visual indicia, the change in the characteristic may include a change in the visual appearance of the indicator. In another example, for an acoustic signal, the change in the characteristic may comprise a change in a detectable acoustic feature of the indicator. For wireless signals, the change in the characteristic may include a change in the information transmitted by the indicator.
For example, the indicator may be changed periodically. In one example, the indicator may change every hour. In another example, the indicator may change every day. The geo-fencing device may have an onboard clock that may allow the geo-fencing device to record the time.
The indicator may be changed according to a preset schedule. For example, the schedule may indicate that the indicator should change from the first indicator property to the second indicator property at 9:00 AM on Monday, then change from the second indicator property to the third property at 3:00 PM on Monday, then change from the third property back to the first property at 1:00 AM on Tuesday, then change from the first property to the second property at 10:00 AM on Tuesday, and so on. The schedule may be modifiable. In some cases, an operator of the air traffic control system may be able to change the schedule. In another example, the owner or operator of the geo-fencing device may be able to alter the schedule. The owner and/or operator of the geo-fencing device may be able to access or alter the schedule by manually interacting with the geo-fencing device or remotely from a separate device, which may result in updating the schedule of the geo-fencing device.
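The preset schedule described above might be represented on the geo-fencing device as a simple table of switch-over points. The sketch below is only one possible encoding; the weekday and time entries mirror the Monday/Tuesday example, while the property labels are placeholders.

```python
from datetime import datetime, time

WEEKDAYS = ["monday", "tuesday", "wednesday", "thursday", "friday", "saturday", "sunday"]

# (weekday, switch-over time, indicator property that takes effect at that moment)
SCHEDULE = [
    ("monday",  time(9, 0),  "property_2"),
    ("monday",  time(15, 0), "property_3"),
    ("tuesday", time(1, 0),  "property_1"),
    ("tuesday", time(10, 0), "property_2"),
]

def scheduled_property(now: datetime, default: str = "property_1") -> str:
    """Return the property set by the most recent switch-over at or before 'now'.

    Times earlier in the week than any scheduled entry fall back to 'default'.
    """
    minutes_now = now.weekday() * 1440 + now.hour * 60 + now.minute
    best, prop = -1, default
    for day, at, new_prop in SCHEDULE:
        minutes = WEEKDAYS.index(day) * 1440 + at.hour * 60 + at.minute
        if best < minutes <= minutes_now:
            best, prop = minutes, new_prop
    return prop
```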
In another example, the indicator may change in response to a detected event or condition. For example, if the air traffic flow around the geo-fencing device reaches a threshold density, the indicator may change. The indicator can change if the weather around the geo-fencing device changes (e.g., it begins to rain or the wind speed increases). Alternatively, the dynamic indicator may change in response to the detected presence of an unmanned aerial vehicle. For example, an unmanned aerial vehicle may be uniquely identified. The geo-fencing device may receive an unmanned aerial vehicle identifier that uniquely identifies the unmanned aerial vehicle from among other unmanned aerial vehicles. The indicator parameter or characteristic may be selected based on the unmanned aerial vehicle identifier. For example, a set of flight controls may depend on the identity of the UAV or the UAV type. In another example, the indicator parameter or characteristic may be selected based on a user identifier. For example, a set of flight controls may depend on the identity of the user or the type of user.
The change in the indicator may reflect a change in a set of flight controls. For example, the unmanned aerial vehicle may be able to detect a change in the indicator characteristic and know that a different set of restrictions is already in place. For example, when the indicator changes, the UAV may use a different set of flight controls. Alternatively, the unmanned aerial vehicle may use the same set of flight controls, but the same set of flight controls may apply for different restrictions or boundaries for different indicators.
For example, if the UAV detects that indicator 2940 exhibits a first set of characteristics (e.g., showing an "X"), the UAV may know that a first set of restrictions is in place (e.g., first set of boundaries 2930a, first set of limits). If the UAV detects that indicator 2940 exhibits a second set of characteristics (e.g., showing an "O"), the UAV may know that a second set of restrictions is in place (e.g., second set of boundaries 2930b, second set of limits). If the UAV detects that indicator 2940 exhibits a third set of characteristics (e.g., showing an "="), the UAV may know that a third set of restrictions is in place (e.g., third set of boundaries 2930c, third set of limits). If the UAV detects that indicator 2940 exhibits a fourth set of characteristics (e.g., showing a "+"), the UAV may know that a fourth set of restrictions is in place (e.g., fourth set of boundaries 2930d, fourth set of limits).
Based on local memory on the UAV, the UAV may be aware that different regulations correspond to different indicator characteristics. The local memory on the UAV may or may not be updated continuously, periodically, according to a schedule, or in response to a detected event or condition. In some cases, an external device, such as an air traffic control system, may be aware of the different regulations corresponding to different indicator characteristics. The unmanned aerial vehicle may transmit information regarding the detected indicator characteristic to the external device. The external device may generate the set of flight controls and then provide the set of flight controls to the UAV in response. Any other communication architecture may be used, such as those described elsewhere herein.
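One way such a local lookup might be organized on board the unmanned aerial vehicle is sketched below. The indicator symbols follow the Fig. 29 discussion; the contents of each restriction entry are placeholders and would in practice be whatever the stored (and periodically updated) restriction sets contain.

```python
# Local table on the UAV: detected indicator characteristic -> stored restriction set.
# Entries are placeholders keyed to the boundaries discussed for Fig. 29.
INDICATOR_TO_RESTRICTIONS = {
    "X": {"boundary": "2930a", "entry_allowed": False},
    "O": {"boundary": "2930b", "entry_allowed": False},
    "=": {"boundary": "2930c", "entry_allowed": True, "wireless_allowed": False},
    "+": {"boundary": "2930d", "entry_allowed": True, "payload_allowed": False},
}

def restrictions_for_indicator(detected_symbol):
    """Look up the restriction set associated with a detected indicator characteristic.

    Returning None could prompt the UAV to query an external device (e.g., the air
    traffic control system) with the detected characteristic instead.
    """
    return INDICATOR_TO_RESTRICTIONS.get(detected_symbol)
```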
Aspects of the invention may relate to a geo-fencing apparatus comprising: one or more memory storage units configured to store a plurality of indicator parameters; and a dynamic indicator that (1) changes from conforming to a first indicator parameter of the plurality of indicator parameters to conforming to a second indicator parameter over time, and (2) is configured to be detectable by the unmanned aerial vehicle (a) when the unmanned aerial vehicle is in flight and (b) when the unmanned aerial vehicle enters within a predetermined geographic range of the geofence device. A method of providing a set of flight controls to an unmanned aerial vehicle may be provided, the method comprising: storing a plurality of indicator parameters in one or more memory storage units of a geo-fencing device; and changing a dynamic indicator of the geofence device over time from being compliant with a first indicator parameter of the plurality of indicator parameters to being compliant with a second indicator parameter, wherein the dynamic indicator is configured to be detectable by the unmanned aerial vehicle (a) while the unmanned aerial vehicle is in flight and (b) when the unmanned aerial vehicle enters within a predetermined geographic range of the geofence device.
As previously described, the indicator may be a dynamic indicator. The dynamic indicator may have one or more parameters/characteristics. In some embodiments, the dynamic indicator may be a visual indicia that changes appearance over time. A first appearance of the visual indicia may be generated based on the first indicator parameter, and a second appearance of the visual indicia different from the first appearance of the visual indicia may be generated based on the second indicator parameter. In another example, the dynamic indicator may be a wireless signal that changes characteristic over time. A first characteristic of the wireless signal may be generated based on the first indicator parameter, and a second characteristic of the wireless signal, different from the first characteristic of the wireless signal, may be generated based on the second indicator parameter.
The dynamic indicator can uniquely identify and distinguish the geo-fence device from other geo-fence devices. For example, different geo-fencing devices may have different indicators. The different dynamic indicators may be distinct from each other. Thus, when the unmanned aerial vehicle detects a dynamic indicator, it may be aware not only of the set of flight restrictions in place under the conditions, but also of the identity of the geo-fencing device. In other embodiments, the indicator need not be unique to each geo-fencing device. In some cases, the same indicators may be provided for the same type of geo-fencing device. The indicator may be unique to the geo-fencing device type. When the unmanned aerial vehicle detects a dynamic indicator, the unmanned aerial vehicle may be aware not only of the set of flight controls in place under the conditions, but also of the type of geo-fencing device. In other cases, the indicator need not be unique to the device. Different geo-fencing devices of different types may show the same indicator. The indicator may reflect a set of flight controls corresponding to the indicator, and the unmanned aerial vehicle need not uniquely identify the geo-fence device or the geo-fence device type.
The dynamic indicator may indicate a first set of flight restrictions when the first indicator parameter is met and may indicate a second set of flight restrictions when the second indicator parameter is met. As previously described, the set of flight controls may be generated and/or stored at any device within the unmanned aerial vehicle system. Any combination of communications may occur to allow the unmanned aerial vehicle to operate in accordance with the set of flight controls. In some examples, the first and second sets of flight controls may be stored on-board the unmanned aerial vehicle, the first and second sets of flight controls may be stored on an air traffic system off-board the unmanned aerial vehicle, or the first and second sets of flight controls may be stored on-board a geo-fencing device.
Geofence overlap and priority
Fig. 30 illustrates a scenario in which an unmanned aerial vehicle may be provided within an overlapping region of multiple geofencing devices. Multiple geo-fencing devices 3010a, 3010b may be provided within an environment. The geofencing devices can have corresponding boundaries 3020a, 3020b. One or more unmanned aerial vehicles may be provided within the environment.
The unmanned aerial vehicle 3030b may be located outside the boundaries of both the first and second geo-fence devices 3010a and 3010b. Such an unmanned aerial vehicle may be outside any set of restrictions from the first geo-fence device or the second geo-fence device, and can operate freely within the environment without having a set of flight regulations imposed on it.
The unmanned aerial vehicle 3030a may be located within the boundaries of the first geo-fence device 3010a and outside the boundaries of the second geo-fence device 3010b. The unmanned aerial vehicle can be under a set of restrictions from the first geo-fence device and not under a set of restrictions from the second geo-fence device. The unmanned aerial vehicle can operate in compliance with a first set of flight regulations associated with the first geo-fence device.
The unmanned aerial vehicle 3030c may be located within the boundaries of the second geo-fence device 3010b and outside the boundaries of the first geo-fence device 3010a. The unmanned aerial vehicle can be under a set of restrictions from the second geo-fence device and not under a set of restrictions from the first geo-fence device. The unmanned aerial vehicle may operate in compliance with a second set of flight regulations associated with the second geo-fence device.
The unmanned aerial vehicle 3030d may be located within the boundaries of both the first and second geo-fence devices 3010a and 3010b. The unmanned aerial vehicle may be under a set of flight controls. Different possibilities may be provided for an unmanned aerial vehicle that falls within the range of multiple geofencing devices. Any of the possibilities described for overlapping zones may apply here.
For example, one or more regions provided by different geo-fencing devices may overlap. For example, a first zone within a first set of boundaries 3020a of a first geo-fence device may overlap with a second zone within a second set of boundaries 3020b of a second geo-fence device.
When multiple zones overlap, rules from multiple zones may remain in place. For example, a first set of flight controls associated with a first geo-fence device and a second set of flight controls associated with a second geo-fence device may remain in place in the overlap region. In some cases, rules from multiple zones may remain in place as long as they do not conflict with each other. For example, the first geo-fence device may disallow use of the unmanned aerial vehicle payload when the unmanned aerial vehicle is located within the first zone. The second geo-fence device may not allow use of the unmanned aerial vehicle communication when the unmanned aerial vehicle is located within the second zone. When UAV 3030d is located in both zones, the UAV may not be allowed to operate the UAV payload and may not be allowed to use the communication unit.
If there are conflicts between rules, various conflict-resolution approaches may be applied. For example, the most restrictive set of rules may be applied. For example, if a first zone requires that the unmanned aerial vehicle fly below an altitude of 400 feet and a second zone requires that the unmanned aerial vehicle fly below an altitude of 200 feet, then the rule for flying below an altitude of 200 feet may be applied when the unmanned aerial vehicle is located within the overlap zone. This may include mixing and matching rules to form a most restrictive set. For example, if a first zone requires that the unmanned aerial vehicle fly above 100 feet and below 400 feet, and a second zone requires that the unmanned aerial vehicle fly above 50 feet and below 200 feet, then when the unmanned aerial vehicle is located in the overlap zone, it may use the lower flight limit from the first zone and the upper flight limit from the second zone to fly between 100 feet and 200 feet.
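The mix-and-match example above amounts to intersecting the altitude bands of the overlapping zones. A minimal sketch, assuming each zone's altitude rule is expressed as a (minimum, maximum) pair in feet:

```python
def most_restrictive_altitude_band(bands):
    """Combine (min_ft, max_ft) altitude bands into the most restrictive band.

    Mirrors the example above: (100, 400) and (50, 200) -> (100, 200).
    Returns None if the bands are mutually exclusive (an outright conflict).
    """
    lower = max(low for low, _ in bands)
    upper = min(high for _, high in bands)
    return (lower, upper) if lower <= upper else None

assert most_restrictive_altitude_band([(100, 400), (50, 200)]) == (100, 200)
```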
In another case, a hierarchy may be provided for the zones. One or more priority levels may be provided within the hierarchy. A higher priority (e.g., higher level) geo-fence device may have prevailing rules compared to a lower priority (e.g., lower level) geo-fence device, regardless of whether the rules associated with the higher priority geo-fence device are more or less restrictive than the rules of the lower priority geo-fence device. The priority level of a geo-fencing device can be pre-selected or pre-entered. In some cases, a user providing a set of rules for the zone may indicate which geofencing devices are higher in priority than other geofencing devices. In some cases, a manufacturer of a geo-fence device may pre-select a level for the geo-fence device. The pre-selected priority may or may not be changeable. In other cases, the owner or operator of the geo-fence device may enter a priority level for the geo-fence device. The owner or operator of the geo-fence device may be able to alter the geo-fence device priority. In some cases, the priority level for the geofencing device may be determined by an external device, such as an air traffic control system, one or more unmanned aerial vehicles, or another geofencing device. In some cases, an operator of the air traffic control system may be able to view information about multiple geo-fence devices and enter or adjust a priority level of the geo-fence devices. In some embodiments, some priority levels may be mandated by jurisdiction. For example, certain jurisdictions may require that a geofence device of a government facility or emergency service have a higher priority level than a privately owned or operated geofence device.
In one implementation of a priority-driven set of rules for unmanned aerial vehicles in the overlap region, the first zone may require that the unmanned aerial vehicle fly below 400 feet and that the payload be shut down. The second zone may require the unmanned aerial vehicle to fly below 200 feet and have no payload limitations. If the first zone is associated with the higher priority geofence device, then the rules from the first zone may be applied without applying any rule from the second zone. For example, the unmanned aerial vehicle may fly below 400 feet with the payload shut down. If the second zone is associated with the higher priority geofence device, then the rules from the second zone may be applied without applying any rule from the first zone. For example, the unmanned aerial vehicle may fly below 200 feet without any payload restrictions.
In some cases, multiple sets of flight controls may be provided when multiple zones overlap. A primary set of flight controls that the unmanned aerial vehicle is to comply with may be provided. As previously described, the set of primary flight controls may include both the first set and the second set of flight controls when they do not conflict, may adopt the more restrictive of the first set and the second set of flight controls, may combine aspects of both sets in the most restrictive manner, or may adopt the set of flight controls associated with the higher priority geofence device.
One aspect of the invention may include a method of operating an unmanned aerial vehicle, the method comprising: determining a position of the UAV; identifying a plurality of geo-fence devices, wherein each geo-fence device indicates a set of flight restrictions for an unmanned aerial vehicle in an area covering a location of the unmanned aerial vehicle; prioritizing, with the aid of one or more processors, a set of primary flight restrictions to be followed by the UAV, the set of primary flight restrictions selected from a plurality of sets of flight restrictions of the plurality of geo-fence devices; and operating the UAV in accordance with the set of primary flight controls. Similarly, a non-transitory computer-readable medium containing program instructions for operating an unmanned aerial vehicle may be provided, the computer-readable medium comprising: program instructions for determining a position of the UAV; program instructions for identifying a plurality of geo-fence devices, wherein each geo-fence device indicates a set of flight restrictions for an unmanned aerial vehicle in an area covering a location of the unmanned aerial vehicle; and program instructions for prioritizing a set of primary flight restrictions to be followed by the UAV to allow operation of the UAV according to the set of primary flight restrictions, the set of primary flight restrictions selected from a plurality of sets of flight restrictions of the plurality of geo-fence devices.
Further, an unmanned aerial vehicle flight control prioritization system can include: one or more processors individually or collectively configured to: determining a position of the UAV; identifying a plurality of geo-fence devices, wherein each geo-fence device indicates a set of flight restrictions for an unmanned aerial vehicle in an area covering a location of the unmanned aerial vehicle; and prioritizing a set of primary flight restrictions to be followed by the UAV, the set of primary flight restrictions selected from a plurality of sets of flight restrictions of the plurality of geo-fencing devices to allow operation of the UAV according to the set of primary flight restrictions. The system may also include one or more communication modules, wherein the one or more processors are operably coupled to the one or more communication modules.
Fig. 31 illustrates an example of different regulations for different geo-fencing devices in accordance with an aspect of the present invention. Multiple geo-fence devices 3110a, 3110b, 3110c may have priority levels and/or one or more sets of flight restrictions. A set of flight controls may include one or more metrics. One or more regulatory values may be associated with the one or more metrics. Each metric may correspond to a type of regulation. For example, a first metric may apply to a lower height limit, a second metric may apply to a payload operation limit, a third metric may apply to a wireless communication limit, a fourth metric may apply to a battery capacity limit, a fifth metric may apply to a speed limit, a sixth metric may apply to a maximum item carrying weight, and so on.
In some embodiments, the geo-fencing devices may have different priority levels. Any number of priority levels may be provided. For example, one or more, two or more, three or more, four or more, five or more, six or more, seven or more, eight or more, nine or more, ten or more, fifteen or more, 20 or more, 25 or more, 30 or more, 40 or more, 50 or more, or 100 or more priority levels may be provided. The priority level may be qualitative or quantitative. For example, the priority levels may be divided into a low priority level, a medium priority level, and a high priority level. A geo-fence device with a high priority rating may be hierarchically higher than a geo-fence device with a medium priority rating. Any classification method may be provided for the priority levels. For example, priority level A, priority level B, priority level C, etc. may be provided. In some cases, the priority level may have a numerical value. For example, geo-fence device A 3110a may have a priority rating of 98, geo-fence device B 3110b may have a priority rating of 17, and geo-fence device C 3110c may have a priority rating of 54. In some cases, the priority level with the higher value may be higher in rank.
The plurality of geo-fence devices may have different priority levels, and a set of flight restrictions from the geo-fence device having the highest priority level may be selected as the set of primary flight restrictions. For example, if an unmanned aerial vehicle is located in an area of overlap of geofence devices A, B, and C, the unmanned aerial vehicle may have a set of primary flight restrictions that uses the restrictions associated with geofence device A, because geofence device A has the highest priority level. If the unmanned aerial vehicle is located in an area of overlap of geofence devices B and C only, then the unmanned aerial vehicle may have a set of flight restrictions that uses the restrictions associated with geofence device C, because geofence device C has a higher priority level than geofence device B. Thus, in that scenario, the set of primary flight regulations for the unmanned aerial vehicle may include regulation AVAL3 for metric A, regulation BVAL3 for metric B, regulation EVAL3 for metric E, and regulation FVAL3 for metric F. In some cases, different geo-fencing devices may have the same priority level. If the unmanned aerial vehicle falls within an overlap of geofence devices of equal priority, other techniques may be used to determine the set of primary flight restrictions for the unmanned aerial vehicle. For example, any of the other approaches described elsewhere herein may be used.
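A sketch of the priority-based selection described above is given below. The priority values (98, 17, 54) and the metric labels follow the Fig. 31 discussion; which metrics each device actually constrains is an assumption made only for this example.

```python
# Overlapping geo-fence devices with numeric priority levels, per the Fig. 31 example.
# The exact metrics each device constrains are assumed for illustration.
DEVICES = {
    "A": {"priority": 98, "rules": {"metric_A": "AVAL1", "metric_B": "BVAL1"}},
    "B": {"priority": 17, "rules": {"metric_A": "AVAL2", "metric_D": "DVAL2"}},
    "C": {"priority": 54, "rules": {"metric_A": "AVAL3", "metric_B": "BVAL3",
                                    "metric_E": "EVAL3", "metric_F": "FVAL3"}},
}

def primary_restrictions(overlapping_device_names):
    """Adopt the rules of the highest-priority device covering the UAV's location."""
    top = max(overlapping_device_names, key=lambda name: DEVICES[name]["priority"])
    return DEVICES[top]["rules"]

# Devices A, B and C all cover the UAV -> device A's rules prevail (priority 98).
assert primary_restrictions(["A", "B", "C"]) == DEVICES["A"]["rules"]
# Only devices B and C cover the UAV -> device C's rules prevail (54 > 17).
assert primary_restrictions(["B", "C"]) == DEVICES["C"]["rules"]
```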
In other cases, the set of primary flight controls may include the most stringent regulation selected from the plurality of sets of flight controls. For example, if the unmanned aerial vehicle is located in a region of overlap of geofence devices A, B, and C, then for each metric the most restrictive value may be selected from among the individual geofence devices. For example, for metric A, the most restrictive of AVAL1, AVAL2, or AVAL3 may be selected. If metric A is a lower height limit, and AVAL1 = 400 feet, AVAL2 = 250 feet, and AVAL3 = 300 feet, then AVAL1 may be selected because AVAL1 provides the most restrictive lower height limit. For metric B, the most restrictive of BVAL1 or BVAL3 may be selected. Since geo-fence device B does not impose any restriction on metric B, it is by default the least restrictive for that metric. If metric B is a payload operation limit, and BVAL1 allows powering the payload but not storing any collected data, while BVAL3 does not allow powering the payload at all, then BVAL3 may be selected because BVAL3 provides the more restrictive payload usage. Neither geo-fence device A nor geo-fence device C has any limitation with respect to metric D. Thus, for metric D, DVAL2 may be selected because it is the only, and therefore by default the most restrictive, value. For each metric, the most restrictive value may be selected from among the geo-fencing devices.
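The per-metric merge just described can be sketched as follows. The assumption here is that each metric carries a rule for which direction is "more restrictive" (e.g., for a lower height limit, a higher floor is more restrictive); devices that do not constrain a metric simply contribute nothing for it.

```python
# Direction of restrictiveness per metric (an assumption for the sketch): for a
# lower height limit a higher floor is more restrictive; for a speed limit, lower is.
PICK_MOST_RESTRICTIVE = {
    "lower_height_limit_ft": max,
    "speed_limit_mps": min,
}

def merge_most_restrictive(rule_sets):
    """Merge several flight-restriction sets metric by metric.

    A metric constrained by only one device keeps that device's value by default.
    """
    merged = {}
    for metric, pick in PICK_MOST_RESTRICTIVE.items():
        values = [rules[metric] for rules in rule_sets if metric in rules]
        if values:
            merged[metric] = pick(values)
    return merged

# Mirrors the example above: 400 ft, 250 ft and 300 ft -> 400 ft is the most
# restrictive lower height limit; a limit imposed by a single device is kept as-is.
merged = merge_most_restrictive([
    {"lower_height_limit_ft": 400},
    {"lower_height_limit_ft": 250, "speed_limit_mps": 15},
    {"lower_height_limit_ft": 300},
])
assert merged == {"lower_height_limit_ft": 400, "speed_limit_mps": 15}
```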
In other cases, a set of primary flight controls can include the controls from the geofence device having the most stringent flight controls overall. If, in general, geo-fence device C has the most stringent regulations compared to geo-fence devices A and B, then the set of primary flight regulations may include the regulations of geo-fence device C. Even though some of the individual metrics of devices A and B may be more stringent, geo-fence device C may be selected if it is more stringent as a whole.
The set of primary flight controls may include flight controls from a single set of flight controls. For example, a set of primary flight restrictions can include restrictions from geo-fence device A only, restrictions from geo-fence device B only, or restrictions from geo-fence device C only. Alternatively, a set of primary flight controls may include flight controls from a plurality of sets of flight controls. For example, a set of primary flight controls can include flight controls from two or more of geofence devices A, B, and C. Values from different geo-fencing devices may be selected for different metrics.
In some implementations, a set of primary flight controls may be prioritized based on an identity of the unmanned aerial vehicle (e.g., an unmanned aerial vehicle identifier or type of unmanned aerial vehicle). An unmanned aerial vehicle identifier may be received. The UAV identifier may uniquely identify the UAV from other UAVs. A set of primary flight controls may be based on a unique unmanned aerial vehicle identity. For example, the identity of the unmanned aerial vehicle can determine which set of geo-fencing devices to use for regulation, or which combination of geo-fencing devices to use for regulation. The identity of the UAV may determine which technique to use to determine the set of primary flight controls. The set of primary flight controls may be based on an unmanned aerial vehicle type. For example, the type of unmanned aerial vehicle can determine which set of geo-fencing devices to use for regulation, or which combination of geo-fencing devices to use for regulation. The type of UAV may determine which technique to use to determine the set of primary flight controls.
In some implementations, a set of primary flight controls can be prioritized based on user identity (e.g., user identifier or user type). A user identifier may be received. The user identifier may uniquely identify the user from among other users. A set of primary flight controls may be based on a unique user identity. For example, the user identity may determine which set of geo-fencing devices to use for regulation, or which combination of geo-fencing device regulations to use. The user identity may determine which technique to use to determine the set of primary flight controls. The set of primary flight controls may be based on a user type. For example, the user type may determine which set of geo-fencing devices to use for regulation, or which combination of geo-fencing device regulations to use. The user type may determine which technique to use to determine the set of primary flight controls.
The plurality of geo-fencing devices having overlapping regions may be stationary geo-fencing devices. Alternatively, they may include one or more mobile geofence devices. An overlapping zone may be created when a mobile geofencing device encounters a stationary geofencing device. When a stationary geofencing device has a boundary that can change over time, an overlapping region can be created or can be made to disappear.
Mobile geo-fencing
The geo-fencing device may be stationary or mobile. In some cases, the geo-fencing device may remain at the same location. In some cases, the geo-fencing device may remain at the same location unless moved by an individual. The substantially stationary geofence device may be placed in an environment and may not be self-propelled. The stationary geo-fencing device may be fixed to or supported by a stationary structure. The user can manually move the stationary geo-fencing device from the first location to the second location.
The geo-fencing device may be mobile. The geo-fencing device is movable from location to location. The geo-fencing device can be moved without the person moving the geo-fencing device. The mobile geofencing device can be self-propelled. The mobile geofencing devices can be affixed to or supported by a movable object such as a vehicle. The mobile geofencing device may have one or more propulsion units thereon that may allow the mobile geofencing device to move around in the environment. The mobile geofencing devices can be attached to or supported by a movable object, which can have one or more propulsion units that can allow the movable object to move around in an environment with the mobile geofencing devices.
Fig. 32 illustrates an example of a mobile geo-fencing device in accordance with an embodiment of the present invention. The mobile geofencing devices may be unmanned aerial vehicles 3210a, 3210 b. The mobile geofencing device may be an aircraft, a ground-based vehicle, a water-based vehicle, or a space-based vehicle, or any combination thereof. The unmanned aerial vehicle is provided by way of example only, and any description of the unmanned aerial vehicle herein may be applicable to any other vehicle or movable object.
Geo- fence devices 3210a, 3210b may have a location that may change over time. A distance d may be provided between the mobile geo-fencing devices. The mobile geofence devices can have corresponding boundaries 3220a, 3220 b. The boundary may remain unchanged or may change over time. The boundary may change in response to one or more detected conditions as described elsewhere herein.
The mobile geofencing device can issue a wireless communication. The wireless communication may be a message that may include information about the mobile geo-fencing device. Identification information, such as a geo-fencing device identifier or geo-fencing device type, may be transmitted. The message may include a message signature. The message may include geofence device key information. The message may include location information for the mobile geo-fencing device. For example, the message may include global coordinates for the geo-fencing device. The mobile geofencing device can have a GPS unit or other locator thereon that can provide the location of the mobile geofencing device. The message may include information about the flight plan or course of the geo-fencing device. The message may include time information, such as the time the message was sent. The time can be provided according to a clock on board the mobile geo-fencing device. The message may include flight control information. For example, the message may include information about a received flight command and/or the flight command being executed.
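As an illustration only, a broadcast message of the kind described might be assembled as follows. The field names and the JSON encoding are assumptions for this sketch; the disclosure does not fix any particular wire format.

```python
import json
import time

def build_geofence_message(device_id, device_type, latitude, longitude,
                           geofence_radius_m, flight_plan=None, key_info=None,
                           signature=None):
    """Assemble a broadcast message for a mobile geo-fencing device (illustrative only)."""
    message = {
        "device_id": device_id,                # geo-fencing device identifier
        "device_type": device_type,            # geo-fencing device type
        "position": {"lat": latitude, "lon": longitude},  # e.g. from an on-board GPS unit
        "geofence_radius_m": geofence_radius_m,           # boundary advertised to nearby UAVs
        "flight_plan": flight_plan,            # optional flight plan / course information
        "key_info": key_info,                  # optional geofence device key information
        "signature": signature,                # optional message signature
        "timestamp": time.time(),              # time of sending, from the on-board clock
    }
    return json.dumps(message)
```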
When the mobile geofencing devices are unmanned aerial vehicles, they can send messages to and receive messages from each other. For example, when a first mobile geo-fence device 3210a is proximate to a second mobile geo-fence device 3210b, each mobile geo-fence device may issue a message with any of the information described herein. The unmanned aerial vehicles may identify and/or detect each other based on the transmitted messages. Alternatively, other detection or identification techniques may be used, such as those described elsewhere herein.
Messages from the unmanned aerial vehicles may be sent continuously. For example, the message may be continuously broadcast. The unmanned aerial vehicles may issue messages regardless of whether they have detected each other. The message from the UAV may be sent out periodically (e.g., at regular or irregular intervals), according to a schedule, or in response to a detected event or condition. For example, a message may be sent when an unmanned aerial vehicle detects the presence of another unmanned aerial vehicle. In another example, the unmanned aerial vehicle may issue a message when commanded by the air traffic control system.
In some cases, the message for the unmanned aerial vehicle may include a geofence radius. The geofence radius may be related to the maneuverability or mission of the corresponding unmanned aerial vehicle. For example, if the unmanned aerial vehicle is more maneuverable, a smaller radius may be provided. Larger radii may be provided when the unmanned aerial vehicle is less maneuverable.
For example, the first unmanned aerial vehicle 3210a may broadcast a first geofence radius (e.g., RA). Similarly, the second UAV 3210b may broadcast a second geofence radius (e.g., RB). When the second UAV receives the radius from the first UAV, a distance d between the first UAV and the second UAV may be calculated. If the distance d is less than RA or less than RB, the second UAV may make a course correction or hover in place, while it may inform the first UAV that a collision potential exists. In that case, the first unmanned aerial vehicle may likewise make a course correction or hover in place.
Similarly, during flight, the second UAV may simultaneously broadcast the second geofence radius (e.g., RB). When the first UAV receives the radius from the second UAV, the distance d between the first UAV and the second UAV may be calculated. If the distance d is less than RA or less than RB, the first UAV may make a course correction or hover in place, while it may inform the second UAV that there is a potential for a collision. In that case, the second unmanned aerial vehicle may likewise make a course correction or hover in place.
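The symmetric check performed by each vehicle can be sketched in a few lines. Distances are computed here with a simple Euclidean metric over local coordinates in metres, which is an assumption; in practice a geodetic distance derived from GPS coordinates would likely be used.

```python
import math

def needs_evasive_action(pos_a, radius_a_m, pos_b, radius_b_m):
    """Return True if the separation is inside either broadcast geofence radius.

    pos_a and pos_b are (x, y, z) positions in a shared local frame, in metres.
    Mirrors the rule above: if d < RA or d < RB, one or both vehicles should make
    a course correction or hover in place and notify the other of the collision risk.
    """
    d = math.dist(pos_a, pos_b)
    return d < radius_a_m or d < radius_b_m

# Example: two UAVs 40 m apart with broadcast radii RA = 50 m and RB = 30 m.
assert needs_evasive_action((0, 0, 100), 50.0, (40, 0, 100), 30.0) is True
```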
When two unmanned aerial vehicles broadcast information, both unmanned aerial vehicles can detect the possibility of a collision and provide course correction. In some cases, one of the unmanned aerial vehicles may continue its course while the other unmanned aerial vehicle takes evasive action to avoid a possible collision. In other cases, both unmanned aerial vehicles may take some form of evasive action to avoid a possible collision.
In some embodiments, when there is a priority difference between the unmanned aerial vehicles, only one of the unmanned aerial vehicles may take evasive action and the other may not. For example, a higher priority unmanned aerial vehicle may not be required to take evasive action, while a lower priority unmanned aerial vehicle may be forced to take evasive action. In another example, a calculation may be made as to which unmanned aerial vehicle is better able to take evasive action. For example, if a first UAV is moving very quickly and a second UAV is moving very slowly, it may be easier to have the first UAV change course if the second UAV is not able to clear the route in time. Conversely, if the second UAV can clear the route in time, it may be directed to make the course correction, since it would take more energy for the first UAV to take evasive action.
Thus, the unmanned aerial vehicle can be used as a mobile geofencing device, which can help the unmanned aerial vehicle avoid collisions. For collision avoidance applications, the UAV may restrict entry of other UAVs within the boundaries of the UAV. For example, the first UAV may prevent other UAVs, such as the second UAV, from entering the boundary of the first UAV. Similarly, the second UAV may prevent the first UAV from entering a boundary of the second UAV. Flight response measures may occur that may help prevent collisions if one of the unmanned aerial vehicles enters the boundary of the other unmanned aerial vehicle.
Similar collision avoidance applications may also be provided between stationary and mobile geofencing devices. For example, if the first geo-fence device is a stationary geo-fence device mounted on a stationary object (such as a building on the ground), it may help prevent a second, mobile geo-fence device (e.g., an unmanned aerial vehicle) from hitting the stationary object on which the first geo-fence device is mounted. The geofencing device may provide a virtual "force field" that may prevent unauthorized unmanned aerial vehicles or other mobile geofencing devices from entering within the boundary.
In other embodiments, other types of restrictions may be provided within the boundaries of the geo-fencing device. For example, payload operation limits may be provided. In one example, two unmanned aerial vehicles may be traveling, each having its own corresponding camera that may be capturing images. The first UAV may have flight restrictions that do not allow other UAVs within the boundary to operate the camera. The second UAV may have flight restrictions that allow other UAVs within the boundary to operate the camera but not allow images to be recorded. Thus, when a second UAV enters the boundary of a first UAV, the second UAV may have to power down its camera. If it cannot power down its camera, it may be forced to take evasive action to avoid the boundary of the first UAV. When the first UAV enters the boundary of the second UAV, it may keep its camera on but must stop recording. Similarly, if it cannot stop recording, it may be forced to take evasive action. This type of restriction may be useful if it is desired that the activity of the geo-fencing device not be recorded or captured on the camera. For example, while an unmanned aerial vehicle is undergoing a mission, it may not be desirable for other unmanned aerial vehicles to capture images of the unmanned aerial vehicle. Any other type of limitation may be used, such as those described elsewhere herein.
Aspects of the invention may include a method of identifying a mobile geo-fencing device, the method comprising: receiving, at an unmanned aerial vehicle, a signal from a mobile geofencing device, the signal indicative of (1) a location of the mobile geofencing device and (2) one or more geofence boundaries of the mobile geofencing device; calculating a distance between the UAV and the mobile geofencing device; determining whether the UAV falls within the one or more geofence boundaries of the mobile geofence device based on the distance; and operating the unmanned aerial vehicle under a set of flight regulations provided based on the mobile geofence device when the unmanned aerial vehicle falls within the one or more geofence boundaries of the mobile geofence device.
The unmanned aerial vehicle may include: a communication unit configured to receive a signal from a mobile geo-fence device, the signal indicating (1) a location of the mobile geo-fence device and (2) one or more geo-fence boundaries of the mobile geo-fence device; and one or more processors operatively coupled to the communication unit and individually or collectively configured to: calculating a distance between the UAV and the mobile geofencing device; determining whether the UAV falls within the one or more geofence boundaries of the mobile geofence device based on the distance; and generating a signal to enable operation of the unmanned aerial vehicle under a set of flight regulations provided based on the mobile geofence device when the unmanned aerial vehicle falls within one or more geofence boundaries of the mobile geofence device.
The one or more geofence boundaries of the mobile geofence device may be a circular boundary having a first radius centered at the geofence device. The distance between the unmanned aerial vehicle and the mobile geofence device can be compared to the first radius. The unmanned aerial vehicle can also be a geofencing device having a second set of one or more geofence boundaries. The second set of one or more geofence boundaries of the unmanned aerial vehicle can be circular boundaries having a second radius centered on the unmanned aerial vehicle. The distance between the unmanned aerial vehicle and the mobile geofencing device can be compared to the second radius. The mobile geofencing device may be another unmanned aerial vehicle.
Fig. 33 illustrates an example of mobile geofencing devices in proximity to each other, in accordance with an embodiment of the present invention. The first mobile geofence device 3310a may be approaching the second mobile geofence device 3310 b. The second mobile geofencing device can be approaching the first mobile geofencing device. The geo-fencing devices may be in proximity to each other. The mobile geofencing device can have a corresponding set of boundaries 3320a, 3320 b. The mobile geofencing device can be moving along the corresponding trajectory 3330a, 3330 b.
In some implementations, the trajectories and/or boundaries of the mobile geo-fence devices can be analyzed to determine whether the mobile geo-fence devices are likely to cross into each other's boundaries. In some cases, a mobile geofencing device may be moving quickly, so it may be desirable to determine early whether the devices are on a collision course. The trajectories can be analyzed to predict future locations of the mobile geofencing devices. The boundaries can be analyzed to determine the clearance that the mobile geofence devices will need in order to avoid each other. For example, a larger boundary may require the mobile geo-fence devices to give each other a relatively wide berth. A smaller boundary may allow the mobile geo-fence devices to pass closer to each other.
In some cases, the trajectories may bring the mobile geofencing devices into direct proximity with each other. Alternatively, one or more trajectories may be adjusted or eliminated. The speed and/or acceleration of the mobile geo-fencing devices may also be considered in determining whether to take evasive action. A determination may be made as to whether any mobile geo-fence device needs to take evasive action, which mobile geo-fence device will need to take it, or whether both mobile geo-fence devices will need to take it. Similarly, the type of evasive action (e.g., whether to change course, slow down, accelerate, hover, or change any other geo-fence device operating parameter) may be determined.
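One conventional way to perform such a look-ahead, sketched here under a constant-velocity assumption that is not part of the disclosure, is to compute the time and distance of closest approach from the broadcast positions and velocities and compare the minimum separation with the combined geofence radii.

```python
import math

def closest_approach(p1, v1, p2, v2):
    """Time and distance of closest approach for two constant-velocity vehicles.

    p1, p2 are (x, y, z) positions in metres; v1, v2 are velocities in m/s.
    Returns (t_min, d_min), with t_min clamped to the future (t >= 0).
    """
    dp = [a - b for a, b in zip(p1, p2)]
    dv = [a - b for a, b in zip(v1, v2)]
    dv2 = sum(c * c for c in dv)
    t_min = 0.0 if dv2 == 0 else max(0.0, -sum(a * b for a, b in zip(dp, dv)) / dv2)
    d_min = math.dist([p + t_min * v for p, v in zip(p1, v1)],
                      [p + t_min * v for p, v in zip(p2, v2)])
    return t_min, d_min

# Evasive action might be scheduled if d_min falls below the sum of the two
# broadcast geofence radii within some look-ahead horizon.
```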
Fig. 34 illustrates another example of a mobile geo-fencing device in accordance with an embodiment of the present invention. Mobile geofencing device 3410 can be a movable object, or can be fixed to or supported by movable object 3420. The movable object may be a vehicle, such as a ground-based vehicle, a water-based vehicle, an air-based vehicle, or a space-based vehicle. The mobile geofencing device can have a boundary 3430. The boundary may move with the mobile geofencing device. Unmanned aerial vehicle 3440 may be in proximity to a geofencing device.
Geofencing device 3410 may have an associated set of flight regulations for unmanned aerial vehicle 3440. In one example, the flight regulations may require that the unmanned aerial vehicle remain within geofence boundary 3430 of the mobile geofence device. The unmanned aerial vehicle may be free to fly within the airspace defined by the geofence boundary. As the mobile geofencing device moves, the boundary may move with the mobile geofencing device. The unmanned aerial vehicle can then also move with the mobile geofencing device. Such a limitation may be useful in scenarios where the unmanned aerial vehicle may be expected to follow a movable object. For example, the movable object may be a ground-based vehicle, and the unmanned aerial vehicle may be expected to fly over the ground-based vehicle and capture images of the surrounding environment, which may be displayed on the ground-based vehicle. The unmanned aerial vehicle may be maintained in proximity to the ground-based vehicle without requiring a user to actively operate the unmanned aerial vehicle. The unmanned aerial vehicle may remain within the boundary that moves with the vehicle and may thus follow the vehicle.
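A minimal sketch of this "remain within the moving boundary" regulation, assuming a planar local frame and a circular boundary, is to clamp any commanded setpoint that falls outside the circle back onto it; because the circle is centered on the vehicle, the UAV effectively follows the vehicle. Function and parameter names are illustrative.

```python
import math

def clamp_to_moving_boundary(cmd_x, cmd_y, fence_x, fence_y, radius):
    """Clamp a commanded planar setpoint so the UAV stays inside a circular
    geofence boundary centered on the (possibly moving) geofencing device."""
    dx, dy = cmd_x - fence_x, cmd_y - fence_y
    d = math.hypot(dx, dy)
    if d <= radius:
        return cmd_x, cmd_y                  # setpoint already inside the boundary
    scale = radius / d                       # project the setpoint onto the boundary circle
    return fence_x + dx * scale, fence_y + dy * scale

# The vehicle (and its boundary) has moved to (100, 0); a setpoint 80 m beyond the
# 50 m boundary is pulled back onto the circle, so the UAV trails the vehicle.
print(clamp_to_moving_boundary(180, 0, 100, 0, 50))   # (150.0, 0.0)
```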
In another example, the restriction may keep the unmanned aerial vehicle outside of the boundary. The restriction may prevent the unmanned aerial vehicle from hitting or colliding with the ground-based vehicle. Even if the unmanned aerial vehicle is being manually operated by a user, the unmanned aerial vehicle may take flight response measures that prevent it from crashing into the vehicle if the user provides instructions that would cause such a crash. This may be useful for inexperienced unmanned aircraft users who might otherwise inadvertently cause a crash, or malicious unmanned aircraft users who might be trying to intentionally crash into a vehicle. The mobile geofencing device can thus protect objects in its vicinity from malicious users who would otherwise cause the unmanned aerial vehicle to collide with them.
An additional example may be for payload limitations. For example, when an unmanned aerial vehicle is located within the boundaries of a mobile geofencing device, the unmanned aerial vehicle may not be allowed to capture environmental images. The unmanned aerial vehicle may not be able to capture images of the mobile geofencing device and the movable objects even as the mobile geofencing device moves around within the environment. This may be useful if it is desired to prevent the unmanned aerial vehicle from collecting data (such as image data) about a moving geo-fencing device or a movable object.
In another example, the restriction may prevent wireless communication of the unmanned aerial vehicle when the unmanned aerial vehicle is located within the boundary. In some examples, the restrictions may allow wireless communication but may prevent communication that may interfere with the functionality of the movable object and/or the geo-fencing device. When the geofencing device is moving, the unmanned aerial vehicle may not be able to use wireless communications that may interfere with wireless communications of the geofencing device and/or the movable object. This may be useful to prevent random unmanned aerial vehicles from entering the vicinity of the movable object and interfering with the communication of the movable object and/or the mobile geofencing device.
Any other type of restriction as described elsewhere herein may be applicable to the mobile geo-fencing device.
User interface
Information about one or more geo-fencing devices may be shown on a display device. The device may be a user terminal viewable by a user. The user terminal may also act as a remote control that may send one or more operational commands to the unmanned aerial vehicle. The remote control may be configured to accept user input to effect operation of the UAV. Examples of unmanned aerial vehicle operations that may be controlled by a user may include flight, payload operation, payload positioning, carrier operation, sensor operation, wireless communication, navigation, power usage, item delivery, or any other operation of the unmanned aerial vehicle. For example, a user may control the flight of an unmanned aerial vehicle via a remote control. The user terminal may receive data from the unmanned aerial vehicle. The unmanned aerial vehicle may capture data using one or more sensors, such as a camera. The user terminal may be provided with images from a camera or may be provided with data from any other sensor. The user terminal can also act as a remote control that can send one or more commands to the geo-fencing device or alter the functionality of the geo-fencing device. The device may be a display device on the geofencing device itself. The geo-fencing device can show information about the geo-fencing device and any surrounding geo-fencing devices. The device may be a display device of an operator (e.g., manager) of the air traffic control system. The device may be a display device of a governmental entity user (e.g., a government worker, an employee of a government agency) and/or an emergency service user (e.g., a police officer, etc.). The device may be a display device that is viewable by any other individual involved in the unmanned aerial vehicle system.
The display device may include a screen or other type of display. The screen may be an LCD screen, a CRT screen, a plasma screen, an LED screen, a touch screen, and/or may use any other display technique known in the art or later developed.
Fig. 35 illustrates an example of a user interface showing information about one or more geo-fencing devices, according to an embodiment of the present invention. The display device 3510 can have a screen or other portion that can show a user interface 3520 that can display information about one or more geo-fence devices. In one example, a map of the geo-fence devices 3530a, 3530b, 3530c can be displayed. The user interface can show the location of the geo-fencing devices relative to each other. The user interface may show corresponding boundaries 3540a, 3540b, 3540c for the geo-fence devices. The position of unmanned aerial vehicle 3550 relative to the geofence devices can be displayed.
The display device may be a remote device that may be used to allow a user to view geofence device information. Information regarding the location of the geo-fence device can be aggregated from one or more geo-fence devices. The remote device may receive information directly from one or more geo-fencing devices. For example, a geo-fencing device can transmit a signal regarding the location of the geo-fencing device. Alternatively, information from one or more geo-fencing devices may be provided indirectly to the display device. In one example, the air traffic control system, other portions of the authentication system, or any other system may collect information about the geo-fencing device. For example, the air traffic control system can receive information about the location of the geo-fencing device and can transmit the information about the geo-fencing device to a remote device.
The geofencing device can transmit information about the geofence boundary to a remote device or to another device or system, such as the air traffic control system. The air traffic control system can receive information about the geofence boundary from the geofence device and can transmit the information to a remote device. In other embodiments, the geo-fencing device may only transmit location information, while the air traffic control system or other system may provide information about the boundary. For example, the geofencing device can transmit the geofencing device location to the air traffic control system, and the air traffic control system can determine the boundary for the geofencing device. The air traffic control system may then send information about the boundary to the remote device along with the location information. In some implementations, the air traffic control system or other system can generate the boundary based on the location of the geo-fencing device. The location of the boundary can be determined with respect to the location of the geo-fencing device. The air traffic control system may take into account other factors in determining the geofence boundary, such as information about the unmanned aerial vehicle and/or the user in the vicinity of the geofence device, environmental conditions, timing, and/or any other factors.
In some embodiments, the remote device may be a remote control of the unmanned aerial vehicle. An unmanned aerial vehicle being controlled by a remote device can be located in proximity to one or more geo-fencing devices. The air traffic control system or the geofencing device can detect that the unmanned aerial vehicle is in the vicinity of the geofencing device. The unmanned aerial vehicle can determine that it is located in the vicinity of the geofencing device. A geofence boundary may be generated based on information about the unmanned aerial vehicle or a user operating the unmanned aerial vehicle. The geofence boundary may be generated at the air traffic control system, a geofence device, an unmanned aerial vehicle, and/or a remote control. The remote device may display information about the boundaries of the geofencing device, which may or may not be specifically tailored for the UAV.
In one example, a border is shown, which may be customized for a remote device viewing the border. The boundary may be customized based on information about the UAV that may be in communication with the remote device (e.g., UAV identifier, UAV type, UAV activity). One or more operations of the UAV may be controlled by commands from a remote device. The unmanned aerial vehicle may transmit information regarding data collected by the unmanned aerial vehicle to a remote device. For example, images captured by an image capture device on the unmanned aerial vehicle may be sent down to the remote device.
When customizing the boundaries for a remote device, other remote devices may or may not see the same boundaries as the remote device. For example, a first remote device may communicate with a first UAV, which may be located in proximity to one or more geofencing devices. The first remote device can display information regarding the location and/or boundary of the geo-fencing device. The location of the first unmanned aerial vehicle can be displayed with respect to the location and/or boundary of the geo-fencing device. The second remote device may be in communication with a second UAV, which may be located in proximity to one or more geofencing devices. The second remote device can display information regarding the location and/or boundary of the geo-fencing device. The location of the second unmanned aerial vehicle can be displayed with respect to the location and/or boundary of the geo-fencing device. In some cases, the information regarding the location of the geo-fencing device may be consistent between the first remote device and the second remote device. For example, the same location may be provided for the same geo-fencing device displayed on both the first remote device and the second remote device. The information about the boundary for the geo-fenced device may or may not be consistent between the first remote device and the second remote device. For example, the boundary may appear the same on the first remote device and the second remote device. Alternatively, the boundary may appear different on the first remote device and the second remote device. In some cases, the boundaries of the geofencing device may change depending on the identity of the unmanned aerial vehicle. The boundaries of the geofencing device may vary depending on the type of unmanned aerial vehicle. Thus, the first remote device and the second remote device may display different boundaries for the same geo-fencing device. All of the geo-fencing devices, some of the geo-fencing devices, one of the geo-fencing devices, or none of the geo-fencing devices may show a different boundary between the first remote device and the second remote device.
The first remote device may show the location of the first UAV and the second remote device may show the location of the second UAV. The positions of the first unmanned aerial vehicle and the second unmanned aerial vehicle may be different from each other. The first remote device may or may not show the location of the second UAV. The second remote device may or may not show the location of the first UAV. The remote device may show the location of the UAV to which the remote device may correspond. The remote device may or may not show the location of other unmanned aerial vehicles.
One aspect of the invention relates to a method of displaying geofence information for an unmanned aerial vehicle, the method comprising: receiving geofence device data, the data comprising (1) a location of at least one geofence device and (2) one or more geofence boundaries of the at least one geofence device; providing a display configured to display information to a user; and showing on the display a map having (1) a location of the at least one geo-fence device and (2) one or more geo-fence boundaries of the at least one geo-fence device. The display device may include: a communication unit configured to receive geo-fence device data comprising (1) a location of at least one geo-fence device and (2) one or more geo-fence boundaries of the at least one geo-fence device; and a display configured to display information to a user, wherein the display shows a map having (1) a location of the at least one geo-fence device and (2) the one or more geo-fence boundaries of the at least one geo-fence device.
The position of the object shown on the remote display device may be updated in real time. For example, the position of the unmanned aerial vehicle shown on the remote device may be updated in real time. The location of the unmanned aerial vehicle can be shown with respect to at least one geo-fencing device. The position of the UAV may be updated continuously, periodically (e.g., at regular or irregular intervals), according to a schedule, or in response to a detected event or condition. In some cases, the position of the unmanned aerial vehicle on the screen may be updated within less than 15 minutes, 10 minutes, 5 minutes, 3 minutes, 2 minutes, 1 minute, 30 seconds, 15 seconds, 10 seconds, 5 seconds, 3 seconds, 2 seconds, 1 second, 0.5 seconds, 0.1 seconds, 0.05 seconds, or 0.01 seconds of the unmanned aerial vehicle movement.
In some embodiments, the geo-fencing device may be stationary. The location of the geo-fencing device need not be updated or can be updated continuously, periodically, according to a schedule, or in response to a detected event or condition. In some cases, a user may move a stationary geo-fencing device. For example, a user may pick up the geo-fencing device and move it to another location. The updated location may be displayed on the remote device.
Alternatively, the geo-fencing device may be mobile. The location of the geo-fencing device can vary. The location of one or more geo-fencing devices shown on the remote display can be updated in real-time. The location of one or more geo-fencing devices may be shown relative to each other and/or other features on the map. The location of the geo-fence device may be updated continuously, periodically (e.g., at regular or irregular intervals), according to a schedule, or in response to a detected event or condition. In some cases, the location of the geo-fence device on the screen may be updated within less than 15 minutes, 10 minutes, 5 minutes, 3 minutes, 2 minutes, 1 minute, 30 seconds, 15 seconds, 10 seconds, 5 seconds, 3 seconds, 2 seconds, 1 second, 0.5 seconds, 0.1 seconds, 0.05 seconds, or 0.01 seconds of the movement of the geo-fence device.
The boundaries of the geo-fencing device can be substantially static. There is no need to update the display of the substantially static geofence device boundaries. Alternatively, the boundary may be updated continuously, periodically, according to a schedule, or in response to a detected event or condition.
The boundaries of the geo-fencing device can optionally change over time. The location of the border shown on the remote display may be updated in real time. The location of one or more geofence device boundaries may be shown relative to each other and/or other features on the map. The geofence device boundaries may be updated continuously, periodically (e.g., at regular or irregular intervals), in response to a schedule, or in response to a detected event or condition. In some cases, the geofence device boundary shown on the screen may be updated within less than 15 minutes, 10 minutes, 5 minutes, 3 minutes, 2 minutes, 1 minute, 30 seconds, 15 seconds, 10 seconds, 5 seconds, 3 seconds, 2 seconds, 1 second, 0.5 seconds, 0.1 seconds, 0.05 seconds, or 0.01 seconds of the geofence device boundary change. The geofence device boundaries may vary according to any number of factors, such as those described elsewhere herein. For example, environmental conditions may cause a change in the boundary. Other factors such as unmanned aerial vehicle information, user information, or timing may cause the boundary to change.
The user interface may optionally show a visual indicator of the type of flight regulation imposed by the at least one geo-fencing device. Different kinds of flight restrictions can be imposed by the geofencing device. Examples of categories may include, but are not limited to, flight controls, payload controls, communication controls, power usage/management controls, controls regarding items carried, navigation controls, sensor controls, or any other controls. The visual indicators may visually distinguish between different types or kinds of flight controls. Examples of types of visual indicators may include, but are not limited to, words, numbers, symbols, icons, sizes, images, colors, patterns, highlights, or any other visual indicator that may help distinguish between different types of flight controls. For example, different colors may be provided for different types of flight controls imposed by the at least one geo-fencing device. For instance, a first geo-fence device or boundary may have a first color (e.g., red) that indicates a first type of flight control (e.g., altitude cap), while a second geo-fence device or boundary may have a second color (e.g., green) that indicates a second type of flight control (e.g., payload usage). In some cases, a single geo-fencing device may have multiple types of flight controls. The visual indicator may indicate the types that apply (e.g., red and green lines may be displayed on the border to indicate that both an altitude cap and a payload usage limit are in place). In some embodiments, the area within the boundary for which flight controls are applicable may be shaded or may have a color that indicates the type of flight control. In some cases, if one or more restrictions apply outside the boundary, the area outside the boundary may be shaded or colored. For example, an unmanned aerial vehicle may only be allowed to operate a payload within a set of boundaries of the geofencing device. The area outside the boundary may then be shaded to indicate that payload usage is restricted outside the boundary of the geofencing device. Alternatively, the zone within the boundary may be shaded, indicating that the regulation applies within the boundary and that the operation is only allowed within the boundary.
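A small sketch of how such visual indicators could be driven is a mapping from restriction categories to colors plus a shading hint. The categories, colors, and record fields below are assumptions for illustration, not a palette defined by the specification.

```python
# Illustrative mapping only; the categories and colors are assumptions.
RESTRICTION_COLORS = {
    "altitude_cap":     "red",
    "payload_usage":    "green",
    "communication":    "blue",
    "power_management": "orange",
    "no_fly":           "black",
}

def boundary_style(fence):
    """Return the colors and shading hint used to draw one geofence boundary.
    A device with several restriction types gets one color per type, and the
    shaded side indicates where the restrictions apply."""
    colors = [RESTRICTION_COLORS.get(r, "gray") for r in fence["restrictions"]]
    shade = "inside" if fence.get("applies_inside", True) else "outside"
    return {"colors": colors, "shade": shade}

fence = {"restrictions": ["altitude_cap", "payload_usage"], "applies_inside": True}
print(boundary_style(fence))   # {'colors': ['red', 'green'], 'shade': 'inside'}
```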
In some embodiments, the unmanned aerial vehicle may have a flight trajectory or direction. The trajectory or direction of the UAV may be shown on a user interface. For example, the arrow or vector may point in the direction in which the unmanned aerial vehicle is traveling. The visual indicator of the unmanned aerial vehicle direction or trajectory may or may not indicate unmanned aerial vehicle speed or other movement factors. For example, the indicator may be able to visually distinguish whether the unmanned aerial vehicle is traveling at a higher speed or a lower speed. In one example, a speed value may be displayed. In another example, a longer arrow or vector may correspond to a greater velocity than a shorter arrow or vector.
Information about the flight path of the UAV may be displayed on the user interface. For example, information about past flight paths of the unmanned aerial vehicle may be displayed. A dashed line or other indicator of a path may show where the UAV has traveled. The map may display other indicators of routes or paths that the unmanned aerial vehicle has traversed. In some cases, a future flight path may be displayed on the user interface. For instance, the unmanned aerial vehicle may have a predetermined or semi-predetermined flight plan. The flight plan may include a planned future flight path. The planned flight path may be displayed on the user interface, for example as an indicator of the planned course or path along which the unmanned aerial vehicle will travel. The future flight path may be altered or updated in real time. The future flight path may be updated and/or displayed continuously, periodically (e.g., at regular or irregular intervals), according to a schedule, or in response to a detected event or condition. In some cases, the future flight path shown on the screen may be updated within less than 15 minutes, 10 minutes, 5 minutes, 3 minutes, 2 minutes, 1 minute, 30 seconds, 15 seconds, 10 seconds, 5 seconds, 3 seconds, 2 seconds, 1 second, 0.5 seconds, 0.1 seconds, 0.05 seconds, or 0.01 seconds of a change to the future flight path.
The priority level for the geo-fence device can be displayed on the user interface. A visual indicator can allow for visual differentiation between different priority levels for the geo-fencing device. For example, the size or shape of an icon may indicate a priority level for the geo-fence device. The color of the icon may indicate a priority level for the geo-fencing device. A tag, such as a word or a numerical value, can be provided for the geo-fence device, which can indicate a priority level of the geo-fence device. In some cases, the priority level may not be displayed under normal circumstances. However, the information may be displayed when the user selects the geo-fencing device or hovers a mouse pointer over the geo-fencing device.
The remote display device may be configured to receive user input. In one example, the display device may have a touch screen that may record user input when a user touches or slides the screen. The device may have any other type of user interaction component, such as a button, mouse, joystick, trackball, touchpad, pen, inertial sensor, image capture device, motion capture device, or microphone.
The user input may affect the operation of the unmanned aerial vehicle or the geofencing device. Examples of unmanned aerial vehicle operation that may be affected by user input may include unmanned aerial vehicle power on or off, unmanned aerial vehicle flight path, unmanned aerial vehicle takeoff, unmanned aerial vehicle landing, unmanned aerial vehicle destination or waypoint, unmanned aerial vehicle flight mode (e.g., autonomous, semi-autonomous, or manual flight mode; or flight mode along a predetermined path, semi-predetermined path, or real-time path).
The user input can affect the operation of the geo-fencing device. The user input can affect a location of the geo-fencing device and/or a boundary of the geo-fencing device. The user input can affect a set of flight controls associated with the geo-fencing device. For example, the user input may affect one or more restrictions imposed by the geo-fencing device. The user input can affect the priority level of the geo-fencing device.
One aspect of the invention relates to a method of controlling a geo-fencing device, the method comprising: receiving data regarding at least one geo-fencing device; providing a display configured to show geo-fence device information to a user based on the received data about the at least one geo-fence device; receiving a user input affecting operation of the at least one geo-fencing device; and transmitting, by means of a transmitter, one or more signals affecting operation of the at least one geo-fencing device in accordance with the user input. A display device may include: a receiver configured to receive data regarding at least one geo-fencing device; a display configured to show geo-fence device information to a user based on the received data about the at least one geo-fence device; one or more processors individually or collectively configured for receiving user input affecting operation of the at least one geo-fencing device; and a transmitter configured to transmit one or more signals affecting operation of the at least one geo-fencing device in accordance with the user input.
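To make the flow of this aspect concrete, the following is a minimal sketch in which the receiver, display, and transmitter of the claimed display device are modeled as plain Python callables and print statements; all names, record fields, and the command format are assumptions, not elements defined by the specification.

```python
from dataclasses import dataclass, field
from typing import Callable, Dict, List

@dataclass
class GeofenceRecord:
    device_id: str
    location: tuple          # (lat, lon)
    boundary_radius_m: float
    priority: str = "medium"

@dataclass
class DisplayDevice:
    """Sketch of a display device: receive geofence data, show it, accept a user
    input, and transmit a signal affecting the selected geofencing device."""
    transmit: Callable[[str, Dict], None]
    known_fences: Dict[str, GeofenceRecord] = field(default_factory=dict)

    def receive(self, records: List[GeofenceRecord]) -> None:
        for r in records:
            self.known_fences[r.device_id] = r

    def render(self) -> None:
        for r in self.known_fences.values():
            print(f"{r.device_id}: at {r.location}, radius {r.boundary_radius_m} m, "
                  f"priority {r.priority}")

    def on_user_input(self, device_id: str, changes: Dict) -> None:
        # Forward the user's change (new boundary, priority, restriction, ...) to the device.
        self.transmit(device_id, changes)

dd = DisplayDevice(transmit=lambda dev, msg: print("send to", dev, "->", msg))
dd.receive([GeofenceRecord("fence-1", (22.54, 113.93), 200.0)])
dd.render()
dd.on_user_input("fence-1", {"boundary_radius_m": 300.0})
```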
Fig. 43 provides an example of an apparatus that can accept user input to control one or more geo-fencing devices in accordance with an embodiment of the present invention. The system may include one or more remote devices 4310. The system can also include one or more geo-fence devices 4320a, 4320b, 4320c. The geo-fence devices can optionally communicate with the air traffic control system 4330, another part of the authentication system, or any other system or device. Alternatively, the geo-fencing devices may communicate directly with a remote device. The remote device may include a user interface on display 4315. The user interface can show information about the geo-fencing devices. In one example, map 4340 may show location information about the geo-fencing devices. Alternatively, the information may be presented in a list format, a chart format, or any other format. In some embodiments, one or more additional zones 4360 may be provided. The zones may include tools or options regarding control of one or more geo-fencing devices. In some cases, the user may interact directly with the user interface 4350 to control the geo-fencing devices.
In some embodiments, one-way or two-way communication may be provided between the geo-fencing device and the air traffic control system or other system. For example, a geofencing device may provide location information or other information about the geofencing device to the air traffic control system. The air traffic control system can relay one or more instructions to the geo-fencing device (e.g., instructions regarding whether to change location, change boundaries, change priorities, change flight restrictions, etc.). The air traffic control system or other system may have one-way or two-way communication with the remote display device. Information about the geo-fence device (e.g., geo-fence device location, boundary) can be transmitted from the air traffic control system to a remote display device. In some embodiments, information regarding one or more user inputs to the display device may be provided to the air traffic control system. For example, a user can provide input to affect operation of a geo-fencing device, the input can be transmitted to the air traffic control system, which can in turn transmit instructions to the corresponding geo-fencing device. In other embodiments, direct communication may be provided between the geo-fencing device and the remote display device. The geo-fence device can directly provide information about the geo-fence device, wherein at least some of the information can be displayed on the remote display device. The remote display device can receive user input affecting operation of at least one geo-fence device, and instructions to affect operation of the at least one geo-fence device can be transmitted to the corresponding geo-fence device.
The user input can affect the operation of the geo-fencing device. The user input can affect the location of the geo-fencing device. In some cases, the geo-fencing device may be a mobile geo-fencing device. The user can provide input that can affect movement of the geo-fencing device. For example, the user input may indicate that the mobile geo-fencing device is to be moved to a new location. The user input can affect a geo-fencing device that is not currently moving. Alternatively, the user input may affect the geo-fencing device while the geo-fencing device is in motion. The user input can indicate a destination or waypoint for the geo-fence device. For example, the user can enter coordinates for a destination or waypoint of the geo-fence device. In another example, a user may click on a geo-fencing device on a map and drag it from a current location to a desired destination. In another example, the user can slide a finger on the map to pick up and move the geo-fencing device to a new desired location. The user input may indicate a path for the geo-fencing device. A user can trace a desired path for a geo-fencing device using the user's finger. The user can input one or more parameters for the geofence device path (e.g., specify that the geofence device should take the shortest path that can reach a desired destination, or specify whether there are any constraints on the path, such as altitude limits, no-fly zones, or land- or water-based infrastructure that the geofence device should follow). The user input can set one or more parameters for movement of the geo-fencing device. For example, the user input may indicate a translational velocity, an angular velocity, a translational acceleration, or an angular acceleration of the geo-fencing device. Any form of maximum or minimum velocity or acceleration may be provided.
The user input can affect the boundary of the geo-fencing device. One or more geofence boundaries of at least one geofence device may be affected by user input. The user input can affect the size and/or shape of the geofencing device boundary. In one example, a user may enter a set of coordinates and/or geometric parameters (e.g., a radius) for a desired boundary of a geo-fencing device. The user can trace out the desired geofencing device boundary using the user's finger 4350 or a pointer. The traced boundary may be drawn freehand, or one or more shape templates may be used. The user may select an existing boundary and drag and drop the boundary to change the size of the boundary. A user may drag and drop a portion of the boundary to extend the boundary or alter the shape of the boundary. The updated geofence boundary information can be shown in real time. For example, updated boundary information may be shown on the display based on the user input. In some cases, each geo-fencing device may have a default boundary. Alternatively, initially, the boundary may be undefined. The user may be able to change the default boundary or enter a new boundary where one is undefined.
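As an illustrative sketch of the drag-and-drop resizing just described, the new radius of a circular boundary can be taken as the distance from the device's map position to the drag release point. The coordinate frame and the guard limits are assumptions.

```python
import math

def resize_boundary_from_drag(center_xy, drag_release_xy,
                              min_radius_m=10.0, max_radius_m=5_000.0):
    """Return an updated circular-boundary radius from a drag-and-drop gesture.
    center_xy and drag_release_xy are planar map coordinates in meters; the
    limits are illustrative guards, not values from the specification."""
    new_radius = math.hypot(drag_release_xy[0] - center_xy[0],
                            drag_release_xy[1] - center_xy[1])
    return max(min_radius_m, min(new_radius, max_radius_m))

# User drags from the device's position out to a point 250 m away.
print(resize_boundary_from_drag((0, 0), (150, 200)))   # 250.0
```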
The user input can affect a set of flight controls associated with the geo-fencing device. For example, the user input may affect one or more restrictions imposed by the geo-fencing device. In some cases, the user may interact with the map 4340 to enter or alter flight restrictions. In other cases, a user may interact with one or more regions 4360 to enter or modify flight limits. The user may select the geofencing device for which the user wishes to enter or modify flight limits. In some cases, the user may select one or more flight restrictions from a plurality of available flight restrictions for the selected geo-fence device. The user may enter one or more values that may be specified for flight control. For example, the user may select such that the flight restrictions for the geo-fence device may include a lower altitude limit and a maximum speed. The user may then enter a value for the lower altitude limit and a value for the maximum speed. In other cases, the user may specify or generate flight restrictions without selecting from pre-existing options. In some cases, each geofence device may have a default associated flight limit. Alternatively, initially, a set of flight limits may be undefined. The user may be able to change a default set of flight limits or enter new flight limits for an undefined device. Thus, a user may be able to program one or more sets of flight restrictions for the geo-fencing device from a remote location. The user may be able to update a set of flight restrictions for the geo-fencing device from a remote location. The user may be able to program different sets of flight regulations for the geo-fence device for different conditions. For example, when an unmanned aerial vehicle encounters a geofencing device, a user may be able to specify that a first set of flight restrictions are to be provided to the first type of unmanned aerial vehicle and that a second set of flight restrictions are to be provided to the second type of unmanned aerial vehicle. The user may be able to program: the first set of flight restrictions are provided when the unmanned aerial vehicle encounters the geo-fencing device under a first set of environmental conditions, and the second set of flight restrictions are provided for the unmanned aerial vehicle that encounters the geo-fencing device under a second set of environmental conditions. The user can also program: the first set of flight restrictions is provided when the unmanned aerial vehicle encounters the geo-fence device at a first time, and the second set of flight restrictions is provided for the unmanned aerial vehicle that encounters the geo-fence device at a second time. The user may be able to program any type of condition or combination of conditions that may produce various flight controls.
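A minimal sketch of selecting among user-programmed restriction sets by condition (UAV type, environmental conditions, time of day) is shown below. The rule structure, field names, and default values are assumptions for illustration only.

```python
from datetime import time

def select_restriction_set(rules, uav_type, weather, now):
    """Return the first user-programmed restriction set whose conditions match.
    Each rule holds optional conditions; unspecified conditions match anything.
    Falls back to an assumed default set when nothing matches."""
    for rule in rules:
        if rule.get("uav_type") not in (None, uav_type):
            continue
        if rule.get("weather") not in (None, weather):
            continue
        start, end = rule.get("hours", (time(0, 0), time(23, 59)))
        if not (start <= now <= end):
            continue
        return rule["restrictions"]
    return {"max_altitude_m": 50, "camera_allowed": False}   # assumed default

rules = [
    {"uav_type": "consumer", "hours": (time(8, 0), time(20, 0)),
     "restrictions": {"max_altitude_m": 120, "camera_allowed": True}},
    {"weather": "high_wind",
     "restrictions": {"max_altitude_m": 30, "camera_allowed": False}},
]
print(select_restriction_set(rules, "consumer", "clear", time(12, 0)))
```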
The user input can affect the priority level of the geo-fencing device. For example, the user may specify that the geo-fencing device has a high priority, a medium priority, or a low priority. The user may specify a priority value for the device. Any other type of priority as described elsewhere herein may be defined by the user. In some cases, the geo-fencing device may have a default priority level. Alternatively, initially, the priority level of the geo-fence device may be undefined. The user may be able to change the default priority level or enter a new priority level where one is undefined. The user may be able to specify any available priority level for the geo-fencing device. Alternatively, the user may have limited flexibility or freedom to enter the priority level of the geo-fencing device. For example, certain priority levels may be reserved for official government or emergency services geofencing devices. Conventional private users may or may not be able to obtain the highest priority level for private geofence devices. In some cases, priority levels above a threshold may need to be approved by an air traffic control system operator/manager. For example, a private user may request a high priority level. The air traffic control system may approve or deny the request for a high priority level. In some cases, a governmental entity such as a governmental agency may approve or deny the request for high priority.
Thus, the user may advantageously provide input that can control the operation of one or more geo-fencing devices. The user may provide input via a remote device. Thus, the user need not be physically present at the location of the geofencing device to control the geofencing device. In some cases, the user may choose to be located near the geo-fencing device, or may choose to be remote from the geo-fencing device. The user controlling the operation of the geo-fence device may be an owner or operator of the geo-fence device. The individual operating the geofencing device may be separate from the individual controlling the unmanned aerial vehicle that may encounter the geofencing device, or may be the same user.
In other implementations, the user can interact directly with the geo-fencing device. The user can provide manual input to the geo-fencing device that can control operation of the geo-fencing device. The user input can also control the operation of other geo-fencing devices in the vicinity of the geo-fencing device. For example, a set of flight restrictions for a geofence device may be manually updated using a user interface mounted on the geofence device. Other operating characteristics of the geo-fence device, such as the boundaries of the geo-fence device or the priority level of the geo-fence device, can be manually updated using a user interface mounted on the geo-fence device.
Any of the functionality described elsewhere herein for controlling the operation of a geo-fence device (e.g., via a remote control) can also be applied to the user interface mounted on the geo-fence device. Any description herein of the data shown by the user interface may also apply to the user interface mounted on the geo-fencing device. In one example, the geo-fencing device can have screens and/or buttons with which a user can interact to control the operation of the geo-fencing device and/or view local data.
Geo-fencing device software applications
A user may have a device that performs various functions. The device may already be present in the user's possession. For example, the device may be a computer (e.g., personal computer, laptop, server), a mobile device (e.g., smartphone, cellular phone, tablet computer, personal digital assistant), or any other type of device. The device may be a network device capable of communicating over a network. The apparatus includes one or more memory storage units, which may include a non-transitory computer-readable medium that may store code, logic, or instructions for performing one or more of the steps described elsewhere herein. The apparatus may include one or more processors that may perform one or more steps individually or collectively according to code, logic, or instructions of a non-transitory computer readable medium as described herein. For example, a user may use the device to communicate (e.g., make a phone call, send or receive images, video or text, send or receive email). The device may have a browser that may allow a user to access the internet or browse a network.
When a device provides a set of reference points for a boundary associated with a set of flight regulations, the device may become a geo-fencing device. In some cases, a device may become a geo-fencing device when geo-fencing software or an application is running that provides the device's location as a reference point for a set of boundaries associated with a set of flight regulations. The device can have a locator that can provide the location of the geo-fencing device. For example, the location of a smartphone, laptop and/or tablet computer, or other mobile device may be determined. The device may be a relatively mobile device (e.g., a smartphone, a cellular phone, a tablet computer, a personal digital assistant, a laptop computer). Any description herein of a mobile device may be applicable to any other type of device.
The geofencing application can be downloaded to the mobile device. The mobile device may request the geofencing application from a system. In some cases, the system may be the air traffic control system, another component of the authentication system, or any other system. The system may provide the mobile application to the mobile device. The geofencing application may collect a location of the mobile device from a locator of the mobile device. For example, if the smartphone already has a locator, the geofencing application may use information from the smartphone locator to determine the location of the geofencing device. The application may provide the location of the mobile device to the air traffic control system (or any other system). The application may optionally provide the location of the geofencing device to the unmanned aerial vehicle. The mobile device may be converted into a geo-fence device (which may have any of the properties or characteristics of the geo-fence devices described elsewhere herein) through the geo-fence application. In some cases, the geofencing application may be launched or run to cause the mobile device to act as a geofencing device.
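A minimal sketch of such an application is a loop that reads the device's own locator and reports the position to a receiving system. The URL, payload shape, reporting interval, and the locator stub are all assumptions; a real application would query the platform's location service and the actual reporting endpoint.

```python
import json
import time
import urllib.request

def read_device_location():
    """Placeholder for the mobile device's own locator (e.g., its GPS fix)."""
    return {"lat": 22.5431, "lon": 113.9365}

def report_as_geofence_device(device_id, report_url, interval_s=10.0, cycles=3):
    """Periodically report this device's location so it can serve as a
    (mobile) geofencing device. URL and payload shape are assumptions."""
    for _ in range(cycles):
        payload = {"device_id": device_id, **read_device_location()}
        req = urllib.request.Request(
            report_url,
            data=json.dumps(payload).encode("utf-8"),
            headers={"Content-Type": "application/json"},
        )
        try:
            urllib.request.urlopen(req, timeout=5)
        except OSError:
            pass   # in a sketch, skip a failed report and try again next cycle
        time.sleep(interval_s)

# Hypothetical usage (endpoint is illustrative):
# report_as_geofence_device("phone-123", "https://atc.example.com/geofence/report")
```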
Alternatively, the mobile device may register in the geofence system when the geofence application is downloaded. For example, a mobile device may register as a geo-fenced device with an authentication system. The user may be able to specify a username and/or password or other information that may be used to later authenticate the geo-fence device or the user of the geo-fence device. A mobile device may have a unique identifier that may distinguish the mobile device from other devices. The unique identifier may be received via and/or generated by the mobile application. Any authentication process as described elsewhere herein may be provided.
In some implementations, the system can receive a location of the geo-fencing device. The system may be the air traffic control system or any other system. The system may be owned or operated by the same entity that provides the geofencing mobile application to the mobile device. Alternatively, the system may be owned or operated by an entity other than the entity that provides the geofencing application to the mobile device. Any description herein of the air traffic control system may also be applicable to any other entity described elsewhere herein. The location of the mobile device may be known and may be used to determine a boundary that may be associated with a set of flight regulations. The position of the mobile device may provide a reference for the set of flight regulations. The position of the boundary may use the position of the mobile device as a reference. For example, if the mobile device moves, the location of the boundary may be updated accordingly to move with the mobile device. The location of the mobile device can thus be used to provide a reference point for the mobile device acting as a geo-fencing device.
A set of flight regulations may be generated at the air traffic control system. The set of flight regulations can be generated based on a location of the mobile device provided via the geo-fencing mobile application. The set of flight regulations may be generated when the unmanned aerial vehicle comes within a predetermined range of the mobile device. In some embodiments, the air traffic control system may receive a location of the unmanned aerial vehicle. The position of the UAV may be compared to the position of the mobile device to determine whether the UAV has entered the predetermined range. The set of flight regulations may be generated in consideration of any factor or condition (e.g., unmanned aerial vehicle information, user information, environmental conditions, timing) as described elsewhere herein. The mobile device may provide information that may serve as such factors or conditions, or other external data sources may be used. For example, information collected using other mobile applications of the mobile device may be used to determine the set of flight regulations. For example, the mobile device may have a weather application that may be operational and collect information about the mobile device's local weather. Such information may be provided to determine local environmental conditions of the mobile device. In another example, the mobile device may have a local clock that can determine the time. Similarly, the mobile device may have access to a user's calendar. The schedule of the mobile device user may be considered when determining the set of flight regulations.
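One way to sketch this server-side step is to compare the UAV's reported position with the mobile device's reported position and, only when the UAV comes within the predetermined range, return a restriction set that also reflects a local condition such as the device's weather report. The thresholds and restriction fields are illustrative assumptions.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters (mean Earth radius)."""
    r = 6_371_000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def maybe_generate_restrictions(uav_pos, device_pos, trigger_range_m, local_weather):
    """Return a flight-restriction set only when the UAV has come within the
    predetermined range of the geofencing mobile device."""
    if haversine_m(*uav_pos, *device_pos) > trigger_range_m:
        return None
    restrictions = {"max_altitude_m": 100, "camera_allowed": False}
    if local_weather.get("wind_mps", 0) > 10:      # tighten under high wind
        restrictions["max_speed_mps"] = 5
    return restrictions

print(maybe_generate_restrictions((22.5431, 113.9365), (22.5440, 113.9370),
                                  trigger_range_m=500, local_weather={"wind_mps": 12}))
```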
A set of flight controls may then be transmitted to the UAV. A set of flight restrictions can be transmitted to a geo-fencing device, which can in turn transmit the set of flight restrictions to the UAV. The unmanned aerial vehicle may operate according to the set of flight controls.
In some embodiments, a set of flight regulations may be generated at the mobile device. The set of flight regulations may be generated using information from the mobile device (e.g., the location of the mobile device, information from other mobile applications of the mobile device). The mobile device may receive information when the UAV is within a predetermined range of the mobile device. For example, the mobile device may communicate with an unmanned aerial vehicle and receive a location of the unmanned aerial vehicle. The mobile device may compare the position of the UAV to the position of the mobile device to determine when the UAV is within the predetermined range of the mobile device. In other cases, the mobile device may receive the UAV location from the air traffic control system. The air traffic control system may track the location of each UAV and send information to the mobile device. The air traffic control system may also transmit other information about the unmanned aerial vehicle or the user of the unmanned aerial vehicle.
The mobile device may or may not send a set of flight regulations to the air traffic control system. In some cases, the mobile device may send a set of flight regulations directly to the unmanned aerial vehicle. The mobile device may communicate with the air traffic control system and/or the UAV via the mobile application. When the geo-fencing device is a mobile device with a mobile application as described, any combination of the communication types as described elsewhere herein for the geo-fencing device may also be applicable.
The mobile application may also show a user interface to the user of the device. The user may interact with the user interface. The user interface can show information about various geo-fencing devices and/or associated boundaries in the area as described elsewhere herein. The user interface may show the position of the unmanned aerial vehicle. The user interface may show a path or travel trajectory taken or to be taken by the unmanned aerial vehicle. The user interface can show information regarding flight controls associated with the geo-fencing device. The mobile application user interface may show any information as described elsewhere herein.
A user can interact with a user interface provided by a mobile application to control operation of the geo-fencing device. Any type of operational input as described elsewhere herein may be provided. For example, a user may provide one or more parameters for a set of flight regulations. The user may provide a boundary relative to the location of the mobile device for a set of flight restrictions. The user may specify the type of flight restrictions and/or values for various flight restriction metrics. The user can specify a priority for the geo-fenced mobile device. When generating a set of flight regulations, parameters set by the user may be considered. The set of flight regulations may be generated at the air traffic control system or at the mobile device, and may be generated based on parameters from the user. Thus, a mobile application that allows a mobile device to act as a geo-fencing device can also provide information about the geo-fencing device and/or other geo-fencing devices and/or allow a user to control the operation of the geo-fencing device.
Geo-fencing device network
As previously discussed, the geo-fencing devices may communicate with each other. In some implementations, the geo-fence devices can communicate with each other via direct communication or via indirect communication. Various types of communication may be provided between geo-fencing devices as described elsewhere herein.
In some embodiments, a geo-fencing device can have information stored thereon. The geofence device information may include information about the location of the geofence device, the boundary of the geofence device, the flight regulations associated with the geofence device, the priority level of the geofence device, and/or identity information of the geofence device (e.g., the geofence device type or the geofence device identifier). The geo-fencing device can collect data using one or more input elements. The input element may be a communication module, a sensor, or any other type of element that may be capable of collecting information. For example, the input element may sense an unmanned aerial vehicle that may be within a predetermined geographic range of the geofencing device. Information about the unmanned aerial vehicle can be determined through the input element. For example, the geo-fence device may be capable of determining a location of the unmanned aerial vehicle, a movement of the unmanned aerial vehicle, identity information of the unmanned aerial vehicle (e.g., unmanned aerial vehicle type or unmanned aerial vehicle identifier), a physical characteristic of the unmanned aerial vehicle, a power level of the unmanned aerial vehicle, or any other information about the unmanned aerial vehicle. In another example, the input element may collect environmental conditions (e.g., ambient climate, ambient complexity, traffic flow, or population density). For example, the input elements may gather information about local wind speed and direction and local air traffic flow.
Any information on the geo-fence device can be shared with other geo-fence devices. In some implementations, the geo-fence devices can share information about the geo-fence devices and/or any collected information. The geo-fencing device can share information with other geo-fencing devices located within a physical range of the geo-fencing device. Alternatively, the geo-fence device can share information with other geo-fence devices without regard to the physical range of the other geo-fence devices. In addition to sending information to other geo-fencing devices, the geo-fencing devices may also receive information from other geo-fencing devices. In some cases, a geo-fencing device can receive information from other geo-fencing devices within physical range of the geo-fencing device. Alternatively, the geo-fencing device can receive information from other geo-fencing devices without regard to the physical range of the other geo-fencing devices. The geo-fencing device can also share information received from other geo-fencing devices with still other geo-fencing devices. For example, a first geo-fence device may share information received by the first geo-fence device from a third geo-fence device with a second geo-fence device. Similarly, the first geo-fence device may share information received by the first geo-fence device from the second geo-fence device with the third geo-fence device. Thus, the information collected by each geo-fencing device can be utilized by other geo-fencing devices. The collective knowledge of multiple geofencing devices may be greater than the knowledge of a single geofencing device.
Thus, the geo-fencing devices may form a network in which they may share information with each other. A geo-fencing device can create and/or store a local map of the geo-fencing device. The local map can include information about the location of the geo-fencing device. The local map can include information about the locations of other geo-fencing devices within physical range of the geo-fencing device. The location of the other geo-fencing device can be received by the geo-fencing device from the other geo-fencing device. One or more input elements on a geo-fencing device can be used to sense a location of a geo-fencing device. The local map may include information about the location of one or more unmanned aerial vehicles within physical range of the geofencing device. Information about the unmanned aerial vehicle can be collected using one or more input elements of the geofencing device, or can be received from the unmanned aerial vehicle or other geofencing devices. The local map can include environmental conditions within a physical range of the geo-fencing device. In some cases, each geofence device in a network of geofence devices may have a local map. In some cases, one or more of the geo-fencing devices may have a local map. Information from the local maps of multiple geo-fencing devices may be shared or combined to form a larger or more complete map.
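A minimal sketch of combining such local maps, assuming each map is a dictionary of timestamped records keyed by object identifier (the record shape is an assumption for illustration), could keep the freshest observation of each object:

```python
def merge_local_maps(*local_maps):
    """Combine per-device local maps into one larger map. Each local map is a
    dict keyed by object id (geofence device or UAV) whose value includes a
    'timestamp'; the freshest observation of each object wins."""
    merged = {}
    for local_map in local_maps:
        for obj_id, record in local_map.items():
            if obj_id not in merged or record["timestamp"] > merged[obj_id]["timestamp"]:
                merged[obj_id] = record
    return merged

map_a = {"fence-A": {"pos": (0, 0), "timestamp": 100},
         "uav-1":   {"pos": (40, 10), "timestamp": 95}}
map_b = {"fence-B": {"pos": (300, 50), "timestamp": 102},
         "uav-1":   {"pos": (45, 12), "timestamp": 101}}   # fresher sighting of uav-1
print(merge_local_maps(map_a, map_b))
```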
The geo-fence devices can share information. The geo-fence devices may share information directly (e.g., in a peer-to-peer manner) with each other or by means of additional entities. The additional entities may function as information repositories. In some implementations, a memory storage system and/or the air traffic control system can serve as a repository for information that can be shared between different geo-fencing devices.
In some embodiments, the unmanned aerial vehicle can share information with the geo-fencing device, and vice versa. For example, unmanned aerial vehicles can collect environmental condition information that they can share with geo-fencing devices. For example, an unmanned aerial vehicle may detect precipitation and share this local information with geofencing devices. Similarly, the geofencing device may share information with the unmanned aerial vehicle. For example, the geo-fencing devices may collect environmental condition information that they may share with the unmanned aerial vehicle. The geofencing device may collect information about local air traffic flow that the geofencing device may share with the unmanned aerial vehicle.
Geofence examples
Some examples of how unmanned aerial vehicle systems (including authentication systems and/or geo-fencing devices) may be utilized are provided below. Such examples are some illustrations of how the system may be applied, and are not limiting.
Example 1: geo-fencing apparatus for privacy
As the number of unmanned aircraft in an airspace increases, private individuals may wish to retain control over their own residence and retain some privacy. If an unmanned aerial vehicle having a camera is flying over an individual's dwelling, the unmanned aerial vehicle may be able to capture images of the dwelling, which may include the user's private yard or rooftop. Unmanned aerial vehicles flying close to the dwelling may also cause noise pollution. In some embodiments, when a novice user is operating an unmanned aerial vehicle, the unmanned aerial vehicle risks colliding with an individual's residence or with a person who lives there, causing injury or damage.
It may be desirable for individuals to be able to block unmanned aerial vehicles from entering private spaces over which they exercise control. For example, an individual may wish to keep an unmanned aerial vehicle out of the airspace above their residence or property. Individuals may wish to have unmanned aerial vehicles blocked from residences that they own or that they are renting or subletting. In general, doing so may be challenging because others may be operating the unmanned aerial vehicle and may not even be aware of the individual's wishes, or may not have a sufficient skill level to retain control of the unmanned aerial vehicle and prevent it from drifting into the airspace above the private dwelling.
Individuals may be able to obtain a geo-fencing device that may prevent unmanned aerial vehicles from entering their residential space. In some cases, an individual may purchase a geo-fence device or receive a geo-fence device for free. An individual may place a geo-fencing device in an area where the individual wishes to block entry of an unmanned aerial vehicle.
Fig. 44 provides an illustration of how a geo-fencing device may be used with a private home to limit the use of an unmanned aerial vehicle. For example, person A 4410a may purchase geo-fence device 4420a. Person A may place the geo-fencing device at a location within person A's property. For example, the geo-fencing device may be secured to the residence of person A. The geofence device may provide a geofence boundary 4430a that the unmanned aerial vehicle may not enter. The boundary may be within the property of person A. The boundary may be at the property line of person A. The boundary may be outside the property of person A. The boundary may prevent the unmanned aerial vehicle from entering or flying over the property of person A. The boundary may prevent all privately owned unmanned aerial vehicles from entering the airspace of person A. Even if the operator of the unmanned aerial vehicle sends a command for the unmanned aerial vehicle to enter the airspace of person A, the unmanned aerial vehicle may not respond and may be prevented from entering the airspace of person A. The unmanned aerial vehicle flight path may be automatically altered to prevent the unmanned aerial vehicle from entering the airspace of person A. Thus, person A may be able to enjoy person A's residence without worrying about whether the unmanned aerial vehicle will enter person A's airspace.
Some individuals may not have a local geo-fencing device. For example, person B may not mind that the unmanned aerial vehicle enters the airspace of person B. Person B may not have a geo-fencing device. Person B may not have any boundaries that may prevent the unmanned aerial vehicle from passing. Thus, unmanned aerial vehicle 4440 may be found in the airspace of person B.
Person C may be concerned about privacy but may not mind having air traffic over person C's property. Person C can obtain geo-fence device 4420c. Person C may place the geo-fencing device at a location within the property of person C. For example, the geo-fencing device may be secured to the residence of person C. The geo-fence device can provide a geofence boundary 4430c. The geofencing device of person C may allow the unmanned aerial vehicle to fly within the boundary, but may prevent operation of cameras within the boundary. The boundary may be within the property of person C, may be at the property line of person C, or may be outside the property of person C. The boundary may prevent all privately owned unmanned aerial vehicles from taking pictures while located within the airspace of person C. Even if the operator of the unmanned aerial vehicle sends a command for the unmanned aerial vehicle to record information about the residence of person C using the onboard camera, the camera of the unmanned aerial vehicle may be automatically powered off or may not be allowed to store or stream any images. Thus, person C may be able to enjoy person C's home without worrying about whether an unmanned aerial vehicle will capture images of person C's property or images from within the airspace of person C.
The geo-fencing device can be programmable such that an individual owning or operating the geo-fencing device is able to alter the restrictions associated with the geo-fencing device. If person C later decides that person C no longer wishes to allow unmanned aerial vehicles to fly over person C's property, person C can update the geo-fence device so that it no longer allows unmanned aerial vehicles to fly over person C's property (similar to person A's device).
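One possible way to model such programmable restriction sets, given only as an assumed sketch, is a small object whose owner can reprogram its rules, distinguishing a no-entry rule (like person A's device) from a no-camera rule (like person C's device); the restriction names and methods are illustrative assumptions.

```python
# Hedged sketch of per-device restriction sets; restriction names and the
# update method are illustrative assumptions only.
class ProgrammableGeoFence:
    def __init__(self, restrictions=None):
        # e.g. {"no_entry"} for person A, {"no_camera"} for person C
        self.restrictions = set(restrictions or [])

    def update_restrictions(self, restrictions):
        """Owner reprograms the device, e.g. person C later adds a no-fly rule."""
        self.restrictions = set(restrictions)

    def allows_entry(self):
        return "no_entry" not in self.restrictions

    def allows_camera(self):
        return "no_camera" not in self.restrictions

fence_c = ProgrammableGeoFence({"no_camera"})           # flight allowed, imaging blocked
print(fence_c.allows_entry(), fence_c.allows_camera())   # True False
fence_c.update_restrictions({"no_camera", "no_entry"})   # later, like person A's device
print(fence_c.allows_entry())                             # False
```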
Any number of private individuals may be able to obtain a geo-fencing device to exercise control over their residence. By providing a geo-fencing device, a household can effectively choose to no longer have unmanned aerial vehicles enter its airspace or perform certain functions within its airspace. In one example, the geo-fencing device can be secured to a roof, wall, fence, floor, garage, or any other portion of an individual's residence. The geo-fencing device may be outside of the home or may be inside the home. The geofencing device may be detectable by the unmanned aerial vehicle, may be capable of detecting the unmanned aerial vehicle while the unmanned aerial vehicle is approaching the individual's airspace, or may have a location that may be communicated to an air management system. Control may be exercised over an unmanned aerial vehicle within the region to prevent the unmanned aerial vehicle from acting contrary to a set of flight restrictions associated with the geofencing device.
Example 2: geofencing device for containment
As more and more novice users of unmanned aerial vehicles attempt to maneuver unmanned aerial vehicles, the risk of unmanned aerial vehicle crashes or accidents may become higher. In some embodiments, a novice user of an unmanned aerial vehicle may be concerned that the unmanned aerial vehicle drifts out of control and crashes in an area where the user may not be able to recover it. For example, if the unmanned aerial vehicle is in an area near a body of water, a user may be concerned that the unmanned aerial vehicle drifts above the body of water and is damaged when it falls into the water. In another example, a user may be concerned about flying an unmanned aerial vehicle over the user's property and inadvertently flying it into a neighbor's yard or another area it should not enter, where it may crash.
It may be desirable for a user to be able to maneuver an unmanned aerial vehicle but maintain assurance that the unmanned aerial vehicle will remain within a particular region. For example, an individual may wish to practice manually maneuvering an unmanned aerial vehicle, but not allow the unmanned aerial vehicle to fly too far or out of sight. The user may wish to practice manually maneuvering the unmanned aerial vehicle while reducing the risk that the unmanned aerial vehicle will be damaged or located in an unrecoverable area.
The user may be able to obtain a geo-fencing device that may confine the unmanned aerial vehicle to a known area. In some cases, the user may purchase a geo-fence device or receive a geo-fence device for free. The user may place the geofencing device in an area where the user wishes to contain the unmanned aerial vehicle.
Fig. 45 provides an illustration of how a geo-fence device may be used to contain an unmanned aerial vehicle. Scenario A illustrates a situation where a user 4510a of unmanned aerial vehicle 4520a may be located in the user's home. Geo-fencing device 4530a may be provided in the user's home. The geo-fencing device may have an associated boundary 4540a. The unmanned aerial vehicle may be restricted such that it is only allowed to fly within the boundary. The user may be able to manually control the flight of the UAV within the boundary. When the UAV approaches the boundary, flight of the UAV may be taken over from the operator to prevent the UAV from leaving the area. The takeover may cause the UAV to hover until the user provides input directing the UAV away from the boundary, may cause the UAV to automatically steer back, may cause the UAV to land within the area, or may cause the UAV to return to its starting point. The boundary may prevent the unmanned aerial vehicle from entering the property of the user's neighbor 4150b. Thus, the user need not worry about the unmanned aerial vehicle accidentally entering the neighbor's yard, having to disturb the neighbor to retrieve the unmanned aerial vehicle, or potentially damaging objects in the neighbor's yard or injuring the neighbor.
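A minimal sketch of the takeover behavior described above might look like the following, assuming a circular containment boundary, an arbitrary 5 m margin, and illustrative policy names (hover, land, return); none of these values or names come from the patent itself.

```python
# Minimal containment sketch; the takeover policies and threshold values are assumptions.
import math

def containment_action(uav_xy, fence_center_xy, fence_radius_m,
                       policy="hover", margin_m=5.0):
    """Decide what the UAV does as it approaches the containment boundary."""
    dx = uav_xy[0] - fence_center_xy[0]
    dy = uav_xy[1] - fence_center_xy[1]
    d = math.hypot(dx, dy)
    if d < fence_radius_m - margin_m:
        return "manual_control"              # user keeps flying freely
    # near or past the boundary: take over from the operator
    if policy == "hover":
        return "hover_until_user_steers_inward"
    if policy == "land":
        return "land_within_area"
    if policy == "return":
        return "return_to_start_point"
    return "auto_steer_back_inside"

# UAV 48 m from a device with a 50 m boundary and a 5 m margin -> takeover
print(containment_action((48.0, 0.0), (0.0, 0.0), 50.0, policy="return"))
```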
Scenario B illustrates a situation where user 4510c may maneuver unmanned aerial vehicle 4520c in an outdoor environment. Geo-fencing device 4530c may be provided in the environment. An associated boundary 4540c may be provided around the geo-fencing device. In some cases, the geo-fencing device may be portable. For example, a user may pick up the geofencing device from the user's residence and take it to a local park where the user wishes to practice maneuvering an unmanned aerial vehicle. The geo-fencing device may be positioned such that one or more potential hazards or obstacles lie outside the boundary. For example, water 4550 or tree 4560 may be outside the boundary. In turn, the user may be able to practice maneuvering the unmanned aerial vehicle without fear that the unmanned aerial vehicle will strike the tree or fall into the water.
In some embodiments, the geo-fencing device may be picked up and carried by a user from location to location. In another example, the user may wear the geo-fencing device or carry the geo-fencing device in the user's pocket. Thus, the user may operate the unmanned aerial vehicle such that the unmanned aerial vehicle remains within a boundary that may surround the user carrying the geo-fencing device. The user can move around to maneuver the unmanned aerial vehicle. When the geo-fencing device is worn by or carried in a user's pocket, it can move with the user. Thus, the boundary of the unmanned aerial vehicle may travel with the user as the user walks around. The unmanned aerial vehicle may remain near the user, but may move with the user as the user walks around.
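Conceptually, a worn or pocket-carried geo-fencing device simply moves the center of the containment boundary along with the user; the small sketch below, with assumed class and field names, illustrates that idea.

```python
# Sketch of a boundary that travels with a user-carried device; names are assumptions.
class MobileGeoFence:
    def __init__(self, radius_m):
        self.center = (0.0, 0.0)
        self.radius_m = radius_m

    def update_center(self, device_xy):
        """Called whenever the carried device reports a new position."""
        self.center = device_xy

    def contains(self, uav_xy):
        dx = uav_xy[0] - self.center[0]
        dy = uav_xy[1] - self.center[1]
        return (dx * dx + dy * dy) ** 0.5 <= self.radius_m

fence = MobileGeoFence(radius_m=30.0)
fence.update_center((10.0, 0.0))     # user walks 10 m east; the boundary moves with them
print(fence.contains((35.0, 0.0)))   # True: UAV stays within 30 m of the user
```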
The systems, devices, and methods described herein may be applicable to a variety of objects, including movable objects and stationary objects. As previously mentioned, any description herein of an aircraft, such as an unmanned aerial vehicle, may be applicable to and used for any movable object. Any description herein of an aircraft may be specifically applicable to an unmanned aerial vehicle. The movable object of the present invention may be configured for movement within any suitable environment, such as in the air (e.g., a fixed-wing aircraft, a rotorcraft, or an aircraft having neither fixed wings nor rotors), in water (e.g., a ship or submarine), on the ground (e.g., an automobile such as a car, truck, bus, van, motorcycle, or bicycle; a movable structure or frame such as a pole or fishing rod; or a train), underground (e.g., a subway), in space (e.g., a space shuttle, satellite, or probe), or any combination of these environments. The movable object may be a vehicle, such as the vehicles described elsewhere herein. In some embodiments, the movable object may be carried by or take off from a living body, such as a human or animal. Suitable animals may include avians, canines, felines, equines, bovines, ovines, porcines, dolphins, rodents, or insects.
The movable object may be able to move freely within the environment with respect to six degrees of freedom (e.g., three translational degrees of freedom and three rotational degrees of freedom). Alternatively, the movement of the movable object may be constrained with respect to one or more degrees of freedom, such as by a predetermined path, trajectory, or orientation. The movement may be actuated by any suitable actuation mechanism, such as an engine or motor. The actuating mechanism of the movable object may be powered by any suitable energy source, such as electrical energy, magnetic energy, solar energy, wind energy, gravitational energy, chemical energy, nuclear energy, or any suitable combination thereof. The movable object may be self-propelled via a propulsion system as described elsewhere herein. The propulsion system may optionally be operated by means of an energy source such as electric energy, magnetic energy, solar energy, wind energy, gravitational energy, chemical energy, nuclear energy or any suitable combination thereof. Alternatively, the movable object may be carried by a living being.
In some cases, the movable object may be an aircraft. For example, the aircraft may be a fixed wing aircraft (e.g., airplane, glider), a rotorcraft (e.g., helicopter, rotorcraft), an aircraft having both fixed wings and rotors, or an aircraft having neither fixed wings nor rotors (e.g., airship, hot air balloon). The aircraft may be self-propelled, such as self-propelled in the air. Self-propelled aircraft may utilize a propulsion system, such as a propulsion system including one or more engines, motors, wheels, axles, magnets, rotors, propellers, blades, nozzles, or any suitable combination thereof. In some cases, a propulsion system may be used to enable a movable object to take off from, land on, maintain its current position and/or orientation (e.g., hover), change orientation, and/or change position.
The movable object may be remotely controlled by a user or locally controlled by an occupant in or on the movable object. The movable object may be remotely controlled via an occupant within a separate vehicle. In some embodiments, the movable object is an unmanned movable object, such as an unmanned aerial vehicle. An unmanned movable object, such as an unmanned aerial vehicle, may not have a passenger riding on the movable object. The movable object may be controlled by a human or an autonomous control system (e.g., a computer control system), or any suitable combination thereof. The movable object may be an autonomous or semi-autonomous robot, such as a robot configured with artificial intelligence.
The movable object may have any suitable size and/or dimensions. In some embodiments, the movable object may be sized and/or dimensioned to fit within or on a vehicle that accommodates a human occupant. Alternatively, the movable object may have a size and/or dimensions smaller than those capable of accommodating a human occupant within or on the vehicle. The movable object may have a size and/or dimensions suitable for being lifted or carried by a human. Alternatively, the movable object may be larger than a size and/or dimensions suitable for being lifted or carried by a human. In some cases, the movable object can have a maximum dimension (e.g., length, width, height, diameter, diagonal) that is less than or equal to about: 2cm, 5cm, 10cm, 50cm, 1m, 2m, 5m or 10m. The maximum dimension may be greater than or equal to about: 2cm, 5cm, 10cm, 50cm, 1m, 2m, 5m or 10m. For example, the distance between the axes of opposing rotors of the movable object may be less than or equal to about: 2cm, 5cm, 10cm, 50cm, 1m, 2m, 5m or 10m. Alternatively, the distance between the axes of opposing rotors may be greater than or equal to about: 2cm, 5cm, 10cm, 50cm, 1m, 2m, 5m or 10m.
In some embodiments, the movable object may have a volume of less than 100 cm × 100 cm × 100 cm, less than 50 cm × 50 cm × 30 cm, or less than 5 cm × 5 cm × 3 cm. The total volume of the movable object may be less than or equal to about: 1 cm³, 2 cm³, 5 cm³, 10 cm³, 20 cm³, 30 cm³, 40 cm³, 50 cm³, 60 cm³, 70 cm³, 80 cm³, 90 cm³, 100 cm³, 150 cm³, 200 cm³, 300 cm³, 500 cm³, 750 cm³, 1000 cm³, 5000 cm³, 10,000 cm³, 100,000 cm³, 1 m³, or 10 m³. Conversely, the total volume of the movable object may be greater than or equal to about: 1 cm³, 2 cm³, 5 cm³, 10 cm³, 20 cm³, 30 cm³, 40 cm³, 50 cm³, 60 cm³, 70 cm³, 80 cm³, 90 cm³, 100 cm³, 150 cm³, 200 cm³, 300 cm³, 500 cm³, 750 cm³, 1000 cm³, 5000 cm³, 10,000 cm³, 100,000 cm³, 1 m³, or 10 m³.
In some embodiments, the movable object may have a footprint (which may refer to the cross-sectional area enclosed by the movable object) of less than or equal to about: 32,000 cm², 20,000 cm², 10,000 cm², 1,000 cm², 500 cm², 100 cm², 50 cm², 10 cm², or 5 cm². Conversely, the footprint may be greater than or equal to about: 32,000 cm², 20,000 cm², 10,000 cm², 1,000 cm², 500 cm², 100 cm², 50 cm², 10 cm², or 5 cm².
In some cases, the movable object may weigh no more than 1000 kg. The weight of the movable object may be less than or equal to about: 1000kg, 750kg, 500kg, 200kg, 150kg, 100kg, 80kg, 70kg, 60kg, 50kg, 45kg, 40kg, 35kg, 30kg, 25kg, 20kg, 15kg, 12kg, 10kg, 9kg, 8kg, 7kg, 6kg, 5kg, 4kg, 3kg, 2kg, 1kg, 0.5kg, 0.1kg, 0.05kg or 0.01 kg. Conversely, the weight may be greater than or equal to about: 1000kg, 750kg, 500kg, 200kg, 150kg, 100kg, 80kg, 70kg, 60kg, 50kg, 45kg, 40kg, 35kg, 30kg, 25kg, 20kg, 15kg, 12kg, 10kg, 9kg, 8kg, 7kg, 6kg, 5kg, 4kg, 3kg, 2kg, 1kg, 0.5kg, 0.1kg, 0.05kg or 0.01 kg.
In some embodiments, the load carried by the movable object may be small relative to the movable object. As further detailed elsewhere herein, the load may include a payload and/or a carrier. In some examples, the ratio of the weight of the movable object to the weight of the load may be greater than, less than, or equal to about 1:1. Alternatively, the ratio of carrier weight to load weight can be greater than, less than, or equal to about 1:1. When desired, the ratio of the weight of the movable object to the weight of the load may be less than or equal to: 1:2, 1:3, 1:4, 1:5, 1:10, or even less. Conversely, the ratio of the weight of the movable object to the weight of the load may also be greater than or equal to: 2:1, 3:1, 4:1, 5:1, 10:1, or even greater.
In some embodiments, the movable object may have low energy consumption. For example, the movable object may use less than about: 5W/h, 4W/h, 3W/h, 2W/h, 1W/h or less. In some cases, the carrier of the movable object may have a low energy consumption. For example, the carrier may use less than about: 5W/h, 4W/h, 3W/h, 2W/h, 1W/h or less. Alternatively, the payload of the movable object may have a low energy consumption, such as less than about: 5W/h, 4W/h, 3W/h, 2W/h, 1W/h or less.
Fig. 36 illustrates an unmanned aerial vehicle (UAV) 3600 in accordance with an embodiment of the present invention. The unmanned aerial vehicle may be an example of a movable object as described herein, and the methods and apparatus for discharging a battery assembly may be adapted thereto. Unmanned aerial vehicle 3600 may include a propulsion system having four rotors 3602, 3604, 3606, and 3608. Any number of rotors (e.g., one, two, three, four, five, six, or more) may be provided. A rotor, rotor assembly, or other propulsion system of the unmanned aerial vehicle may enable the unmanned aerial vehicle to hover/hold position, change orientation, and/or change position. The distance between the axes of opposing rotors may be any suitable length 3610. For example, length 3610 may be less than or equal to 2 m, or less than or equal to 5 m. In some embodiments, length 3610 may be in a range from 40 cm to 1 m, from 10 cm to 2 m, or from 5 cm to 5 m. Any description herein of an unmanned aerial vehicle may be applicable to movable objects, such as different types of movable objects, and vice versa. The unmanned aerial vehicle may use an assisted takeoff system or method as described herein.
In some embodiments, the movable object may be configured to carry a load. The load may include one or more of passengers, goods, equipment, instruments, and the like. The load may be provided within the housing. The housing may be separate from or part of the housing of the movable object. Alternatively, the load may be provided with a housing and the movable object may not have a housing. Alternatively, some portion of the load or the entire load may be provided without a housing. The load may be rigidly fixed relative to the movable object. Alternatively, the load may be movable relative to the movable object (e.g., may translate or rotate relative to the movable object). As described elsewhere herein, the payload may include a payload and/or a carrier.
In some embodiments, movement of the movable object, carrier, and payload relative to a fixed reference frame (e.g., the surrounding environment) and/or relative to each other may be controlled by the terminal. The terminal may be a remote control at a location remote from the movable object, carrier and/or payload. The terminal may be mounted on or secured to the support platform. Alternatively, the terminal may be a handheld or wearable device. For example, the terminal may include a smartphone, tablet, laptop, computer, glasses, gloves, helmet, microphone, or a suitable combination thereof. The terminal may comprise a user interface such as a keyboard, mouse, joystick, touch screen or display. Any suitable user input may be used to interact with the terminal, such as manual input commands, voice control, gesture control, or position control (e.g., via movement, position, or tilt of the terminal).
The terminal may be used to control any suitable state of the movable object, carrier, and/or payload. For example, the terminal may be used to control the position and/or orientation of the movable object, carrier, and/or payload relative to a fixed reference frame and/or relative to one another. In some embodiments, the terminal may be used to control individual elements of the movable object, carrier, and/or payload, such as an actuation assembly of the carrier, a sensor of the payload, or an emitter of the payload. The terminal may comprise wireless communication means adapted to communicate with one or more of the movable object, the carrier, or the payload.
The terminal may comprise a suitable display unit for viewing information of the movable object, the carrier and/or the payload. For example, the terminal may be configured to display information of the movable object, carrier, and/or payload regarding position, translational velocity, translational acceleration, orientation, angular velocity, angular acceleration, or any suitable combination thereof. In some implementations, the terminal can display information provided by the payload, such as data provided by the functional payload (e.g., images recorded by a camera or other image capture device).
Alternatively, the same terminal may simultaneously control the movable object, carrier and/or payload or the state of the movable object, carrier and/or payload and receive and/or display information from the movable object, carrier and/or payload. For example, the terminal may control the positioning of the payload relative to the environment while displaying image data captured by the payload, or information about the location of the payload. Alternatively, different terminals may be used for different functions. For example, a first terminal may control movement or state of a movable object, carrier, and/or payload, while a second terminal may receive and/or display information from the movable object, carrier, and/or payload. For example, a first terminal may be used to control the positioning of the payload relative to the environment while a second terminal displays image data captured by the payload. Various modes of communication may be utilized between a movable object and an integrated terminal that simultaneously controls the movable object and receives data, or between a movable object and multiple terminals that simultaneously control the movable object and receive data. For example, at least two different communication modes may be formed between a movable object and a terminal that simultaneously controls the movable object and receives data from the movable object.
Fig. 37 illustrates a movable object 3700 that includes a carrier 3702 and a payload 3704, according to an embodiment of the invention. Although movable object 3700 is depicted as an aircraft, such depiction is not intended to be limiting and any suitable type of movable object may be used as previously described. Those skilled in the art will appreciate that any of the embodiments described herein in the context of an aircraft system may be applicable to any suitable movable object (e.g., an unmanned aerial vehicle). In some cases, payload 3704 may be provided on movable object 3700 without carrier 3702. The movable object 3700 can include a propulsion mechanism 3706, a sensing system 3708, and a communication system 3710.
As previously described, propulsion mechanism 3706 can include one or more of a rotor, propeller, blade, engine, motor, wheel, axle, magnet, or nozzle. The movable object may have one or more, two or more, three or more, or four or more propulsion mechanisms. The propulsion mechanisms may all be of the same type. Alternatively, one or more of the propulsion mechanisms may be a different type of propulsion mechanism. The propulsion mechanism 3706 can be mounted on the movable object 3700 using any suitable device, such as a support element (e.g., a drive shaft) as described elsewhere herein. The propulsion mechanism 3706 may be mounted on any suitable portion of the movable object 3700, such as the top, bottom, front, back, sides, or suitable combinations thereof.
In some embodiments, the propulsion mechanism 3706 may enable the movable object 3700 to take off vertically from a surface or land vertically on a surface without any horizontal movement of the movable object 3700 (e.g., without traveling along a runway). Optionally, the propulsion mechanism 3706 may be operable to allow the movable object 3700 to hover in the air at a specified location and/or orientation. One or more of the propulsion mechanisms 3706 may be controlled independently of the other propulsion mechanisms. Alternatively, the propulsion mechanisms 3706 may be configured to be controlled simultaneously. For example, movable object 3700 can have multiple horizontally oriented rotors that can provide lift and/or thrust to the movable object. The plurality of horizontally oriented rotors can be actuated to provide vertical takeoff, vertical landing, and hovering capabilities to movable object 3700. In some embodiments, one or more of the horizontally oriented rotors may rotate in a clockwise direction while one or more of the horizontally oriented rotors may rotate in a counterclockwise direction. For example, the number of clockwise rotors may be equal to the number of counterclockwise rotors. The rate of rotation of each horizontally oriented rotor can be independently varied to control the lift and/or thrust generated by each rotor and thereby adjust the spatial layout, velocity, and/or acceleration (e.g., with respect to up to three translational degrees of freedom and up to three rotational degrees of freedom) of movable object 3700.
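As a hedged illustration of how independently varying rotor rates adjusts lift, thrust, and attitude, the following sketch shows a simple mixer for a "plus"-configuration quadrotor with equal numbers of clockwise and counterclockwise rotors; the gains, sign conventions, and normalized command range are assumptions, not the patented control law.

```python
# A simple "plus"-configuration mixer, given only as an illustration of how varying
# each rotor's rate adjusts lift and attitude; gains and sign conventions are assumed.
def quad_mixer(thrust, roll, pitch, yaw):
    """Map collective thrust and attitude commands to four rotor commands.

    Rotors 1/3 spin clockwise and 2/4 counterclockwise, so equal numbers of
    CW and CCW rotors let yaw be commanded by differential reaction torque.
    """
    m1 = thrust + pitch + yaw   # front (CW)
    m2 = thrust - roll - yaw    # right (CCW)
    m3 = thrust - pitch + yaw   # rear  (CW)
    m4 = thrust + roll - yaw    # left  (CCW)
    # Clamp to a normalized command range
    return [min(max(m, 0.0), 1.0) for m in (m1, m2, m3, m4)]

print(quad_mixer(thrust=0.5, roll=0.0, pitch=0.1, yaw=0.0))  # front/rear pair differ
```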
The sensing system 3708 can include one or more sensors that can sense a spatial layout, velocity, and/or acceleration (e.g., with respect to up to three translational degrees of freedom and up to three rotational degrees of freedom) of the movable object 3700. The one or more sensors may include a Global Positioning System (GPS) sensor, a motion sensor, an inertial sensor, a distance sensor, or an image sensor. The sensing data provided by the sensing system 3708 may be used to control the spatial layout, speed, and/or orientation of the movable object 3700 (e.g., using a suitable processing unit and/or control module, as described below). Alternatively, the sensing system 3708 may be used to provide data about the environment surrounding the movable object, such as weather conditions, distance from potential obstacles, location of geographic features, location of man-made structures, and so forth.
The communication system 3710 supports communications via wireless signals 3716 with a terminal 3712 having a communication system 3714. The communication systems 3710, 3714 may include any number of transmitters, receivers, and/or transceivers suitable for wireless communication. The communication may be one-way communication such that data can only be transmitted in one direction. For example, the one-way communication may involve only the movable object 3700 transmitting data to the terminal 3712, or vice versa. Data may be transmitted from one or more transmitters of the communication system 3710 to one or more receivers of the communication system 3714, or vice versa. Alternatively, the communication may be two-way communication, such that data is transmittable in both directions between the movable object 3700 and the terminal 3712. Two-way communication may involve transmission of data from one or more transmitters of the communication system 3710 to one or more receivers of the communication system 3714, and vice versa.
In some embodiments, terminal 3712 may provide control data to one or more of movable object 3700, carrier 3702, and payload 3704, as well as receive information from them (e.g., position and/or motion information of movable object 3700, carrier 3702, and payload 3704; data sensed by the payload, such as image data captured by a payload camera). In some cases, the control data from the terminal may include instructions for the relative position, movement, actuation, or control of the movable object, carrier, and/or payload. For example, the control data may result in a modification of the position and/or orientation of the movable object (e.g., via control of the propulsion mechanism 3706), or a movement of the payload relative to the movable object (e.g., via control of the carrier 3702). Control data from the terminal may result in control of the payload, such as control of the operation of a camera or other image capture device (e.g., taking a still or moving picture, zooming in or out, turning on or off, switching imaging modes, changing image resolution, changing focus, changing depth of field, changing exposure time, changing angle of view or field of view). In some cases, the communication from the movable object, carrier, and/or payload may include information from one or more sensors (e.g., sensors of sensing system 3708 or payload 3704). The communication may include sensed information from one or more different types of sensors (e.g., GPS sensors, motion sensors, inertial sensors, distance sensors, or image sensors). Such information may relate to the position (e.g., location, orientation), movement, or acceleration of the movable object, carrier, and/or payload. Such information from the payload may include data captured by the payload or a sensed state of the payload. The control data provided and transmitted by the terminal 3712 may be configured to control the state of one or more of the movable object 3700, the carrier 3702, or the payload 3704. Alternatively or in combination, the carrier 3702 and payload 3704 may also each include a communication module configured for communication with the terminal 3712, such that the terminal can communicate with and control each of the movable object 3700, carrier 3702, and payload 3704 independently.
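The kinds of control data and sensed information exchanged between the terminal and the movable object can be pictured with the assumed message structures below; the field names and dataclass layout are illustrative only and do not reflect the actual protocol.

```python
# Illustrative message structures for terminal <-> movable object exchange; field
# names and layout are assumptions, not the patent's actual protocol.
from dataclasses import dataclass, field

@dataclass
class ControlData:
    """Sent from the terminal: actuate the vehicle, the carrier, or the payload."""
    target: str                 # "movable_object", "carrier", or "payload"
    command: str                # e.g. "set_velocity", "gimbal_pitch", "camera_zoom"
    params: dict = field(default_factory=dict)

@dataclass
class TelemetryData:
    """Sent back to the terminal: state and sensed information."""
    position: tuple             # (lat, lon, alt)
    velocity: tuple             # (vx, vy, vz)
    orientation: tuple          # (roll, pitch, yaw)
    payload_state: dict = field(default_factory=dict)

cmd = ControlData(target="payload", command="camera_zoom", params={"factor": 2.0})
report = TelemetryData(position=(37.77, -122.41, 50.0),
                       velocity=(1.0, 0.0, 0.0),
                       orientation=(0.0, 0.0, 90.0),
                       payload_state={"recording": True})
print(cmd, report.position)
```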
In some embodiments, the movable object 3700 may be configured for communication with another remote device, in addition to the terminal 3712 or in place of the terminal 3712. The terminal 3712 may also be configured to communicate with another remote device and the movable object 3700. For example, the movable object 3700 and/or the terminal 3712 can be in communication with another movable object or a carrier or payload of another movable object. The remote device may be a second terminal or other computing device (e.g., a computer, laptop, tablet, smart phone, or other mobile device) when desired. The remote device may be configured to transmit data to the movable object 3700, receive data from the movable object 3700, transmit data to the terminal 3712, and/or receive data from the terminal 3712. Alternatively, the remote device may be connected to the internet or other telecommunications network so that data received from the movable object 3700 and/or the terminal 3712 may be uploaded to a website or server.
Fig. 38 is a schematic illustration, by way of block diagram, of a system 3800 for controlling a movable object, according to an embodiment of the invention. System 3800 can be used in conjunction with any suitable implementation of the systems, apparatuses, and methods disclosed herein. System 3800 may include a sensing module 3802, a processing unit 3804, a non-transitory computer-readable medium 3806, a control module 3808, and a communication module 3810.
The sensing module 3802 may utilize different types of sensors that collect information about the movable object in different ways. Different types of sensors may sense different types of signals or signals from different sources. For example, the sensors may include inertial sensors, GPS sensors, distance sensors (e.g., lidar) or vision/image sensors (e.g., cameras). The sensing module 3802 may be operably coupled to a processing unit 3804 having a plurality of processors. In some embodiments, the sensing module may be operably coupled to a transmission module 3812 (e.g., a Wi-Fi image transmission module) configured to transmit the sensed data directly to a suitable external device or system. For example, the transmission module 3812 may be used to transmit an image captured by the camera of the sensing module 3802 to a remote terminal.
The processing unit 3804 may have one or more processors, such as a programmable processor (e.g., a Central Processing Unit (CPU)). The processing unit 3804 may be operatively coupled to the non-transitory computer-readable medium 3806. The non-transitory computer-readable medium 3806 may store logic, code, and/or program instructions that are executable by the processing unit 3804 to perform one or more steps. The non-transitory computer-readable medium may include one or more memory units (e.g., a removable medium or external storage such as an SD card or Random Access Memory (RAM)). In some embodiments, data from the sensing module 3802 may be directly transferred to and stored within a memory unit of the non-transitory computer-readable medium 3806. The memory unit of the non-transitory computer-readable medium 3806 may store logic, code, and/or program instructions executable by the processing unit 3804 to perform any suitable implementation of the methods described herein. For example, processing unit 3804 may be configured to execute instructions that cause one or more processors of processing unit 3804 to analyze sensed data produced by the sensing module. The memory unit may store sensing data from the sensing module to be processed by the processing unit 3804. In some implementations, the memory unit of the non-transitory computer-readable medium 3806 may be used to store the processing results produced by the processing unit 3804.
In some embodiments, the processing unit 3804 may be operably coupled to a control module 3808, the control module 3808 configured to control a state of the movable object. For example, control module 3808 may be configured to control a propulsion mechanism of the movable object to adjust a spatial layout, a velocity, and/or an acceleration of the movable object with respect to six degrees of freedom. Alternatively or in combination, the control module 3808 may control one or more of the carrier, payload, or state of the sensing module.
The processing unit 3804 may be operatively coupled to a communication module 3810, the communication module 3810 being configured to transmit and/or receive data from one or more external devices (e.g., a terminal, a display device, or other remote control). Any suitable communication means may be used, such as wired or wireless communication. For example, the communication module 3810 may utilize one or more of a Local Area Network (LAN), a Wide Area Network (WAN), infrared, radio, WiFi, peer-to-peer (P2P) network, telecommunications network, cloud communications, and the like. Alternatively, relay stations such as towers, satellites or mobile stations may be used. Wireless communication may be distance dependent or independent. In some embodiments, the communication may or may not require a line of sight. The communication module 3810 may transmit and/or receive one or more of sensing data from the sensing module 3802, a processing result generated by the processing unit 3804, predetermined control data, a user command from a terminal or a remote controller, and the like.
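A skeletal wiring of the modules of system 3800, sketched under assumed class and method names, may help visualize how sensed data flows into the processing unit and how control and communication outputs flow back out; the placeholder decision logic is not the patented method.

```python
# A skeletal wiring of the control system's modules (Fig. 38); class and method names
# are illustrative assumptions, not the patented implementation.
class SensingModule:
    def read(self):
        return {"gps": (37.77, -122.41, 50.0), "imu": (0.0, 0.0, 0.0)}

class ControlModule:
    def apply(self, command):
        print("adjusting propulsion:", command)

class CommunicationModule:
    def send(self, data):
        print("transmitting to terminal:", data)

class ProcessingUnit:
    def __init__(self, sensing, control, comms):
        self.sensing, self.control, self.comms = sensing, control, comms

    def step(self):
        data = self.sensing.read()          # sensed data flows in
        command = {"hold_position": True}   # placeholder decision logic
        self.control.apply(command)         # state of the movable object is adjusted
        self.comms.send(data)               # results/telemetry flow out

ProcessingUnit(SensingModule(), ControlModule(), CommunicationModule()).step()
```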
The components of system 3800 can be arranged in any suitable configuration. For example, one or more components of system 3800 can be located on a movable object, a carrier, a payload, a terminal, a sensing system, or an additional external device in communication with one or more of the above. Moreover, while fig. 38 depicts a single processing unit 3804 and a single non-transitory computer-readable medium 3806, those skilled in the art will appreciate that this is not intended to be limiting and that system 3800 may include multiple processing units and/or non-transitory computer-readable media. In some implementations, one or more of the plurality of processing units and/or non-transitory computer-readable media may be located in different locations, such as on a movable object, a carrier, a payload, a terminal, a sensing module, an additional external device in communication with one or more of the above, or a suitable combination thereof, such that any suitable aspect of the processing and/or memory functions performed by the system 3800 may occur at one or more of the above locations.
While preferred embodiments of the present invention have been shown and described herein, it will be obvious to those skilled in the art that such embodiments are provided by way of example only. Numerous modifications, changes, and substitutions will now occur to those skilled in the art without departing from the invention. It should be understood that various alternatives to the embodiments of the invention described herein may be employed in practicing the invention. It is intended that the following claims define the scope of the invention and that methods and structures within the scope of these claims and their equivalents be covered thereby.

Claims (10)

1. A method of determining a position of an unmanned aerial vehicle, comprising:
calculating a position of the UAV based on data from a recorder, wherein the recorder is configured to receive one or more messages from the UAV;
comparing the position of the unmanned aerial vehicle with a position of a geofence boundary, and taking one or more flight response actions based on the comparison to regulate activity of the unmanned aerial vehicle within or outside of the geofence boundary.
2. The method of claim 1, wherein taking one or more flight response actions based on the comparison comprises:
providing information about the comparison to the UAV to cause the UAV to take flight response action.
3. The method of claim 2, wherein providing information about the comparison to the UAV to cause the UAV to take flight response action comprises:
providing information about the comparison to the UAV to cause the UAV to automatically control a flight path to avoid the geofence boundary, or to cause the UAV to automatically land upon entering one or more geofence boundaries, or to cause the UAV to automatically control a flight path to cause the UAV to exit an area enclosed by the geofence boundaries upon entering one or more geofence boundaries.
4. The method of claim 1, wherein taking one or more flight response actions based on the comparison comprises:
providing a command to the UAV to alter a flight path based on the comparison.
5. The method of claim 1, further comprising: determining a location of the geofence boundary based on a location of a geofence device.
6. The method according to any one of claims 1-5, further comprising:
comparing a reported position of the unmanned aerial vehicle with the calculated position of the unmanned aerial vehicle, and providing a warning when a difference between the reported position and the calculated position exceeds a threshold value.
7. The method of claim 6, wherein the warning indicates a type of hazard.
8. The method of claim 1, wherein a plurality of the recorders are provided.
9. The method of claim 1, wherein the recorder has a predetermined position.
10. An air traffic control system, comprising: one or more processors,
wherein the one or more processors are configured to:
calculate a position of the UAV based on data from a recorder, wherein the recorder is configured to receive one or more messages from the UAV;
compare the position of the unmanned aerial vehicle with a position of a geofence boundary, and take one or more flight response actions based on the comparison to regulate activity of the unmanned aerial vehicle within or outside of the geofence boundary.
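By way of a non-authoritative illustration of the claimed method (and not part of the claims), the sketch below estimates a UAV position from several recorders at predetermined positions, compares it with a geofence boundary, and issues a warning when the UAV's reported position diverges from the calculated position beyond a threshold; the recorder placements, assumed range measurements, thresholds, and the brute-force search are all illustrative assumptions.

```python
# Hedged sketch only: recorders at known positions estimate the UAV position from
# assumed range measurements; the estimate is compared with a geofence boundary and
# with the UAV's own reported position. All values and methods are illustrative.
import math

recorders = [((0.0, 0.0), 141.4), ((200.0, 0.0), 141.4), ((0.0, 200.0), 141.4)]
# Each tuple: (recorder position in metres, range inferred from the UAV's messages)

def calculate_position(recorders, step=5.0, extent=300.0):
    """Brute-force least-squares fit of the UAV position to the measured ranges."""
    best, best_err = (0.0, 0.0), float("inf")
    x = 0.0
    while x <= extent:
        y = 0.0
        while y <= extent:
            err = sum((math.hypot(x - rx, y - ry) - rng) ** 2
                      for (rx, ry), rng in recorders)
            if err < best_err:
                best, best_err = (x, y), err
            y += step
        x += step
    return best

calculated = calculate_position(recorders)   # ~ (100, 100)
reported = (100.0, 160.0)                    # position claimed in the UAV's own messages

# Claim-6-style consistency check: warn if reported and calculated positions diverge
if math.hypot(reported[0] - calculated[0], reported[1] - calculated[1]) > 50.0:
    print("warning: reported position inconsistent with calculated position")

# Claim-1-style geofence comparison and flight response
fence_center, fence_radius = (100.0, 100.0), 80.0
inside = math.hypot(calculated[0] - fence_center[0],
                    calculated[1] - fence_center[1]) <= fence_radius
print("flight response:", "land_or_exit" if inside else "continue")
```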
CN202011088133.0A 2015-03-31 2015-03-31 Method for determining position of unmanned aerial vehicle and air traffic control system Pending CN112908038A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011088133.0A CN112908038A (en) 2015-03-31 2015-03-31 Method for determining position of unmanned aerial vehicle and air traffic control system

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
PCT/CN2015/075621 WO2016154944A1 (en) 2015-03-31 2015-03-31 Systems and methods for tracking uav activity
CN201580078091.3A CN107407915B (en) 2015-03-31 2015-03-31 Authentication system and method for generating flight controls
CN202011088133.0A CN112908038A (en) 2015-03-31 2015-03-31 Method for determining position of unmanned aerial vehicle and air traffic control system

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
CN201580078091.3A Division CN107407915B (en) 2015-03-31 2015-03-31 Authentication system and method for generating flight controls

Publications (1)

Publication Number Publication Date
CN112908038A true CN112908038A (en) 2021-06-04

Family

ID=57005393

Family Applications (3)

Application Number Title Priority Date Filing Date
CN201580078091.3A Expired - Fee Related CN107407915B (en) 2015-03-31 2015-03-31 Authentication system and method for generating flight controls
CN201910302628.XA Active CN110015418B (en) 2015-03-31 2015-03-31 Authentication system and method for generating flight controls
CN202011088133.0A Pending CN112908038A (en) 2015-03-31 2015-03-31 Method for determining position of unmanned aerial vehicle and air traffic control system

Family Applications Before (2)

Application Number Title Priority Date Filing Date
CN201580078091.3A Expired - Fee Related CN107407915B (en) 2015-03-31 2015-03-31 Authentication system and method for generating flight controls
CN201910302628.XA Active CN110015418B (en) 2015-03-31 2015-03-31 Authentication system and method for generating flight controls

Country Status (4)

Country Link
EP (1) EP3207428A4 (en)
JP (1) JP6535382B2 (en)
CN (3) CN107407915B (en)
WO (1) WO2016154944A1 (en)

Families Citing this family (28)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6459014B2 (en) 2015-03-31 2019-01-30 エスゼット ディージェイアイ テクノロジー カンパニー リミテッドSz Dji Technology Co.,Ltd Geo-fencing device
WO2016154949A1 (en) 2015-03-31 2016-10-06 SZ DJI Technology Co., Ltd. Authentication systems and methods for generating flight regulations
US20220392353A1 (en) * 2016-02-08 2022-12-08 Skydio, Inc. Unmanned aerial vehicle privacy controls
US20180025649A1 (en) 2016-02-08 2018-01-25 Unmanned Innovation Inc. Unmanned aerial vehicle privacy controls
US9826415B1 (en) 2016-12-01 2017-11-21 T-Mobile Usa, Inc. Tactical rescue wireless base station
CN108260076B (en) * 2016-12-28 2020-10-09 中国电信股份有限公司 Method, platform and system for monitoring unmanned aerial vehicle running track
CN110235503B (en) * 2017-02-02 2023-07-14 瑞典爱立信有限公司 Allocation message acknowledging access attempt without allocating resources
US10735908B2 (en) * 2017-03-22 2020-08-04 Nokia Technologies Oy Systems and apparatuses for detecting unmanned aerial vehicle
RU2760321C2 (en) * 2017-04-10 2021-11-23 Нокиа Солюшнс энд Нетуоркс Ой Location determination based on time delay values
US10706381B2 (en) * 2017-07-05 2020-07-07 Omnitracs, Llc Vehicle and drone management system
US10583923B2 (en) 2017-08-07 2020-03-10 Honeywell International Inc. Control station audio and data recording systems for vehicles
EP3662301A1 (en) * 2017-08-15 2020-06-10 Flarm Technology AG Remote aircraft identification for uav
US11412475B2 (en) 2017-09-14 2022-08-09 Telefonaktiebolaget Lm Ericsson (Publ) Technique for verifying a geographical position of a UAV
FR3074347B1 (en) * 2017-11-24 2022-10-14 Thales Sa ELECTRONIC SYSTEM FOR REMOTE CONTROL OF DRONES, METHOD FOR COMPUTER PROGRAM ASSOCIATED
CN109120354A (en) * 2018-08-29 2019-01-01 无锡若飞科技有限公司 Unmanned plane monitoring and managing method and system and computer storage medium
US11279467B2 (en) * 2018-10-17 2022-03-22 Subaru Corporation Aircraft control system
US11587366B1 (en) 2018-11-20 2023-02-21 State Farm Mutual Automobile Insurance Company Systems and methods for selecting locations to validate automated vehicle data transmission
JPWO2020189702A1 (en) * 2019-03-19 2020-09-24
US20220187851A1 (en) * 2019-03-20 2022-06-16 Telefonaktiebolaget Lm Ericsson (Publ) Technique for controlling a uav
KR102290702B1 (en) * 2019-12-12 2021-08-17 서울여자대학교 산학협력단 Method for recognition about drone position
CN111103891B (en) * 2019-12-30 2021-03-16 西安交通大学 Unmanned aerial vehicle rapid posture control system and method based on skeleton point detection
EP3883235A1 (en) 2020-03-17 2021-09-22 Aptiv Technologies Limited Camera control modules and methods
SE2050738A1 (en) 2020-06-22 2021-12-23 Sony Group Corp System and method for image content recording of a moving user
CN112821930A (en) * 2020-10-25 2021-05-18 泰州物族信息科技有限公司 Adaptive antenna state management platform
CN112489413B (en) * 2020-11-27 2022-01-11 京东方科技集团股份有限公司 Control method and system of remote controller, storage medium and electronic equipment
US11907888B2 (en) 2020-12-23 2024-02-20 Here Global B.V. Aerial vehicle and computing device interaction for validating aerial vehicle activity
CN112904896B (en) * 2021-01-21 2022-11-04 中国南方电网有限责任公司超高压输电公司柳州局 Unmanned aerial vehicle autonomous driving route multiplexing method
CN116453378B (en) * 2023-06-16 2023-09-08 陕西德鑫智能科技有限公司 Unmanned aerial vehicle navigation section handover switching method and device

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101120233A (en) * 2004-09-28 2008-02-06 特林布尔导航有限公司 Method and system for controlling a valuable movable item
JP2008146450A (en) * 2006-12-12 2008-06-26 Toshiba Corp Ads-b ground station
CN102186140A (en) * 2011-05-07 2011-09-14 东莞市车友互联信息科技有限公司 Method for monitoring global positioning system (GPS) terminal and server for implementing method
CN102279406A (en) * 2011-04-12 2011-12-14 广州星唯信息科技有限公司 Fence identification method using global positioning system (GPS) to position tracks
JP2012122775A (en) * 2010-12-06 2012-06-28 Nec Corp Aircraft position measuring system, time synchronization method, and time synchronization program for use in the system
US20120215382A1 (en) * 2011-02-23 2012-08-23 Hon Hai Precision Industry Co., Ltd. System and method for controlling unmanned aerial vehicle in flight space
CN103106807A (en) * 2013-01-11 2013-05-15 南威软件股份有限公司 Method of location early warning in official vehicle monitoring
CN103606302A (en) * 2013-08-13 2014-02-26 重庆享邑航空科技有限公司 Aircraft border-crossing management and control method and system

Family Cites Families (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3019071B2 (en) * 1998-02-16 2000-03-13 日本電気株式会社 Intrusion / collision prediction apparatus and method, and recording medium recording intrusion / collision prediction program
US20030067542A1 (en) * 2000-10-13 2003-04-10 Monroe David A. Apparatus for and method of collecting and distributing event data to strategic security personnel and response vehicles
US8446321B2 (en) * 1999-03-05 2013-05-21 Omnipol A.S. Deployable intelligence and tracking system for homeland security and search and rescue
JP2003240847A (en) * 2002-02-18 2003-08-27 Kawasaki Heavy Ind Ltd Pseudo-interrogation signal generator for air traffic control secondary surveillance radar
US7107148B1 (en) * 2003-10-23 2006-09-12 International Business Machines Corporation Navigating a UAV with on-board navigation algorithms with flight depiction
EP1868132A4 (en) * 2005-03-23 2014-06-18 Ihc Corp Authentication system
JP4640806B2 (en) * 2005-07-27 2011-03-02 株式会社エヌ・ティ・ティ・データ Collision risk prediction system and program
US7948439B2 (en) * 2008-06-20 2011-05-24 Honeywell International Inc. Tracking of autonomous systems
WO2010137596A1 (en) * 2009-05-26 2010-12-02 国立大学法人 千葉大学 Mobile body control device and mobile body in which mobile body control device is mounted
US20110019558A1 (en) * 2009-07-27 2011-01-27 Honeywell International Inc. Distributed latency measurement system for communication system analysis
US8425683B2 (en) * 2009-11-17 2013-04-23 Acoustic Systems, Inc. Method for tracking a scraper within a pipeline
CN103177545A (en) * 2011-12-26 2013-06-26 联想(北京)有限公司 Remote controller, mobile equipment and method for controlling mobile equipment by using remote controller
CN102637023A (en) * 2012-03-23 2012-08-15 王效波 Remote unmanned aerial vehicle cluster control method and system based on 3G (the 3rd Generation Telecommunication) and GPRS (General Packet Radio Service) cell phone communication
CN104052914A (en) * 2013-03-14 2014-09-17 董亮 System for automatic target following and shooting by use of aircraft
EP2801838B1 (en) * 2013-05-08 2021-02-24 Airbus Defence and Space GmbH Evaluating the position of an aerial vehicle
US9715005B2 (en) * 2013-06-06 2017-07-25 Zih Corp. Method, apparatus, and computer program product improving real time location systems with multiple location technologies
US9188979B2 (en) * 2013-08-06 2015-11-17 Lockheed Martin Corporation Method and system for remotely controlling a vehicle
JP5808781B2 (en) * 2013-09-12 2015-11-10 富士重工業株式会社 Flight control system for unmanned aerial vehicles

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113781845A (en) * 2021-09-17 2021-12-10 杭州科技职业技术学院 Electronic fence establishing method and system for unmanned aerial vehicle and electronic equipment
CN114371735A (en) * 2022-01-07 2022-04-19 广东汇天航空航天科技有限公司 Aircraft geo-fence data processing method and system
CN114371735B (en) * 2022-01-07 2023-11-03 广东汇天航空航天科技有限公司 Aircraft geofence data processing method and system
CN115273562A (en) * 2022-07-27 2022-11-01 齐鲁空天信息研究院 Consistency monitoring method for general aviation low-altitude navigation flight

Also Published As

Publication number Publication date
JP2018510402A (en) 2018-04-12
CN110015418B (en) 2021-05-18
CN107407915A (en) 2017-11-28
CN107407915B (en) 2020-11-03
CN110015418A (en) 2019-07-16
WO2016154944A1 (en) 2016-10-06
EP3207428A1 (en) 2017-08-23
EP3207428A4 (en) 2017-11-15
JP6535382B2 (en) 2019-06-26

Similar Documents

Publication Publication Date Title
US11961093B2 (en) Authentication systems and methods for generating flight regulations
US20210375143A1 (en) Systems and methods for geo-fencing device communications
CN107407915B (en) Authentication system and method for generating flight controls
CN107409174B (en) System and method for regulating operation of an unmanned aerial vehicle
CN107430403B (en) System and method with geo-fencing device level
CN107615359B (en) Authentication system and method for detecting unauthorized unmanned aerial vehicle activity
CN107615785B (en) System and method for displaying geofence device information
CN107531324B (en) System and method for mobile geofencing
CN107533331B (en) Geo-fencing device with dynamic characteristics
CN107408351B (en) Authentication system and method for generating flight controls
CN107430402B (en) System and method for identifying and authenticating geo-fence devices
EP3158553B1 (en) Authentication systems and methods for identification of authorized participants
JP7146834B2 (en) Method and system for determining level of authorization for unmanned aerial vehicle (UAV) operation

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination