US20210294327A1 - System and method for movable object control - Google Patents

System and method for movable object control

Info

Publication number
US20210294327A1
US20210294327A1 (Application No. US 17/209,104)
Authority
US
United States
Prior art keywords
task
uav
camera
gimbal
tasks
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US17/209,104
Inventor
Dhanushram BALACHANDRAN
Robert Schlub
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
DJI Technology Inc
Original Assignee
DJI Technology Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by DJI Technology Inc filed Critical DJI Technology Inc
Priority to US17/209,104
Publication of US20210294327A1

Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/0094Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots involving pointing a payload, e.g. camera, weapon, sensor, towards a fixed or moving target
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/0011Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement
    • G05D1/0016Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement characterised by the operator's input device
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/46Multiprogramming arrangements
    • G06F9/54Interprogram communication
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64CAEROPLANES; HELICOPTERS
    • B64C39/00Aircraft not otherwise provided for
    • B64C39/02Aircraft not otherwise provided for characterised by special use
    • B64C39/024Aircraft not otherwise provided for characterised by special use of the remote controlled vehicle type, i.e. RPV
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64DEQUIPMENT FOR FITTING IN OR TO AIRCRAFT; FLIGHT SUITS; PARACHUTES; ARRANGEMENT OR MOUNTING OF POWER PLANTS OR PROPULSION TRANSMISSIONS IN AIRCRAFT
    • B64D47/00Equipment not otherwise provided for
    • B64D47/08Arrangements of cameras
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/0011Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement
    • G05D1/0022Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement characterised by the communication link
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/10Simultaneous control of position or course in three dimensions
    • G05D1/101Simultaneous control of position or course in three dimensions specially adapted for aircraft
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/46Multiprogramming arrangements
    • G06F9/52Program synchronisation; Mutual exclusion, e.g. by means of semaphores
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G5/00Traffic control systems for aircraft, e.g. air-traffic control [ATC]
    • G08G5/003Flight plan management
    • G08G5/0034Assembly of a flight plan
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G5/00Traffic control systems for aircraft, e.g. air-traffic control [ATC]
    • G08G5/0047Navigation or guidance aids for a single aircraft
    • G08G5/0069Navigation or guidance aids for a single aircraft specially adapted for an unmanned aircraft
    • H04N5/23203
    • H04N5/23218
    • H04N5/23299
    • B64C2201/027
    • B64C2201/127
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64UUNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U2101/00UAVs specially adapted for particular uses or applications
    • B64U2101/30UAVs specially adapted for particular uses or applications for imaging, photography or videography
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64UUNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U2201/00UAVs characterised by their flight controls
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W4/00Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/50Service provisioning or reconfiguring

Definitions

  • the disclosed embodiments relate generally to controlling movable objects and more particularly, but not exclusively, to systems and methods for configuring movable objects to perform parallel tasks.
  • Movable objects, such as unmanned aircraft, can be used in many different fields such as film production, sporting events, disaster relief, geological study, and more.
  • movable objects can be manually controlled by a remote operator to accomplish a desired purpose.
  • movable objects can be preprogrammed with tasks, or missions, to execute autonomously or semi-autonomously.
  • a method of operating a movable object using a user terminal comprising: configuring a plurality of tasks on the user terminal for parallel execution by the movable object using an interface; and transmitting the tasks from the user terminal to the movable object for operating the movable object.
  • a system for operating a movable object comprising: an interface for using a user terminal to configure a plurality of tasks for parallel execution by the movable object; and one or more processors configured to operate the user interface and control transmission of the tasks to the movable object.
  • a non-transitory computer-readable medium with instructions stored thereon that, when executed by a processor, perform the steps comprising: configuring a plurality of tasks on the user terminal for parallel execution by the movable object using an interface; and transmitting the tasks from the user terminal to the movable object for operating the movable object.
  • FIG. 1 is an exemplary top-level block diagram illustrating an embodiment of a movable object operation system in communication with an exemplary movable object.
  • FIG. 2 is an exemplary diagram illustrating an embodiment of a movable object that is an unmanned aerial vehicle (UAV) having a plurality of functional modules.
  • FIG. 3 is an exemplary block diagram illustrating an embodiment of the movable object operation system of FIG. 1 having a user terminal interacting with a movable object, wherein the user terminal includes a movable object manager.
  • FIG. 4 is an exemplary block diagram illustrating an alternative embodiment of the movable object operation system of FIG. 1 having a user terminal interacting with a movable object, wherein the movable object includes a movable object manager.
  • FIG. 5 is an exemplary block diagram illustrating another alternative embodiment of the movable object operation system of FIG. 1 , wherein the user terminal is shown as configuring movable object tasks that are transmitted to the movable object.
  • FIG. 6 is an exemplary block diagram illustrating another alternative embodiment of the movable object operation system of FIG. 1 , wherein the user terminal is shown as configuring movable object tasks that are transmitted to a UAV for parallel execution.
  • FIG. 7 is an exemplary diagram illustrating an embodiment of the movable object tasks of FIG. 6 , wherein the movable object tasks are a parallel custom mission for a UAV.
  • FIG. 8 is an exemplary diagram illustrating another embodiment of the movable object tasks of FIG. 6 , wherein the movable object tasks are a parallel custom mission for a UAV.
  • FIG. 9 is an exemplary diagram illustrating an embodiment of a communication protocol suitable for use with the movable object operation system of FIG. 1 .
  • FIG. 10 is an exemplary diagram illustrating an embodiment of a data packet suitable for use with the movable object operation system of FIG. 1 .
  • FIG. 11 is an exemplary flow chart illustrating an embodiment of a method of configuring a plurality of tasks for parallel execution by a movable object using an interface in the movable object operation system of FIG. 1 .
  • the present disclosure sets forth systems and methods for configuring a movable object to perform tasks, particularly tasks in parallel, which overcome limitations of prior systems and methods. More particularly, prior systems and methods for configuring movable object tasks (also interchangeably referred to herein as “missions” or “movable object missions”), allow configuring only a single movable object task at a time. The present systems and methods enable configuring multiple movable object tasks at a time in parallel, greatly enhancing the versatility of the movable object for fulfilling a wide variety of movable object needs.
  • the movable object operation system 10 can include a user terminal 100 , which can communicate with a movable object 300 via a communication link 200 .
  • the user terminal 100 can be used to interact with a user (not shown) to operate the movable object 300 and/or present data collected by the movable object 300 to the user.
  • the user terminal 100 can include, for example, remote controllers (not shown), portable computers, laptops, mobile devices, handheld devices, mobile telephones (for example, smartphones), tablet devices, tablet computers, personal digital assistants, handheld consoles, portable media players, wearable devices (for example, smartwatches and head-mounted displays), and the like.
  • the user terminal 100 can include one or more applications 110 , or application software installed on the user terminal 100 .
  • an application 110 can be configured to invoke an application programming interface (API).
  • the API can be part of a software development kit (SDK).
  • the SDK can advantageously specify functions that are frequently invoked by certain types of applications 110 .
  • applications 110 that are used to control a flying movable object 300 can invoke functions in an SDK involving navigation of the movable object 300 .
  • the application 110 can be colloquially referred to as an “app.”
  • the app can be made available and kept updated by a vendor through a mobile app store.
  • the user terminal 100 can include one or more processors 120 that can be used to execute the applications 110 .
  • the user terminal 100 can include any number of processors 120 , as desired.
  • each processor 120 can include one or more general purpose microprocessors (for example, single or multi-core processors), application-specific integrated circuits (ASIC), field-programmable gate arrays (FPGA), application-specific instruction-set processors, digital signal processing units, coprocessors, network processing units, audio processing units, encryption processing units, and the like.
  • the processor 120 can include an image processing engine or media processing unit.
  • the processors 120 can be configured to perform any of the methods described herein, including but not limited to a variety of tasks relating to movable object operation and control.
  • the processors 120 can include specialized software and/or hardware, for example, for processing movable object tasks using an interface.
  • the user terminal 100 can include one or more memories 130 (alternatively referred to herein as a non-transient computer readable medium).
  • Suitable memories 130 can include, for example, random access memory (RAM), static RAM, dynamic RAM, read-only memory (ROM), programmable ROM, erasable programmable ROM, electrically erasable programmable ROM, flash memory, secure digital (SD) card, and the like. Instructions for performing any of the methods described herein can be stored in the memory 130 .
  • the memory 130 can be placed in operative communication with the processors 120 , as desired, and instructions can be transmitted from the memory 130 to the processors 120 for execution, as desired.
  • the user terminal 100 can additionally have one or more input/output devices 140 , such as buttons, a keyboard, keypad, trackball, displays, and/or a monitor.
  • Various user interface elements (for example, windows, buttons, menus, icons, pop-ups, tabs, controls, cursors, insertion points, and the like) can be used to present data to and receive data from a user (not shown).
  • the user terminal 100 can be configured to communicate with the movable object 300 via a communication link 200 .
  • the communication link 200 can include an uplink for transmitting data (such as control data and application data) from the user terminal 100 to the movable object 300 , and a downlink for transmitting data (such as telemetry data, application data, image data, and video data) from the movable object 300 to the user terminal.
  • the uplink and downlink can share a single frequency using time modulation. In other embodiments, the uplink and downlink can use different frequencies.
  • the communication link 200 can be a wireless communication link 200 over a wireless network.
  • Suitable wireless communications can include, for example, radio, Wireless Fidelity (WiFi), cellular, satellite, and broadcasting.
  • Exemplary suitable wireless communication technologies include, but are not limited to, Global System for Mobile Communications (GSM), General Packet Radio Service (GPRS), Code Division Multiple Access (CDMA), Wideband CDMA (W-CDMA), CDMA2000, IMT Single Carrier, Enhanced Data Rates for GSM Evolution (EDGE), Long-Term Evolution (LTE), LTE Advanced, Time-Division LTE (TD-LTE), High Performance Radio Local Area Network (HiperLAN), High Performance Radio Wide Area Network (HiperWAN), High Performance Radio Metropolitan Area Network (HiperMAN), Local Multipoint Distribution Service (LMDS), Worldwide Interoperability for Microwave Access (WiMAX), ZigBee, Bluetooth, Flash Orthogonal Frequency-Division Multiplexing (Flash-OFDM), High Capacity Spatial Division Multiple Access (HC-SDMA), iBurst, Universal Mobile Telecommunications System (UMTS), UMTS Time-Division Duplexing (UMTS-TDD), Evolved High Speed Packet Access (HSPA+), Time Division Synchronous Code Division Multiple Access (TD-SCDMA), Evolution-Data Optimized (EV-DO), Digital Enhanced Cordless Telecommunications (DECT), and others.
  • the communication link 200 can be implemented over a 3G or 4G mobile telecommunications network, such as the UMTS system standardized by the 3rd Generation Partnership Project (3GPP), the W-CDMA radio interface, the TD-SCDMA radio interface, the HSPA+ UMTS release, the CDMA2000 system, EV-DO, EDGE, DECT, Mobile WiMAX, and technologies that comply with the International Mobile Telecommunications Advanced (IMT-Advanced) specification, such as LTE, Mobile WiMAX, and TD-LTE.
  • the communication link 200 can be implemented over a 5G mobile telecommunications network.
  • the communication link 200 can advantageously be encrypted to prevent third party intrusion into movable object operations.
  • Suitable encryption methods include, but are not limited to, internet key exchange, Internet Protocol Security (IPsec), Kerberos, point-to-point protocol, transport layer security, SSID hiding, MAC ID filtering, Static IP addressing, 802.11 security, Wired Equivalent Privacy (WEP), Wi-Fi Protected Access (WPA), WPA2, Temporal Key Integrity Protocol (TKIP), Extensible Authentication Protocol (EAP), Lightweight Extensible Authentication Protocol (LEAP), Protected Extensible Authentication Protocol (PEAP), and the like.
  • Suitable movable objects 300 that can be operated using the present systems and methods include, but are not limited to, bicycles, automobiles, trucks, ships, boats, trains, helicopters, aircraft, robotic devices, various hybrids thereof, and the like.
  • the movable object 300 can be an unmanned aerial vehicle (UAV).
  • an exemplary UAV 350 that is suitable for use with the present systems and methods is shown in FIG. 2 . Colloquially referred to as “drones,” UAVs 350 are aircraft without a human pilot onboard, whose flight is controlled autonomously, by a remote pilot, or both. UAVs 350 are now finding increased usage in civilian applications involving various aerial operations, such as data-gathering or delivery.
  • the present movable object operation systems and methods are suitable for use with many types of UAVs 350 including, without limitation, quadcopters (also referred to as quadrotor helicopters or quadrotors), single rotor, dual rotor, trirotor, hexarotor, and octorotor rotorcraft UAVs 350 , fixed wing UAVs 350 , and hybrid rotorcraft-fixed wing UAVs 350 .
  • the UAV 350 can include one or more functional modules 310 , as suitable for the function of the UAV 350 .
  • the UAV 350 includes a flight control module 311 for controlling flight operations, a gimbal module 312 for precise rotational and/or translational positioning of mounted objects, and a camera module 313 for capturing image and video information from the surroundings of the UAV 350 .
  • an exemplary user terminal 100 is shown in relation to an exemplary movable object 300 .
  • the user terminal 100 is shown as including a plurality of exemplary applications 110 a , 110 b , and 110 c .
  • Each of the applications 110 a , 110 b , and 110 c can be implemented using an interface 140 .
  • the interface 140 can be an application programming interface (API).
  • the interface 140 can include one or more predefined functions that are called by the applications 110 a , 110 b , and 110 c .
  • the interface 140 can include functions that allow an application 110 to configure one or more movable object tasks 160 (shown in FIG. 5 ) to be performed by the movable object 300 .
  • the movable object task 160 configured using the interface 140 can be a simple task (for example, move the movable object 300 to point A) or a complex task (move the movable object 300 from point A to point B while a camera of the movable object 300 follows and films an external scene of interest).
  • the movable object task 160 or movable object tasks 160 configured using the interface 140 can entail coordination of multiple functional modules 310 of the movable object 300 .
  • the exemplary movable object 300 shown in FIG. 3 is depicted with three functional modules 310 a , 310 b , and 310 c .
  • the interface 140 can further include functions that allow the movable object tasks 160 to be transmitted to the movable object 300 .
  • the interface 140 can include a function that configures a movable object task 160 based on user input, and transmits the configured movable object task 160 to the movable object 300 without further user input.
  • the movable object tasks 160 can be configured using a movable object manager 150 .
  • a movable object manager 150 can be used to access and control the movable object 300 .
  • the movable object manager 150 can be part of a software development kit (SDK) for supporting development of software applications for the movable object 300 .
  • the movable object manager 150 can include various modules, as needed for accessing or controlling the movable object 300 .
  • the movable object manager 150 can include a communication manager 151 for managing communication with the movable object 300 , and/or a data manager 152 for receiving, sending, and processing data and/or commands in relation to the movable object 300 .
  • the movable object manager 150 can be configured to communicate with an authentication server (not shown) for providing a secure environment for communication between the user terminal 100 and the movable object 300 .
  • the movable object manager 150 can be located anywhere that is convenient. In some embodiments, the movable object manager 150 can be installed on the user terminal 100 , as shown in the configuration in FIG. 3 . This configuration, in which the movable object manager 150 is physically located together with the applications 110 and interface 140 , is advantageous in that movable object tasks 160 can be configured with minimal latency.
  • the movable object manager 150 can be equipped with logic for determining when communication with the movable object 300 is necessary, and when such communication is unnecessary.
  • the movable object manager 150 can be installed on the movable object 300 , as shown in the configuration in FIG. 4 .
  • This configuration, in which the movable object manager 150 is located with the functional modules 310 a , 310 b , and 310 c of the movable object 300 , is advantageous in that high level commands can be issued to the movable object 300 remotely, while the movable object manager 150 can execute these commands while reacting to real-time conditions of the movable object 300 . Latency between the movable object manager 150 and the movable object 300 is thereby reduced.
  • a movable object manager 150 can be installed on both the movable object 300 and the user terminal 100 .
  • an exemplary user terminal 100 is shown as having an application 110 that is a movable object operating application 115 .
  • the movable object operating application 115 can send data to the movable object 300 to operate the movable object 300 .
  • Data sent by the movable object operating application 115 to the movable object 300 include, for example, data to move the movable object 300 (for example, data from a control stick) and data to set parameters of the movable object 300 .
  • the movable object operating application 115 can interface with a movable object task interface 145 to configure one or more movable object tasks 160 .
  • the movable object task interface 145 includes a predefined set of functions, methods, and/or variables.
  • the movable object operating application 115 can call such functions and methods, as well as set one or more variables.
  • the movable object task interface 145 then forms one or more movable object tasks 160 based on input to the movable object task interface 145 .
  • the input of the movable object task interface 145 can take any convenient format.
  • the input of the movable object task interface 145 can be specified at a high level (for example, perform reconnaissance mission), at a low level (for example, move from point A to point B), and/or in any combination thereof (for example, perform reconnaissance mission, then return to point A).
  • the movable object tasks 160 can include one or more software objects. In some embodiments, the movable object tasks 160 can be instantiated in software by the movable object operating application 115 . In other embodiments, the movable object tasks 160 can be instantiated in software by the movable object task interface 145 . After the movable object tasks 160 are configured using the movable object task interface 145 , the movable object tasks 160 can be transmitted to the movable object 300 for operating the movable object 300 .
  • the movable object task interface can be implemented using any convenient programming language, such as Java, C, C++, Python, and the like.
  • the movable object task interface 145 can advantageously allow multiple movable object tasks 160 to be executed in parallel. Such a movable object task interface 145 is superior to existing movable object task interfaces 145 that allow only one movable object task to be executed at a time. Allowing specification of multiple parallel movable object tasks 160 affords greater flexibility, versatility, and customizability for the movable object 300 .
  • the movable object tasks 160 constitute a parallel custom mission 165 (shown in FIGS. 7 and 8 ) of the movable object 300 .
  • the parallel custom mission 165 includes one or more movable object tasks 160 for accomplishing a specific objective.
  • the parallel custom mission 165 can advantageously be specified in the user terminal 100 at a high level, without specific knowledge of the functional modules of the movable object 300 .
  • in FIG. 6 , an illustrative and non-limiting example is shown of operating a UAV 350 based on movable object tasks 160 that are executed in parallel.
  • the exemplary UAV 350 is shown as having three functional modules: a flight control module 311 , a gimbal module 312 , and a camera module 313 .
  • the movable object tasks 160 can be transmitted to and distributed among appropriate functional modules.
  • the movable object tasks 160 can be separated into flight control tasks 161 , gimbal tasks 162 , and camera tasks 163 that can occur in parallel, corresponding to the functional modules of the UAV 350 .
  • the flight control tasks 161 can include, for example, tasks that control a movement of the movable object (for example, setting target destination, velocity, altitude, attitude (pitch, roll, and yaw), and the like).
  • the gimbal tasks 162 can include, for example, rotating a gimbal to specified position(s) and/or angle(s), or configuring the gimbal to automatically follow a given object of interest.
  • the camera tasks 163 can include, for example, turning a camera of the UAV 350 on and off, setting parameters of the camera (e.g., camera angle, camera mode, photo/video size/resolution, photo/video format, zoom settings, exposure settings, and the like), or instructing the camera to visually follow the object of interest.
  • the camera tasks 163 can include any control input for data collection instruments of the UAV 350 .
  • exemplary instruments for visual and non-visual data collection on the UAV 350 include, for example, electro-optical sensors, thermal/infrared sensors, color or monochrome sensors, multi-spectral imaging sensors, spectrophotometers, spectrometers, thermometers, illuminometers, microphones/sonic transducers, pressure sensors, altitude sensors, flow sensors, humidity sensors, precipitation sensors, wind speed sensors, wind direction sensors, anemometers, optical rain sensors, and/or others.
  • the movable object tasks 160 can be configured using the movable object manager 150 .
  • the movable object manager 150 can break up the movable object tasks 160 into components that correspond to the functional modules 310 of a movable object 300 , and distribute the tasks among the functional modules as appropriate.
  • the movable object manager 150 can distribute movable object tasks 160 among at least one of the flight control module 311 , gimbal module 312 , and camera module 313 of the UAV 350 .
  • Movable object tasks 160 can be divided into flight control tasks 161 , gimbal tasks 162 , and camera tasks 163 .
  • the tasks can be coordinated in chronological sequence to occur at certain time points.
  • the flight control tasks 161 include two tasks 161 a and 161 b : move to point A, and move to point B at a later time point.
  • the gimbal tasks 162 include one gimbal task 162 a : follow an object 20 of interest when the UAV 350 has moved to point A.
  • the camera tasks 163 include two tasks 163 a and 163 b : begin video capture at the time when the UAV 350 has moved to point A, and end video capture when the UAV 350 has moved to point B.
  • Parallel coordination of the tasks 161 , 162 , and 163 results in execution of the parallel custom mission 165 using multiple functional modules.
  • in FIG. 8 , another parallel custom mission 165 is illustrated using another, more complex, non-limiting example: “Video capture point A, then move to point A and follow and film object at point A until the UAV reaches point B.”
  • the flight control tasks 161 include three tasks 161 a , 161 b , and 161 c : move to point A, follow an object of interest 20 , and end following the object 20 when the UAV 350 reaches point B.
  • the gimbal tasks 162 likewise include three corresponding gimbal tasks 162 a , 162 b , and 162 c : put point A in view of the camera, adjust the gimbal to follow the object 20 when the UAV 350 begins to follow the object 20 , and end following the object 20 when the UAV 350 reaches point B.
  • the camera tasks 163 likewise include three corresponding tasks 163 a , 163 b , and 163 c : begin video capture at the time when the UAV 350 has point A in view, focus the camera when the UAV 350 begins to follow the object 20 , and end video capture when the UAV 350 has moved to point B.
  • the communication protocol 900 can include a data link layer 903 , a network layer 902 , and an application layer 901 .
  • the data link layer 903 can be used, for example, for handling data framing, data verification, and data retransmission.
  • the network layer 902 can be used, for example, for supporting data packet routing and relaying.
  • the application layer 901 can be used, for example, for handling applications logic, such as controlling behavior of functional modules of a movable object 300 (shown in FIG. 1 ).
  • the communication protocol 900 can support communication between various modules of a movable object 300 , such as a flight control module 311 , gimbal module 312 , camera module 313 (as shown in FIG. 6 ), and other modules.
  • the communication protocol 900 can be used with different communication link technologies, such as a universal asynchronous receiver/transmitter (UART) technology, a controller area network (CAN) technology, and an inter-integrated circuit (I2C) technology.
  • UART universal asynchronous receiver/transmitter
  • CAN controller area network
  • I2C inter-integrated circuit
  • the packet 1000 can include a header 1001 , an application header 1002 , data 1003 , and a tail 1004 .
  • the header 1001 and tail 1004 can include, for example, control information that a network needs for delivering the data 1003 .
  • the control information can include source and destination network addresses, error detection codes, and packet sequencing information.
  • the application header 1002 can include, for example, various sender and receiver information.
  • the sender and receiver can be among different modules of the movable object 300 and applications 110 on the user terminal 100 (shown in FIG. 1 ).
  • FIG. 11 shows an exemplary method 1100 for operating a movable object 300 on a user terminal 100 .
  • a plurality of movable object tasks 160 are configured on a user terminal 100 through an interface 140 for parallel execution by the movable object 300 .
  • the interface can be a movable object task interface 145 , as described above with reference to FIG. 5 .
  • the movable object tasks 160 can constitute a parallel custom mission 165 of a UAV 350 (shown in FIGS. 7 and 8 ) that is configured on the user terminal 100 for execution by the UAV 350 .
  • the configuring can use an interface 140 that is an application programming interface installed on the user terminal 100 .
  • the configuring can use an interface 140 that is part of a software development kit (SDK) installed on the user terminal 100 .
  • SDK software development kit
  • the configuring of the movable object tasks 160 can be based on various trigger events.
  • the movable object tasks 160 can be configured to occur at a predetermined time (for example, absolute time or time relative to an event of the movable object 300 , such as takeoff or landing).
  • the movable object tasks 160 can further be configured to occur when the movable object 300 reaches a predetermined location.
  • the location can be specified in absolute coordinates or relative coordinates (for example, relative to the starting location of the movable object 300 ).
  • the trigger event can also be, for example, a completion of one or more other tasks by the movable object 300 , such as to provide sequential execution of movable object tasks 160 .
  • the trigger event can also be, for example, recognition by the movable object 300 of an object 20 (shown in FIGS. 7 and 8 with respect to an exemplary UAV 350 ) of interest within an environment of the movable object 300 .
  • the movable object tasks 160 are transmitted from the user terminal 100 to the movable object 300 for operating the movable object 300 .
  • the transmission can take place using wireless communication.
  • Various wireless communication protocols can be used, as described above with reference to FIG. 1 .
  • the movable object tasks 160 can be transmitted to the movable object 300 together after the configuration of all of the movable object tasks 160 is completed.
  • the movable object tasks 160 can be transmitted to the movable object 300 in one data packet, such as the packet 1000 shown in FIG. 10 ; a sketch of such a packet layout follows this list.
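To make the packet layout above concrete, the following is a minimal sketch, in Java, of the packet 1000 with its header 1001 (addressing and sequencing), application header 1002 (sender and receiver identification), data 1003, and tail 1004 (error detection). All field widths, orderings, and encodings here are assumptions made for illustration; the patent does not define a wire format.

```java
// Illustrative sketch of the data packet 1000: header 1001, application
// header 1002, data 1003, and tail 1004. Field sizes and encodings are
// assumptions, not a specification from the patent.
import java.nio.ByteBuffer;
import java.util.zip.CRC32;

public class Packet {
    public static byte[] encode(short sourceAddr, short destAddr, short sequence,
                                byte senderId, byte receiverId, byte[] data) {
        ByteBuffer buf = ByteBuffer.allocate(9 + data.length + 4);
        // Header 1001: source/destination network addresses and sequencing info.
        buf.putShort(sourceAddr).putShort(destAddr).putShort(sequence);
        // Application header 1002: sender (e.g., an application on the user
        // terminal) and receiver (e.g., flight controller, gimbal, or camera).
        buf.put(senderId).put(receiverId);
        // Data 1003: the payload, preceded by its length (one byte here,
        // which limits this sketch to short payloads).
        buf.put((byte) data.length).put(data);
        // Tail 1004: an error detection code computed over all prior bytes.
        CRC32 crc = new CRC32();
        crc.update(buf.array(), 0, buf.position());
        buf.putInt((int) crc.getValue());
        return buf.array();
    }
}
```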

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Automation & Control Theory (AREA)
  • General Engineering & Computer Science (AREA)
  • Computing Systems (AREA)
  • Mathematical Physics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A method of operating a UAV using a user terminal includes configuring, through an interface, a plurality of tasks on the user terminal for execution by the UAV. The plurality of tasks include a flight control task, a gimbal task, and a camera task; and at least two of the flight control task, the gimbal task, and the camera task are configured for parallel execution. The method also includes transmitting data packets containing the flight control task, the gimbal task, and the camera task from the user terminal to the UAV respectively. Each data packet includes a header including source and destination network addresses and an application header including sender information and receiver information, the sender information identifying an application installed on the user terminal and the receiver information identifying a receiver among a flight controller, a gimbal, and a camera of the UAV.

Description

    CROSS-REFERENCES TO RELATED APPLICATIONS
  • This nonprovisional application is a continuation of U.S. patent application Ser. No. 15/713,994, filed on Sep. 25, 2017, which claims the benefit of Provisional Application No. 62/399,854, filed on Sep. 26, 2016, the entire contents of both of which are incorporated herein by reference.
  • COPYRIGHT NOTICE
  • A portion of the disclosure of this patent document contains material which is subject to copyright protection. The copyright owner has no objection to the facsimile reproduction by anyone of the patent document or the patent disclosure, as it appears in the Patent and Trademark Office patent file or records, but otherwise reserves all copyright rights whatsoever.
  • FIELD
  • The disclosed embodiments relate generally to controlling movable objects and more particularly, but not exclusively, to systems and methods for configuring movable objects to perform parallel tasks.
  • BACKGROUND
  • Movable objects, such as unmanned aircraft, can be used in many different fields such as film production, sporting events, disaster relief, geological study, and more. In some cases, movable objects can be manually controlled by a remote operator to accomplish a desired purpose. In other cases, where manual control is cumbersome or impractical, movable objects can be preprogrammed with tasks, or missions, to execute autonomously or semi-autonomously. However, there currently lacks a suitable interface for easily and efficiently configuring a movable object to perform tasks, especially high level tasks, that may require parallel coordination of multiple functional modules of the movable object (for example, flight controls, a gimbal, and a camera of an unmanned aerial vehicle).
  • Accordingly, there is a need for systems and methods that allow for improved configuration of parallel tasks for movable objects.
  • SUMMARY
  • In accordance with a first aspect disclosed herein, there is set forth a method of operating a movable object using a user terminal, comprising: configuring a plurality of tasks on the user terminal for parallel execution by the movable object using an interface; and transmitting the tasks from the user terminal to the movable object for operating the movable object.
  • In accordance with another aspect disclosed herein, there is set forth a system for operating a movable object, comprising: an interface for using a user terminal to configure a plurality of tasks for parallel execution by the movable object; and one or more processors configured to operate the user interface and control transmission of the tasks to the movable object.
  • In accordance with another aspect disclosed herein, there is set forth a non-transitory computer-readable medium with instructions stored thereon that, when executed by a processor, perform the steps comprising: configuring a plurality of tasks on the user terminal for parallel execution by the movable object using an interface; and transmitting the tasks from the user terminal to the movable object for operating the movable object.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is an exemplary top-level block diagram illustrating an embodiment of a movable object operation system in communication with an exemplary movable object.
  • FIG. 2 is an exemplary diagram illustrating an embodiment of a movable object that is an unmanned aerial vehicle (UAV) having a plurality of functional modules.
  • FIG. 3 is an exemplary block diagram illustrating an embodiment of the movable object operation system of FIG. 1 having a user terminal interacting with a movable object, wherein the user terminal includes a movable object manager.
  • FIG. 4 is an exemplary block diagram illustrating an alternative embodiment of the movable object operation system of FIG. 1 having a user terminal interacting with a movable object, wherein the movable object includes a movable object manager.
  • FIG. 5 is an exemplary block diagram illustrating another alternative embodiment of the movable object operation system of FIG. 1, wherein the user terminal is shown as configuring movable object tasks that are transmitted to the movable object.
  • FIG. 6 is an exemplary block diagram illustrating another alternative embodiment of the movable object operation system of FIG. 1, wherein the user terminal is shown as configuring movable object tasks that are transmitted to a UAV for parallel execution.
  • FIG. 7 is an exemplary diagram illustrating an embodiment of the movable object tasks of FIG. 6, wherein the movable object tasks are a parallel custom mission for a UAV.
  • FIG. 8 is an exemplary diagram illustrating another embodiment of the movable object tasks of FIG. 6, wherein the movable object tasks are a parallel custom mission for a UAV.
  • FIG. 9 is an exemplary diagram illustrating an embodiment of a communication protocol suitable for use with the movable object operation system of FIG. 1.
  • FIG. 10 is an exemplary diagram illustrating an embodiment of a data packet suitable for use with the movable object operation system of FIG. 1.
  • FIG. 11 is an exemplary flow chart illustrating an embodiment of a method of configuring a plurality of tasks for parallel execution by a movable object using an interface in the movable object operation system of FIG. 1.
  • It should be noted that the figures are not drawn to scale and that elements of similar structures or functions are generally represented by like reference numerals for illustrative purposes throughout the figures. It also should be noted that the figures are only intended to facilitate the description of the embodiments. The figures do not illustrate every aspect of the described embodiments and do not limit the scope of the present disclosure.
  • DETAILED DESCRIPTION OF THE EMBODIMENTS
  • The present disclosure sets forth systems and methods for configuring a movable object to perform tasks, particularly tasks in parallel, which overcome limitations of prior systems and methods. More particularly, prior systems and methods for configuring movable object tasks (also interchangeably referred to herein as “missions” or “movable object missions”), allow configuring only a single movable object task at a time. The present systems and methods enable configuring multiple movable object tasks at a time in parallel, greatly enhancing the versatility of the movable object for fulfilling a wide variety of movable object needs.
  • Turning now to FIG. 1, an exemplary movable object operation system 10 is shown in accordance with various embodiments of the present systems and methods. The movable object operation system 10 can include a user terminal 100, which can communicate with a movable object 300 via a communication link 200.
  • The user terminal 100 can be used to interact with a user (not shown) to operate the movable object 300 and/or present data collected by the movable object 300 to the user. The user terminal 100 can include, for example, remote controllers (not shown), portable computers, laptops, mobile devices, handheld devices, mobile telephones (for example, smartphones), tablet devices, tablet computers, personal digital assistants, handheld consoles, portable media players, wearable devices (for example, smartwatches and head-mounted displays), and the like.
  • In some embodiments, the user terminal 100 can include one or more applications 110, or application software installed on the user terminal 100. In some embodiments, an application 110 can be configured to invoke an application programming interface (API). The API can be part of a software development kit (SDK). The SDK can advantageously specify functions that are frequently invoked by certain types of applications 110. For example, applications 110 that are used to control a flying movable object 300 can invoke functions in an SDK involving navigation of the movable object 300. Where the user terminal 100 is a mobile device, the application 110 can be colloquially referred to as an “app.” The app can be made available and kept updated by a vendor through a mobile app store.
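As a concrete illustration of this pattern, the sketch below shows an application invoking navigation functions specified by an SDK. Every name in it (NavigationApi, flyTo, and so on) is a hypothetical stand-in, not the API of any actual SDK.

```java
// Minimal sketch of an app calling SDK-specified navigation functions.
// All class and method names are hypothetical illustrations.
interface NavigationApi {
    void takeOff();
    void flyTo(double latitude, double longitude, double altitudeMeters);
    void land();
}

public class InspectionApp {
    public static void main(String[] args) {
        NavigationApi nav = connect("uav-350");
        nav.takeOff();
        nav.flyTo(37.7749, -122.4194, 50.0); // fly to a survey point
        nav.land();
    }

    // Stand-in for the SDK's connection routine; a real SDK would return a
    // live handle to the movable object here.
    private static NavigationApi connect(String movableObjectId) {
        return new NavigationApi() {
            public void takeOff() { System.out.println("take off"); }
            public void flyTo(double lat, double lon, double alt) {
                System.out.printf("fly to (%.4f, %.4f) at %.0f m%n", lat, lon, alt);
            }
            public void land() { System.out.println("land"); }
        };
    }
}
```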
  • As shown in FIG. 1, the user terminal 100 (and/or components thereof) can include one or more processors 120 that can be used to execute the applications 110. The user terminal 100 can include any number of processors 120, as desired. Without limitation, each processor 120 can include one or more general purpose microprocessors (for example, single or multi-core processors), application-specific integrated circuits (ASIC), field-programmable gate arrays (FPGA), application-specific instruction-set processors, digital signal processing units, coprocessors, network processing units, audio processing units, encryption processing units, and the like. In certain embodiments, the processor 120 can include an image processing engine or media processing unit. The processors 120 can be configured to perform any of the methods described herein, including but not limited to a variety of tasks relating to movable object operation and control. In some embodiments, the processors 120 can include specialized software and/or hardware, for example, for processing movable object tasks using an interface.
  • As shown in FIG. 1, the user terminal 100 can include one or more memories 130 (alternatively referred to herein as a non-transient computer readable medium). Suitable memories 130 can include, for example, random access memory (RAM), static RAM, dynamic RAM, read-only memory (ROM), programmable ROM, erasable programmable ROM, electrically erasable programmable ROM, flash memory, secure digital (SD) card, and the like. Instructions for performing any of the methods described herein can be stored in the memory 130. The memory 130 can be placed in operative communication with the processors 120, as desired, and instructions can be transmitted from the memory 130 to the processors 120 for execution, as desired.
  • The user terminal 100 can additionally have one or more input/output devices 140, such as buttons, a keyboard, keypad, trackball, displays, and/or a monitor. Various user interface elements (for example, windows, buttons, menus, icons, pop-ups, tabs, controls, cursors, insertion points, and the like) can be used to present data to and receive data from a user (not shown).
  • The user terminal 100 can be configured to communicate with the movable object 300 via a communication link 200. As shown in FIG. 1, the communication link 200 can include an uplink for transmitting data (such as control data and application data) from the user terminal 100 to the movable object 300, and a downlink for transmitting data (such as telemetry data, application data, image data, and video data) from the movable object 300 to the user terminal. In some embodiments, the uplink and downlink can share a single frequency using time modulation. In other embodiments, the uplink and downlink can use different frequencies.
  • In some embodiments, the communication link 200 can be a wireless communication link 200 over a wireless network. Suitable wireless communications can include, for example, radio, Wireless Fidelity (WiFi), cellular, satellite, and broadcasting. Exemplary suitable wireless communication technologies include, but are not limited to, Global System for Mobile Communications (GSM), General Packet Radio Service (GPRS), Code Division Multiple Access (CDMA), Wideband CDMA (W-CDMA), CDMA2000, IMT Single Carrier, Enhanced Data Rates for GSM Evolution (EDGE), Long-Term Evolution (LTE), LTE Advanced, Time-Division LTE (TD-LTE), High Performance Radio Local Area Network (HiperLAN), High Performance Radio Wide Area Network (HiperWAN), High Performance Radio Metropolitan Area Network (HiperMAN), Local Multipoint Distribution Service (LMDS), Worldwide Interoperability for Microwave Access (WiMAX), ZigBee, Bluetooth, Flash Orthogonal Frequency-Division Multiplexing (Flash-OFDM), High Capacity Spatial Division Multiple Access (HC-SDMA), iBurst, Universal Mobile Telecommunications System (UMTS), UMTS Time-Division Duplexing (UMTS-TDD), Evolved High Speed Packet Access (HSPA+), Time Division Synchronous Code Division Multiple Access (TD-SCDMA), Evolution-Data Optimized (EV-DO), Digital Enhanced Cordless Telecommunications (DECT) and others.
  • In certain embodiments, the communication link 200 can be implemented over a 3G or 4G mobile telecommunications network, such as the UMTS system standardized by the 3rd Generation Partnership Project (3GPP), the W-CDMA radio interface, the TD-SCDMA radio interface, the HSPA+ UMTS release, the CDMA2000 system, EV-DO, EDGE, DECT, Mobile WiMAX, and technologies that comply with the International Mobile Telecommunications Advanced (IMT-Advanced) specification, such as LTE, Mobile WiMAX, and TD-LTE. In other embodiments, the communication link 200 can be implemented over a 5G mobile telecommunications network.
  • In some embodiments, the communication link 200 can advantageously be encrypted to prevent third party intrusion into movable object operations. Suitable encryption methods include, but are not limited to, internet key exchange, Internet Protocol Security (IPsec), Kerberos, point-to-point protocol, transport layer security, SSID hiding, MAC ID filtering, Static IP addressing, 802.11 security, Wired Equivalent Privacy (WEP), Wi-Fi Protected Access (WPA), WPA2, Temporal Key Integrity Protocol (TKIP), Extensible Authentication Protocol (EAP), Lightweight Extensible Authentication Protocol (LEAP), Protected Extensible Authentication Protocol (PEAP), and the like.
  • Suitable movable objects 300 that can be operated using the present systems and methods include, but are not limited to, bicycles, automobiles, trucks, ships, boats, trains, helicopters, aircraft, robotic devices, various hybrids thereof, and the like. In some embodiments, the movable object 300 can be an unmanned aerial vehicle (UAV).
  • Turning now to FIG. 2, an exemplary UAV 350 that is suitable for use with the present systems and methods is shown. Colloquially referred to as "drones," UAVs 350 are aircraft without a human pilot onboard, whose flight is controlled autonomously, by a remote pilot, or both. UAVs 350 are now finding increased usage in civilian applications involving various aerial operations, such as data-gathering or delivery. The present movable object operation systems and methods are suitable for use with many types of UAVs 350 including, without limitation, quadcopters (also referred to as quadrotor helicopters or quadrotors), single rotor, dual rotor, trirotor, hexarotor, and octorotor rotorcraft UAVs 350, fixed wing UAVs 350, and hybrid rotorcraft-fixed wing UAVs 350.
  • The UAV 350 can include one or more functional modules 310, as suitable for the function of the UAV 350. As shown in the exemplary UAV 350 of FIG. 2, the UAV 350 includes a flight control module 311 for controlling flight operations, a gimbal module 312 for precise rotational and/or translational positioning of mounted objects, and a camera module 313 for capturing image and video information from the surroundings of the UAV 350.
  • Turning now to FIG. 3, an exemplary user terminal 100 is shown in relation to an exemplary movable object 300. The user terminal 100 is shown as including a plurality of exemplary applications 110 a, 110 b, and 110 c. Each of the applications 110 a, 110 b, and 110 c can be implemented using an interface 140. In some embodiments, the interface 140 can be an application programming interface (API). The interface 140 can include one or more predefined functions that are called by the applications 110 a, 110 b, and 110 c. For example, the interface 140 can include functions that allow an application 110 to configure one or more movable object tasks 160 (shown in FIG. 5) to be performed by the movable object 300. The movable object task 160 configured using the interface 140 can be a simple task (for example, move the movable object 300 to point A) or a complex task (move the movable object 300 from point A to point B while a camera of the movable object 300 follows and films an external scene of interest).
  • In some embodiments, the movable object task 160 or movable object tasks 160 configured using the interface 140 can entail coordination of multiple functional modules 310 of the movable object 300. The exemplary movable object 300 shown in FIG. 3 is depicted with three functional modules 310 a, 310 b, and 310 c. The interface 140 can further include functions that allow the movable object tasks 160 to be transmitted to the movable object 300. In some embodiments, the interface 140 can include a function that configures a movable object task 160 based on user input, and transmits the configured movable object task 160 to the movable object 300 without further user input.
  • In some embodiments, the movable object tasks 160 can be configured using a movable object manager 150. As shown in FIG. 3, a movable object manager 150 can be used to access and control the movable object 300. The movable object manager 150 can be part of a software development kit (SDK) for supporting development of software applications for the movable object 300. The movable object manager 150 can include various modules, as needed for accessing or controlling the movable object 300. For example, the movable object manager 150 can include a communication manager 151 for managing communication with the movable object 300, and/or a data manager 152 for receiving, sending, and processing data and/or commands in relation to the movable object 300. In some embodiments, the movable object manager 150 can be configured to communicate with an authentication server (not shown) for providing a secure environment for communication between the user terminal 100 and the movable object 300.
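One way the movable object manager 150, communication manager 151, and data manager 152 could be organized is sketched below. The class layout and method names are assumptions for illustration; the patent does not prescribe an implementation.

```java
// Structural sketch of the movable object manager 150 and its submodules.
// Internals are illustrative assumptions only.
class CommunicationManager {            // 151: manages communication with the movable object
    void send(byte[] payload) {
        // Transmit over the communication link 200 (details omitted).
    }
}

class DataManager {                     // 152: receives, sends, and processes data/commands
    byte[] encodeCommand(String command) {
        return command.getBytes(java.nio.charset.StandardCharsets.UTF_8);
    }
}

public class MovableObjectManager {     // 150: entry point for access and control
    private final CommunicationManager comm = new CommunicationManager();
    private final DataManager data = new DataManager();

    public void issueCommand(String command) {
        // Encode the command, then let the communication manager deliver it.
        comm.send(data.encodeCommand(command));
    }
}
```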
  • The movable object manager 150 can be located anywhere that is convenient. In some embodiments, the movable object manager 150 can be installed on the user terminal 100, as shown in the configuration in FIG. 3. This configuration, in which the movable object manager 150 is physically located together with the applications 110 and interface 140, is advantageous in that movable object tasks 160 can be configured with minimal latency. The movable object manager 150 can be equipped with logic for determining when communication with the movable object 300 is necessary, and when such communication is unnecessary.
  • In some embodiments, the movable object manager 150 can be installed on the movable object 300, as shown in the configuration in FIG. 4. This configuration, in which the movable object manager 150 is located with the functional modules 310 a, 310 b, and 310 c of the movable object 300, is advantageous in that high level commands can be issued to the movable object 300 remotely, while the movable object manager 150 can execute these commands while reacting to real-time conditions of the movable object 300. Latency between the movable object manager 150 and the movable object 300 is thereby reduced. In some embodiments, a movable object manager 150 can be installed on both the movable object 300 and the user terminal 100.
  • Turning now to FIG. 5, an exemplary user terminal 100 is shown as having an application 110 that is a movable object operating application 115. The movable object operating application 115 can send data to the movable object 300 to operate the movable object 300. Data sent by the movable object operating application 115 to the movable object 300 include, for example, data to move the movable object 300 (for example, data from a control stick) and data to set parameters of the movable object 300.
• As shown in FIG. 5, the movable object operating application 115 can interface with a movable object task interface 145 to configure one or more movable object tasks 160. In some embodiments, the movable object task interface 145 includes a predefined set of functions, methods, and/or variables. The movable object operating application 115 can call such functions and methods, as well as set one or more variables. The movable object task interface 145 then forms one or more movable object tasks 160 based on input to the movable object task interface 145. The input to the movable object task interface 145 can take any convenient format and can be specified at a high level (for example, perform reconnaissance mission), at a low level (for example, move from point A to point B), or in any combination thereof (for example, perform reconnaissance mission, then return to point A).
• In some embodiments, the movable object tasks 160 can include one or more software objects. In some embodiments, the movable object tasks 160 can be instantiated in software by the movable object operating application 115. In other embodiments, the movable object tasks 160 can be instantiated in software by the movable object task interface 145. After the movable object tasks 160 are configured using the movable object task interface 145, the movable object tasks 160 can be transmitted to the movable object 300 for operating the movable object 300. The movable object task interface 145 can be implemented using any convenient programming language, such as Java, C, C++, Python, and the like.
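• The following is a minimal sketch, under assumed names, of movable object tasks 160 instantiated as software objects by a movable object task interface 145; the MovableObjectTask and add_task names are hypothetical.

    from dataclasses import dataclass, field

    @dataclass
    class MovableObjectTask:              # a movable object task 160 as a software object
        description: str
        parameters: dict = field(default_factory=dict)

    class MovableObjectTaskInterface:     # movable object task interface 145
        def __init__(self):
            self.tasks = []

        def add_task(self, description, **parameters):
            # Input may be high level ("perform reconnaissance mission"),
            # low level ("move from point A to point B"), or a combination.
            task = MovableObjectTask(description, parameters)
            self.tasks.append(task)
            return task

    interface = MovableObjectTaskInterface()
    interface.add_task("perform reconnaissance mission")
    interface.add_task("move from point A to point B", velocity_m_s=5.0)
    print(interface.tasks)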
• In particular, in some embodiments, the movable object task interface 145 can advantageously allow multiple movable object tasks 160 to be executed in parallel. Such a movable object task interface 145 is superior to existing movable object task interfaces that allow only one movable object task to be executed at a time. Allowing specification of multiple parallel movable object tasks 160 affords greater flexibility, versatility, and customizability for the movable object 300. In some embodiments, the movable object tasks 160 constitute a parallel custom mission 165 (shown in FIGS. 7 and 8) of the movable object 300. The parallel custom mission 165 includes one or more movable object tasks 160 for accomplishing a specific objective. The parallel custom mission 165 can advantageously be specified in the user terminal 100 at a high level, without specific knowledge of the functional modules of the movable object 300.
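• As a hedged illustration of parallel execution, two independent tasks might be run concurrently as follows; the thread-pool approach is an assumption made for this sketch and is not the only possible mechanism.

    from concurrent.futures import ThreadPoolExecutor
    import time

    def flight_control_task():
        time.sleep(0.1)                   # stand-in for "move to point A"
        return "flight control task done"

    def camera_task():
        time.sleep(0.1)                   # stand-in for "begin video capture"
        return "camera task done"

    # Submit both tasks so they run in parallel rather than one at a time.
    with ThreadPoolExecutor() as pool:
        futures = [pool.submit(flight_control_task), pool.submit(camera_task)]
        for future in futures:
            print(future.result())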
• Turning now to FIG. 6, an illustrative and non-limiting example is shown of operating a UAV 350 based on movable object tasks 160 that are executed in parallel. The exemplary UAV 350 is shown as having three functional modules: a flight control module 311, a gimbal module 312, and a camera module 313. The movable object tasks 160 can be transmitted to and distributed among appropriate functional modules. For example, the movable object tasks 160 can be separated into flight control tasks 161, gimbal tasks 162, and camera tasks 163 that can occur in parallel, corresponding to the functional modules of the UAV 350.
  • The flight control tasks 161 can include, for example, tasks that control a movement of the movable object (for example, setting target destination, velocity, altitude, attitude (pitch, roll, and yaw), and the like). The gimbal tasks 162 can include, for example, rotating a gimbal to specified position(s) and/or angle(s), or configuring the gimbal to automatically follow a given object of interest. The camera tasks 163 can include, for example, turning a camera of the UAV 350 on and off, setting parameters of the camera (e.g., camera angle, camera mode, photo/video size/resolution, photo/video format, zoom settings, exposure settings, and the like), or instructing the camera to visually follow the object of interest. More generally, the camera tasks 163 can include any control input for data collection instruments of the UAV 350. Exemplary instruments for visual and non-visual data collection on the UAV 350 include, for example, electro-optical sensors, thermal/infrared sensors, color or monochrome sensors, multi-spectral imaging sensors, spectrophotometers, spectrometers, thermometers, illuminometers, microphones/sonic transducers, pressure sensors, altitude sensors, flow sensors, humidity sensors, precipitation sensors, wind speed sensors, wind direction sensors, anemometers, optical rain sensors, and/or others.
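• A minimal sketch of the three task categories follows; the field names and defaults are assumptions chosen to mirror the examples above.

    from dataclasses import dataclass

    @dataclass
    class FlightControlTask:              # flight control task 161, for module 311
        destination: tuple                # (latitude, longitude, altitude)
        velocity_m_s: float = 5.0

    @dataclass
    class GimbalTask:                     # gimbal task 162, for module 312
        pitch_deg: float = 0.0
        yaw_deg: float = 0.0
        follow_object: bool = False

    @dataclass
    class CameraTask:                     # camera task 163, for module 313
        action: str = "start_video"       # e.g. "start_video", "stop_video"
        settings: dict = None             # e.g. resolution, exposure, zoom

    tasks = [
        FlightControlTask(destination=(37.77, -122.42, 30.0)),
        GimbalTask(follow_object=True),
        CameraTask(action="start_video"),
    ]
    print(tasks)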
  • In some embodiments, the movable object tasks 160 can be configured using the movable object manager 150. In some embodiments, the movable object manager 150 can break up the movable object tasks 160 into components that correspond to the functional modules 310 of a movable object 300, and distribute the tasks among the functional modules as appropriate. For example, the movable object manager 150 can distribute movable object tasks 160 among at least one of the flight control module 311, gimbal module 312, and camera module 313 of the UAV 350.
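• The distribution step might be sketched as follows, assuming each task is tagged with the functional module that should receive it; the routing keys are illustrative assumptions.

    def distribute(tasks):
        # Break up the movable object tasks 160 into per-module components.
        modules = {"flight_control": [], "gimbal": [], "camera": []}
        for task in tasks:
            modules[task["module"]].append(task)
        return modules

    mission = [
        {"module": "flight_control", "action": "move_to", "point": "A"},
        {"module": "gimbal", "action": "follow_object"},
        {"module": "camera", "action": "start_video"},
    ]
    for module, assigned in distribute(mission).items():
        print(module, "->", assigned)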
• Turning now to FIG. 7, a parallel custom mission 165 is illustrated using the non-limiting example “move from point A to point B while following and filming an object.” Movable object tasks 160 can be divided into flight control tasks 161, gimbal tasks 162, and camera tasks 163. The tasks can be coordinated in chronological sequence to occur at certain time points. Here, the flight control tasks 161 include two tasks 161 a and 161 b: move to point A, and move to point B at a later time point. The gimbal tasks 162 include one gimbal task 162 a: follow an object 20 of interest when the UAV 350 has moved to point A. The camera tasks 163 include two tasks 163 a and 163 b: begin video capture when the UAV 350 has moved to point A, and end video capture when the UAV 350 has moved to point B. Parallel coordination of the tasks 161, 162, and 163 results in execution of the parallel custom mission 165 using multiple functional modules.
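• The FIG. 7 mission can be written out as trigger-conditioned task records, for example as sketched below; the trigger vocabulary ("at_start", "at_point_A", "at_point_B") is an assumption about how the time points could be encoded.

    # Parallel custom mission 165 of FIG. 7 as a list of task records.
    parallel_custom_mission = [
        {"module": "flight_control", "task": "move_to_point_A", "trigger": "at_start"},    # 161a
        {"module": "flight_control", "task": "move_to_point_B", "trigger": "at_point_A"},  # 161b
        {"module": "gimbal", "task": "follow_object_20", "trigger": "at_point_A"},         # 162a
        {"module": "camera", "task": "begin_video_capture", "trigger": "at_point_A"},      # 163a
        {"module": "camera", "task": "end_video_capture", "trigger": "at_point_B"},        # 163b
    ]
    for record in parallel_custom_mission:
        print(record)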
• Turning now to FIG. 8, another parallel custom mission 165 is illustrated using another, more complex, non-limiting example: “Video capture point A, then move to point A and follow and film object at point A until the UAV reaches point B.” Here, the flight control tasks 161 include three tasks 161 a, 161 b, and 161 c: move to point A, follow an object of interest 20, and end following the object 20 when the UAV 350 reaches point B. The gimbal tasks 162 likewise include three corresponding gimbal tasks 162 a, 162 b, and 162 c: put point A in view of the camera, adjust the gimbal to follow the object 20 when the UAV 350 begins to follow the object 20, and end following the object 20 when the UAV 350 reaches point B. The camera tasks 163 likewise include three corresponding tasks 163 a, 163 b, and 163 c: begin video capture at the time when the UAV 350 has point A in view, focus the camera when the UAV 350 begins to follow the object 20, and end video capture when the UAV 350 has moved to point B. Once again, parallel coordination of the tasks 161, 162, and 163 results in execution of the parallel custom mission 165 using multiple functional modules.
• Turning now to FIG. 9, an exemplary communication protocol 900 for the communication link 200 is shown in accordance with various embodiments of the present systems and methods. The communication protocol 900 can include a data link layer 903, a network layer 902, and an application layer 901. The data link layer 903 can be used, for example, for handling data framing, data verification, and data retransmission. The network layer 902 can be used, for example, for supporting data packet routing and relaying. The application layer 901 can be used, for example, for handling application logic, such as controlling behavior of functional modules of a movable object 300 (shown in FIG. 1).
  • In some embodiments, the communication protocol 900 can support communication between various modules of a movable object 300, such as a flight control module 311, gimbal module 312, camera module 313 (as shown in FIG. 6), and other modules. The communication protocol 900 can be used with different communication link technologies, such as a universal asynchronous receiver/transmitter (UART) technology, a controller area network (CAN) technology, and an inter-integrated circuit (I2C) technology.
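• The layered wrapping might be sketched as follows; the concrete framing (address bytes, length prefix, XOR checksum) is an assumption made for illustration and is not dictated by the communication protocol 900 itself.

    def application_layer(command: str) -> bytes:
        # Application layer 901: application logic expressed as a payload.
        return command.encode("utf-8")

    def network_layer(payload: bytes, src: int, dst: int) -> bytes:
        # Network layer 902: prepend routing/relaying addresses.
        return bytes([src, dst]) + payload

    def data_link_layer(packet: bytes) -> bytes:
        # Data link layer 903: framing plus a verification byte.
        checksum = 0
        for byte in packet:
            checksum ^= byte
        return len(packet).to_bytes(2, "big") + packet + bytes([checksum])

    frame = data_link_layer(network_layer(application_layer("gimbal: pitch -30"), src=1, dst=3))
    print(frame.hex())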
  • Turning now to FIG. 10, an exemplary illustration of a packet 1000 is shown that can be used in a communication link 200 (shown in FIG. 1). As shown in FIG. 10, the packet 1000 can include a header 1001, an application header 1002, data 1003, and a tail 1004. The header 1001 and tail 1004 can include, for example, control information that a network needs for delivering the data 1003. For example, the control information can include source and destination network addresses, error detection codes, and packet sequencing information. The application header 1002 can include, for example, various sender and receiver information. For example, the sender and receiver can be among different modules of the movable object 300 and applications 110 on the user terminal 100 (shown in FIG. 1).
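• A hypothetical byte layout for such a packet is sketched below; the field widths and identifier values are assumptions, since FIG. 10 specifies the ordering of the parts but not their sizes.

    import struct

    def build_packet(src_addr, dst_addr, seq, sender_id, receiver_id, data: bytes) -> bytes:
        header = struct.pack(">BBH", src_addr, dst_addr, seq)    # header 1001: addresses, sequencing
        app_header = struct.pack(">BB", sender_id, receiver_id)  # application header 1002
        tail = struct.pack(">H", sum(data) & 0xFFFF)             # tail 1004: error detection code
        return header + app_header + data + tail                 # data 1003 sits between

    packet = build_packet(src_addr=1, dst_addr=2, seq=7,
                          sender_id=0x10,      # e.g. an application 110 on the user terminal 100
                          receiver_id=0x21,    # e.g. the camera module 313
                          data=b"start_video")
    print(packet.hex())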
• In accordance with the above, FIG. 11 shows an exemplary method 1100 for operating a movable object 300 using a user terminal 100. At 1110, a plurality of movable object tasks 160 are configured on a user terminal 100 through an interface 140 for parallel execution by the movable object 300. The interface 140 can be a movable object task interface 145, as described above with reference to FIG. 5. The movable object tasks 160 can constitute a parallel custom mission 165 of a UAV 350 (shown in FIGS. 7 and 8) that is configured on the user terminal 100 for execution by the UAV 350. In some embodiments, the configuring can use an interface 140 that is an application programming interface installed on the user terminal 100. In some embodiments, the configuring can use an interface 140 that is part of a software development kit (SDK) installed on the user terminal 100.
• The configuring of the movable object tasks 160 can be based on various trigger events. For example, the movable object tasks 160 can be configured to occur at a predetermined time (for example, absolute time or time relative to an event of the movable object 300, such as takeoff or landing). The movable object tasks 160 can further be configured to occur when the movable object 300 reaches a predetermined location. The location can be specified in absolute coordinates or relative coordinates (for example, relative to the starting location of the movable object 300). The trigger event can also be, for example, a completion of one or more other tasks by the movable object 300, such as to provide sequential execution of movable object tasks 160. The trigger event can also be, for example, recognition by the movable object 300 of an object 20 (shown in FIGS. 7 and 8 with respect to an exemplary UAV 350) of interest within an environment of the movable object 300.
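• The four trigger-event kinds might be encoded as predicates over the movable object's state, as in the following sketch; the state keys and trigger names are assumptions.

    def make_trigger(kind, **kw):
        if kind == "time":
            return lambda state: state["t"] >= kw["at"]                    # absolute or relative time
        if kind == "location":
            return lambda state: state["position"] == kw["point"]          # absolute or relative location
        if kind == "completion":
            return lambda state: kw["task_id"] in state["completed"]       # sequential execution
        if kind == "recognition":
            return lambda state: kw["object_id"] in state["recognized"]    # object of interest seen
        raise ValueError(kind)

    state = {"t": 12.0, "position": "A", "completed": {"161a"}, "recognized": {"object_20"}}
    print(make_trigger("completion", task_id="161a")(state))   # True once task 161a has finished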
• At 1120, the movable object tasks 160 are transmitted from the user terminal 100 to the movable object 300 for operating the movable object 300. In some embodiments, the transmission can take place using wireless communication. Various wireless communication protocols can be used, as described above with reference to FIG. 1. In some embodiments, the movable object tasks 160 can be transmitted to the movable object 300 together after the configuration of all of the movable object tasks 160 is completed. In some embodiments, the movable object tasks 160 can be transmitted to the movable object 300 in one data packet, such as the packet 1000 shown in FIG. 10.
  • The disclosed embodiments are susceptible to various modifications and alternative forms, and specific examples thereof have been shown by way of example in the drawings and are herein described in detail. It should be understood, however, that the disclosed embodiments are not to be limited to the particular forms or methods disclosed, but to the contrary, the disclosed embodiments are to cover all modifications, equivalents, and alternatives.

Claims (20)

What is claimed is:
1. A method of operating an unmanned aerial vehicle (UAV) using a user terminal, comprising:
configuring, through an interface, a plurality of tasks on the user terminal for execution by the UAV, wherein the plurality of tasks include a flight control task, a gimbal task, and a camera task; and at least two of the flight control task, the gimbal task, and the camera task are configured for parallel execution; and
transmitting data packets containing the flight control task, the gimbal task, and the camera task from the user terminal to the UAV respectively, wherein each data packet includes a header including source and destination network addresses and an application header including sender information and receiver information, the receiver information identifying a receiver among a flight controller, a gimbal, and a camera of the UAV.
2. The method of claim 1, wherein the configuring comprises configuring a parallel custom mission on the user terminal for execution by the UAV.
3. The method of claim 1, wherein the configuring comprises configuring at least one of the tasks for parallel execution by the UAV at least one of:
at a predetermined time,
at a predetermined location of the UAV,
based on completion of one or more other tasks by the UAV, or
based on recognition by the UAV of an object of interest within an environment of the UAV.
4. The method of claim 1, wherein the configuring comprises configuring the UAV to follow an object of interest.
5. The method of claim 4, wherein the configuring comprises configuring a camera of the UAV to visually follow the object of interest.
6. The method of claim 1, wherein the configuring comprises configuring the plurality of tasks using a movable object manager to distribute the tasks among components of the UAV.
7. The method of claim 6, wherein the configuring comprises configuring the plurality of tasks using the movable object manager that is installed on the user terminal or on the UAV.
8. The method of claim 1, wherein the transmitting comprises transmitting the plurality of tasks from the user terminal to the UAV using wireless communication.
9. The method of claim 1, wherein the parallel execution of a gimbal task and a camera task comprises:
at a first time point, controlling the gimbal to direct the camera configured on the gimbal to put a first spatial point in view of the camera, and controlling the camera to start a video capturing process.
10. The method of claim 1, wherein the parallel execution of a flight task, a gimbal task, and a camera task comprises: after the UAV arrives at a first spatial point,
controlling a flight path of the UAV to follow a moving target object;
controlling the gimbal to direct the camera to the moving target object; and
controlling the camera to focus on the target object in a video capture process.
11. The method of claim 10, wherein the parallel execution of a flight task, a gimbal task, and a camera task comprises: after the UAV arrives at a second spatial point,
controlling the UAV to end following the moving target object;
controlling the gimbal to stop directing the camera to follow the moving target object; and
controlling the camera to stop the video capture process.
12. A system for operating an unmanned aerial vehicle (UAV), comprising:
an interface for use on a user terminal to configure a plurality of tasks for execution by the UAV, wherein the plurality of tasks include a flight control task, a gimbal task, and a camera task; and at least two of the flight control task, the gimbal task, and the camera task are configured for parallel execution; and
one or more processors configured to operate the interface and control transmission of the tasks to the UAV, including: transmitting data packets containing the flight control task, the gimbal task, and the camera task to the UAV respectively, wherein each data packet includes a header including source and destination network addresses and an application header including sender information and receiver information, the receiver information identifying a receiver among a flight controller, a gimbal, and a camera of the UAV.
13. The system of claim 12, wherein the interface is for configuring a parallel custom mission on the user terminal for execution by the UAV.
14. The system of claim 12, wherein the interface is for configuring at least one of the tasks for parallel execution by the UAV at least one of:
at a predetermined time,
at a predetermined location of the UAV,
based on completion of one or more other tasks by the UAV, or
based on recognition by the UAV of an object of interest within an environment of the UAV.
15. The system of claim 12, wherein the interface is for configuring the UAV to follow an object of interest.
16. The system of claim 15, wherein the interface is for configuring a camera of the UAV to visually follow the object of interest.
17. The system of claim 12, wherein the parallel execution of a gimbal task and a camera task comprises:
at a first time point, controlling the gimbal to direct the camera configured on the gimbal to put a first spatial point in view of the camera, and controlling the camera to start a video capturing process.
18. The system of claim 12, wherein the parallel execution of a flight task, a gimbal task, and a camera task comprises: after the UAV arrives at a first spatial point,
controlling a flight path of the UAV to follow a moving target object;
controlling the gimbal to direct the camera to the moving target object; and
controlling the camera to focus on the target object in a video capture process.
19. The system of claim 18, wherein the parallel execution of a flight task, a gimbal task, and a camera task comprises: after the UAV arrives at a second spatial point,
controlling the UAV to end following the moving target object;
controlling the gimbal to stop directing the camera to follow the moving target object; and
controlling the camera to stop the video capture process.
20. A non-transitory computer-readable medium with instructions stored thereon that, when executed by a processor, perform a method comprising:
configuring, through an interface, a plurality of tasks on a user terminal for execution by a UAV, wherein the plurality of tasks include a flight control task, a gimbal task, and a camera task; and at least two of the flight control task, the gimbal task, and the camera task are configured for parallel execution; and
transmitting data packets containing the flight control task, the gimbal task, and the camera task from the user terminal to the UAV respectively, wherein each data packet includes a header including source and destination network addresses and an application header including sender information and receiver information, the receiver information identifying a receiver among a flight controller, a gimbal, and a camera of the UAV.
US17/209,104 | Priority date: 2016-09-26 | Filing date: 2021-03-22 | System and method for movable object control | Abandoned | US20210294327A1 (en)

Priority Applications (1)

Application Number | Publication | Priority Date | Filing Date | Title
US17/209,104 | US20210294327A1 (en) | 2016-09-26 | 2021-03-22 | System and method for movable object control

Applications Claiming Priority (3)

Application Number | Publication | Priority Date | Filing Date | Title
US201662399854P | | 2016-09-26 | 2016-09-26 |
US15/713,994 | US10955838B2 (en) | 2016-09-26 | 2017-09-25 | System and method for movable object control
US17/209,104 | US20210294327A1 (en) | 2016-09-26 | 2021-03-22 | System and method for movable object control

Related Parent Applications (1)

Application Number | Relation | Publication | Priority Date | Filing Date | Title
US15/713,994 | Continuation | US10955838B2 (en) | 2016-09-26 | 2017-09-25 | System and method for movable object control

Publications (1)

Publication Number | Publication Date
US20210294327A1 (en) | 2021-09-23

Family

ID=60988435

Family Applications (2)

Application Number | Status | Publication | Priority Date | Filing Date | Title
US15/713,994 | Active 2038-03-22 | US10955838B2 (en) | 2016-09-26 | 2017-09-25 | System and method for movable object control
US17/209,104 | Abandoned | US20210294327A1 (en) | 2016-09-26 | 2021-03-22 | System and method for movable object control

Family Applications Before (1)

Application Number | Status | Publication | Priority Date | Filing Date | Title
US15/713,994 | Active 2038-03-22 | US10955838B2 (en) | 2016-09-26 | 2017-09-25 | System and method for movable object control

Country Status (1)

Country Link
US (2) US10955838B2 (en)

Also Published As

Publication Number | Publication Date
US10955838B2 (en) | 2021-03-23
US20180024547A1 (en) | 2018-01-25

Legal Events

Code | Title | Description
STPP | Information on status: patent application and granting procedure in general | Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION
STCB | Information on status: application discontinuation | Free format text: EXPRESSLY ABANDONED -- DURING EXAMINATION