WO2022141187A1 - Systems and methods for controlling an unmanned aerial vehicle using a body-attached remote control - Google Patents

Systems and methods for controlling an unmanned aerial vehicle using a body-attached remote control

Info

Publication number
WO2022141187A1
Authority
WO
WIPO (PCT)
Prior art keywords
value, movable object, pitch, user, roll
Application number
PCT/CN2020/141370
Other languages
French (fr)
Inventor
Zhicong HUANG
Timothée PETER
Original Assignee
SZ DJI Technology Co., Ltd.
Application filed by SZ DJI Technology Co., Ltd.
Priority to PCT/CN2020/141370
Publication of WO2022141187A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/0011 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement
    • G05D1/0016 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement characterised by the operator's input device
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/0011 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement
    • G05D1/0033 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement by having the operator tracking the vehicle either by direct line of sight or via one or more cameras located remotely from the vehicle
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/0094 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots involving pointing a payload, e.g. camera, weapon, sensor, towards a fixed or moving target

Definitions

  • the disclosed embodiments relate generally to unmanned aerial vehicle (UAV) technology, and more specifically, to systems and methods for controlling a UAV using a body-attached controller device.
  • Movable objects can be used for performing surveillance, reconnaissance, and exploration tasks for military and civilian applications.
  • An unmanned aerial vehicle (UAV) (e.g., a drone) is an example of a movable object.
  • a movable object may carry a payload for performing specific functions such as capturing images and video of a surrounding environment of the movable object or for tracking a target object.
  • a movable object may track a target object that is stationary, or moving on the ground or in the air. Movement control information for controlling a movable object is typically received by the movable object via a remote device (e.g., a controller device) and/or determined by the movable object.
  • Typically, a user who intends to capture images and/or video of a specific target object using UAV aerial photography and videography technology controls the UAV to fly toward the target object, and/or provides the UAV with instructions, such as positional information of the target object, so that the UAV can execute a flight toward the target object.
  • a user uses a remote controller device (e.g., a drone controller device) that includes input button (s) , joystick (s) , or a combination thereof, to input commands to control a flight of the UAV, such as to adjust a pitch, roll, yaw, and/or throttle of the UAV.
  • the controller device is a handheld device that requires input from both hands of a user.
  • the controller device may include a left input (e.g., a left joystick) that enables the user to control a roll and pitch of the UAV.
  • the controller device may also include a right input (e.g., a right joystick) that enables the user to control a yaw and throttle of the UAV.
  • the controller device may also include additional buttons and/or knobs that provide additional features and/or options to adjust a sensitivity of the pitch, roll, yaw, and/or throttle inputs.
  • the UAV is also coupled to a camera (e.g., an imaging device) via a gimbal (e.g., a carrier) .
  • the user also controls a corresponding pitch/roll/yaw of the gimbal to adjust the camera to an optimal position for aerial photography and videography.
  • the numerous input controls of the UAV, and the combination of flight, gimbal, and camera parameters that need to be adjusted, make the operation of the UAV extremely cumbersome.
  • a controller device comprises a handheld device that requires inputs from one hand of the user to control a UAV.
  • a user can control a UAV by pointing the controller device toward the UAV, and/or by holding the controller device in the user’s hand and moving (e.g., gesturing) with the controller device in-hand.
  • the controller device detects (e.g., senses and measures) the movement (e.g., user movement) in three dimensions (e.g., in the x-, y-, and z-axes) , and maps (e.g., directly maps) the movement and/or rotation of the controller device to a corresponding movement and/or rotation of the UAV.
  • the controller device will detect all movements of the user as long as the user is holding the controller device, regardless of whether the user movements are actually user instructions that are directed to controlling the UAV.
  • a user may want to control the UAV via the controller device by moving one or more fingers of the user and/or rotating a wrist of the user.
  • the user may also (e.g., naturally, inherently, unintentionally, or inadvertently) move other portions of the user’s body, such as moving (e.g., swinging or rotating) an elbow and/or an arm.
  • the controller device may detect a combined movement from the user (e.g., from both fingers/wrist and elbow/arm) and map the combined user movement to a corresponding movement and/or rotation of the UAV.
  • natural (e.g., unintentional) movements from other portions of the user’s body can sometimes lead to an exaggeration (e.g., over-amplification) of the actual user instruction to control the UAV.
  • the movements from other portions of the user’s body can also counteract (e.g., reduce) the actual user instruction.
  • In some embodiments, the controller device includes a first component and a second component. The first component is a handheld component.
  • the second component is a wearable component that is worn on a wrist, forearm, arm, etc. of a user.
  • the first component includes a first sensor.
  • the second component includes a second sensor.
  • the first sensor detects user interaction (s) (e.g., movements and/or rotations) with the first component and maps the user interaction (s) to a corresponding movement and/or rotation of a UAV.
  • the user interactions are mapped to (e.g., translated into) a first roll value, a first pitch value, and/or a first yaw value.
  • the second sensor detects user movements that may arise due to (e.g., because of) the user interaction (s) with the first component.
  • the user movements are mapped to (e.g., translated into) a second roll value, a second pitch value, and/or a second yaw value.
  • the electronic device determines an adjusted roll, pitch, and/or yaw value that is based on a combination of the first roll value, the first pitch value, the first yaw value, the second roll value, the second pitch value, and/or the second yaw value, and transmits to the UAV one or more parameters that include the adjusted roll, pitch, and/or yaw value (s) , to control a flight of the UAV.
  • the disclosed systems, devices, and methods take into account the effects of unintended user movement.
  • the disclosed systems, devices, and methods also transmit to the UAV adjusted roll, pitch, and/or yaw value (s) that more accurately reflect the user instructions. Accordingly, the user experience is enhanced.
  • a method is performed at an electronic device that is communicatively connected with an unmanned aerial vehicle (UAV) .
  • the electronic device includes an input interface, a first sensor, a second sensor, one or more processors, and memory.
  • the memory stores one or more instructions for execution by the one or more processors.
  • the electronic device detects a first user movement from the first sensor attached to a first body portion of a user.
  • the electronic device detects a second user movement from the second sensor attached to a second body portion of the user.
  • the first user movement represents a user instruction to control a flight of the UAV.
  • the second body portion is connected to the first body portion.
  • the electronic device determines one or more parameters associated with the user instruction to control the flight of the UAV based on an interaction between the first user movement and the second user movement.
  • the electronic device transmits to the UAV a wireless signal that is based on the one or more parameters.
  • the UAV is configured to adjust the flight of the UAV in accordance with the one or more parameters.
  • adjusting the flight of the UAV comprises adjusting a pitch, roll, and/or yaw of the UAV.
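  • For purely illustrative purposes, the method described above can be sketched as a simple control loop. In the Python sketch below, the helper names (read_first, read_second, determine_parameters, send_params) and the 50 Hz update rate are hypothetical placeholders and not part of the disclosure; one possible determine_parameters is sketched after the weighted-combination description below.

```python
# Illustrative sketch only: sensor and radio access are passed in as callables
# because the actual hardware interfaces are not specified in this document.
import time
from typing import Callable, Dict

Attitude = Dict[str, float]  # keys: "pitch", "roll", "yaw"


def control_loop(read_first: Callable[[], Attitude],
                 read_second: Callable[[], Attitude],
                 determine_parameters: Callable[[Attitude, Attitude], Attitude],
                 send_params: Callable[[Attitude], None],
                 rate_hz: float = 50.0) -> None:
    """Detect both user movements, derive flight parameters, and transmit them."""
    period = 1.0 / rate_hz
    while True:
        first = read_first()    # first user movement (hand-held component)
        second = read_second()  # second user movement (connected body portion)
        params = determine_parameters(first, second)  # interaction of the two movements
        send_params(params)     # wireless signal based on the one or more parameters
        time.sleep(period)
```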
  • the electronic device includes a first component and a second component that is communicatively connected with the first component.
  • the user instruction to control a flight of the UAV comprises a first pitch value, a first roll value, and/or a first yaw value.
  • the method further comprises determining a second pitch value, a second roll value, and/or a second yaw value based on the second user movement from the second sensor.
  • determining the one or more parameters comprises determining one or more of: an adjusted pitch value based on a combination of the first pitch value and the second pitch value; an adjusted roll value based on a combination of the first roll value and the second roll value; and an adjusted yaw value based on a combination of the first yaw value and the second yaw value.
  • determining the one or more parameters comprises determining a weighted combination that includes a plurality of the first pitch value, the first roll value, the first yaw value, the second pitch value, the second roll value, and the second yaw value.
  • the weighted combination comprises a weighted combination of the first pitch value and the second pitch value.
  • the weighted combination comprises a weighted combination of the first roll value and the second roll value.
  • the weighted combination comprises a weighted combination of the first yaw value and the second yaw value.
  • the method further comprises: prior to determining the weighted combination: assigning respective first weights to the first pitch value, the first roll value, and/or the first yaw value; and assigning respective second weights to the second pitch value, the second roll value, and/or the second yaw value.
  • the weighted combination is further determined based on the respective assigned first weights and the respective assigned second weights.
  • At least one of the first weights has a value of zero.
  • At least one of the second weights has a value of zero.
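  • As a minimal, non-limiting sketch of the weighted combination described above (the particular weights, including the zero weight and the negative sign used to subtract unintended motion, are assumptions chosen only for illustration):

```python
from typing import Dict, Optional

Attitude = Dict[str, float]  # keys: "pitch", "roll", "yaw"


def determine_parameters(first: Attitude,
                         second: Attitude,
                         first_weights: Optional[Attitude] = None,
                         second_weights: Optional[Attitude] = None) -> Attitude:
    """Combine per-axis values from the two sensors using assigned weights."""
    if first_weights is None:
        first_weights = {"pitch": 1.0, "roll": 1.0, "yaw": 1.0}
    if second_weights is None:
        # Negative weights subtract the second body portion's unintended motion;
        # a zero weight ignores that axis of the second sensor entirely.
        second_weights = {"pitch": -0.5, "roll": 0.0, "yaw": -0.5}
    return {axis: first_weights[axis] * first[axis] + second_weights[axis] * second[axis]
            for axis in ("pitch", "roll", "yaw")}


# Example: 10 degrees of commanded pitch with 4 degrees of unintended arm pitch
# yields an adjusted pitch of 10 + (-0.5 * 4) = 8 degrees.
adjusted = determine_parameters({"pitch": 10.0, "roll": 0.0, "yaw": 2.0},
                                {"pitch": 4.0, "roll": 1.0, "yaw": 0.0})
```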
  • the electronic device includes a first component and a second component that is mechanically coupled to the first component via a link.
  • the link comprises a first end that is rotatably coupled to the first component.
  • the link also comprises a second end that is rotatably coupled to the second component.
  • the first component includes the input interface.
  • the electronic device further includes a third sensor.
  • the first sensor is positioned on the first component.
  • the second sensor is positioned on the second end of the link.
  • the third sensor is positioned on the first end of the link.
  • the second sensor is configured to measure respective rotation angles at the second end.
  • the third sensor is configured to measure respective rotation angles at the first end.
  • the respective rotation angles at the first end include two or more of: a first pitch angle, a first roll angle, and/or a first yaw angle.
  • the respective rotation angles at the second end include two or more of: a second pitch angle, a second roll angle, and/or a second yaw angle.
  • determining one or more parameters of a command comprises determining a combined rotation angle based on a combination of one or more of: the first pitch angle and the second pitch angle; the first roll angle and the second roll angle; and the first yaw angle and the second yaw angle.
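  • As an illustrative sketch of combining the rotation angles measured at the two ends of the link (simple per-axis addition is only one possible reading of "combination" and is assumed here for clarity):

```python
from typing import Dict

Angles = Dict[str, float]  # keys: "pitch", "roll", "yaw" (degrees)


def combined_rotation(first_end: Angles, second_end: Angles) -> Angles:
    """first_end: angles from the third sensor at the first end of the link;
    second_end: angles from the second sensor at the second end of the link."""
    return {axis: first_end.get(axis, 0.0) + second_end.get(axis, 0.0)
            for axis in ("pitch", "roll", "yaw")}


# Example: 12 degrees of pitch at the first end and -3 degrees at the second end
# give a combined pitch angle of 9 degrees.
angles = combined_rotation({"pitch": 12.0, "roll": 0.0, "yaw": 5.0},
                           {"pitch": -3.0, "roll": 0.0, "yaw": 1.0})
```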
  • the first sensor is an inertial measurement unit sensor.
  • an electronic device comprises an input interface, a first sensor, a second sensor, one or more processors, and memory.
  • the memory stores one or more programs configured for execution by the one or more processors.
  • the one or more programs include instructions for performing any of the methods described herein.
  • a non-transitory computer-readable storage medium stores one or more programs configured for execution by an electronic device having an input interface, a first sensor, a second sensor, one or more processors, and memory.
  • the one or more programs include instructions for performing any of the methods described herein.
  • Figure 1 illustrates an exemplary target identification and tracking system according to some embodiments.
  • Figures 2A to 2C illustrate respectively, an exemplary movable object, an exemplary carrier of a movable object, and an exemplary payload of a movable object according to some embodiments.
  • Figure 3 illustrates an exemplary sensing system of a movable object according to some embodiments.
  • Figures 4A and 4B illustrate a block diagram of an exemplary memory of a movable object according to some embodiments.
  • Figure 5 illustrates an exemplary control unit of a target tracking system according to some embodiments.
  • Figure 6 illustrates an exemplary computing device for controlling a movable object according to some embodiments.
  • Figures 7A and 7B illustrate an exemplary configuration of a movable object, a carrier, and a payload according to some embodiments.
  • Figure 8 illustrates an exemplary operating environment according to some embodiments.
  • Figure 9 is a block diagram illustrating a representative electronic device according to some embodiments.
  • Figure 10 illustrates an electronic device according to some embodiments.
  • Figure 11 illustrates an electronic device according to some embodiments.
  • Figures 12A and 12B illustrate representative views of an electronic device according to some embodiments.
  • Figures 13A-13C illustrate a flowchart for a method performed at an electronic device according to some embodiments.
  • UAVs may include, for example, fixed-wing aircraft and/or rotary-wing aircraft such as helicopters, quadcopters, and aircraft having other numbers and/or configurations of rotors. It will be apparent to those skilled in the art that other types of movable objects may be substituted for UAVs as described below in accordance with embodiments of the invention.
  • FIG. 1 illustrates an exemplary target identification and tracking system 100 according to some embodiments.
  • the target identification and tracking system 100 includes a movable object 102 (e.g., a UAV) and a control unit 104.
  • the movable object 102 is also referred to as a movable device (e.g., a movable electronic device) .
  • the target identification and tracking system 100 is used for identifying a target object 106 and/or for initiating tracking of the target object 106.
  • the target object 106 includes natural and/or man-made objects, such as geographical landscapes (e.g., mountains, vegetation, valleys, lakes, and/or rivers), buildings, and/or vehicles (e.g., aircraft, ships, cars, trucks, buses, vans, and/or motorcycles).
  • the target object 106 includes live subjects such as people and/or animals.
  • the target object 106 is a moving object, e.g., moving relative to a reference frame (such as the Earth and/or movable object 102) .
  • the target object 106 is static.
  • the target object 106 includes an active positioning and navigational system (e.g., a GPS system) that transmits information (e.g., location, positioning, and/or velocity information) about the target object 106 to the movable object 102, a control unit 104, and/or a computing device 126.
  • information may be transmitted to the movable object 102 via wireless communication from a communication unit of the target object 106 to a communication system 120 of the movable object 102, as illustrated in Figure 2A.
  • the movable object 102 includes a carrier 108 and/or a payload 110.
  • the carrier 108 is used to couple the payload 110 to the movable object 102.
  • the carrier 108 includes an element (e.g., a gimbal and/or damping element) to isolate the payload 110 from movement of the movable object 102.
  • the carrier 108 includes an element for controlling movement of the payload 110 relative to the movable object 102.
  • the payload 110 is coupled (e.g., rigidly coupled) to the movable object 102 (e.g., coupled via the carrier 108) such that the payload 110 remains substantially stationary relative to movable object 102.
  • the carrier 108 may be coupled to the payload 110 such that the payload is not movable relative to the movable object 102.
  • the payload 110 is mounted directly to the movable object 102 without requiring the carrier 108.
  • the payload 110 is located partially or fully within the movable object 102.
  • the movable object 102 is configured to communicate with the control unit 104, e.g., via wireless communications 124.
  • the movable object 102 may receive control instructions from the control unit 104 (e.g., via a user of the movable object 102) and/or send data (e.g., data from a movable object sensing system 122, Figure 2A) to the control unit 104.
  • control instructions may include, e.g., navigation instructions for controlling one or more navigational parameters of the movable object 102 such as a position, an orientation, an altitude, an attitude (e.g., aviation) and/or one or more movement characteristics of the movable object 102.
  • control instructions may include instructions for controlling one or more parameters of a carrier 108 and/or a payload 110.
  • the control instructions include instructions for directing movement of one or more of movement mechanisms 114 ( Figure 2A) of the movable object 102.
  • the control instructions may be used to control a flight of the movable object 102.
  • control instructions may include information for controlling operations (e.g., movement) of the carrier 108.
  • control instructions may be used to control an actuation mechanism of the carrier 108 so as to cause angular and/or linear movement of the payload 110 relative to the movable object 102.
  • control instructions are used to adjust one or more operational parameters for the payload 110, such as instructions for capturing one or more images, capturing video, adjusting a zoom level, powering on or off a component of the payload, adjusting an imaging mode (e.g., capturing still images or capturing video) , adjusting an image resolution, adjusting a focus, adjusting a viewing angle, adjusting a field of view, adjusting a depth of field, adjusting an exposure time, adjusting a shutter speed, adjusting a lens speed, adjusting an ISO, changing a lens and/or moving the payload 110 (and/or a part of payload 110, such as imaging device 214 (shown in Figure 2C) ) .
  • the control instructions are used to control the communication system 120, the sensing system 122, and/or another component of the movable object 102.
  • control instructions from the control unit 104 may include instructions to initiate tracking of a target object 106.
  • the control instructions may include information about the target object 106, such as identification of the target object 106, a location of the target object 106, a time duration during which the target object 106 is to be tracked, and/or other information.
  • the movable object 102 identifies and initiates tracking in accordance with the instructions.
  • the movable object 102 may receive another set of instructions from the control unit 104 (e.g., via the user) to stop tracking the target object 106.
  • the movable object 102 may pause or stop tracking the target object 106 when the target object 106 is no longer present (or visible) in the field of view of the movable object 102 after a certain time period (e.g., 5 minutes or 10 minutes) . In some embodiments, after the tracking has been paused or stopped, the movable object 102 may receive further instructions to resume tracking the target object 106.
  • the movable object 102 is configured to communicate with a computing device 126 (e.g., an electronic device, a computing system, and/or a server system) .
  • the movable object 102 receives control instructions from the computing device 126 and/or sends data (e.g., data from the movable object sensing system 122) to the computing device 126.
  • communications from the computing device 126 to the movable object 102 are transmitted from computing device 126 to a cell tower 130 (e.g., via internet 128 and/or other cellular networks such as 4G and 5G networks) and from the cell tower 130 to the movable object 102 (e.g., via RF signals) .
  • a satellite is used in lieu of or in addition to cell tower 130.
  • the target identification and tracking system 100 includes additional control units 104 and/or computing devices 126 that are configured to communicate with the movable object 102.
  • Figure 2A illustrates an exemplary movable object 102 according to some embodiments.
  • the movable object 102 includes processor (s) 116, memory 118, a communication system 120, a sensing system 122, a clock 152, and radio (s) 154, which are connected by data connections such as a control bus 112.
  • the control bus 112 optionally includes circuitry (sometimes called a chipset) that interconnects and controls communications between system components.
  • the movable object 102 is a UAV and includes components to enable flight and/or flight control. Although the movable object 102 is depicted as an aircraft in this example, this depiction is not intended to be limiting, and any suitable type of movable object may be used.
  • the movable object 102 includes movement mechanisms 114 (e.g., propulsion mechanisms) .
  • movement mechanisms 114 may refer to a single movement mechanism (e.g., a single propeller) or multiple movement mechanisms (e.g., multiple rotors) .
  • the movement mechanisms 114 may include one or more movement mechanism types such as rotors, propellers, blades, engines, motors, wheels, axles, magnets, and nozzles.
  • the movement mechanisms 114 are coupled to the movable object 102 at, e.g., the top, bottom, front, back, and/or sides.
  • the movement mechanisms 114 of a single movable object 102 may include multiple movement mechanisms each having the same type. In some embodiments, the movement mechanisms 114 of a single movable object 102 include multiple movement mechanisms with different movement mechanism types.
  • the movement mechanisms 114 are coupled to the movable object 102 using any suitable means, such as support elements (e.g., drive shafts) or other actuating elements (e.g., one or more actuators 132) .
  • the actuator 132 (e.g., a movable object actuator) receives control signals from the processor(s) 116 (e.g., via the control bus 112) that activate the actuator 132 to cause movement of a movement mechanism 114.
  • the processor (s) 116 include an electronic speed controller that provides control signals to the actuators 132.
  • the movement mechanisms 114 enable the movable object 102 to take off vertically from a surface or land vertically on a surface without requiring any horizontal movement of the movable object 102 (e.g., without traveling down a runway) .
  • the movement mechanisms 114 are operable to permit the movable object 102 to hover in the air at a specified position and/or orientation.
  • one or more of the movement mechanisms 114 are controllable independently of one or more of the other movement mechanisms 114. For example, when the movable object 102 is a quadcopter, each rotor of the quadcopter is controllable independently of the other rotors of the quadcopter.
  • multiple movement mechanisms 114 are configured for simultaneous movement.
  • the movement mechanisms 114 include multiple rotors that provide lift and/or thrust to the movable object 102.
  • the multiple rotors are actuated to provide, e.g., vertical takeoff, vertical landing, and hovering capabilities to the movable object 102.
  • one or more of the rotors spin in a clockwise direction, while one or more of the rotors spin in a counterclockwise direction.
  • the number of clockwise rotors is equal to the number of counterclockwise rotors.
  • the rotation rate of each of the rotors is independently variable, e.g., for controlling the lift and/or thrust produced by each rotor, and thereby adjusting the spatial disposition, velocity, and/or acceleration of the movable object 102 (e.g., with respect to up to three degrees of translation and/or up to three degrees of rotation) .
  • the memory 118 stores one or more instructions, programs (e.g., sets of instructions) , modules, controlling systems and/or data structures, collectively referred to as “elements” herein.
  • One or more elements described with regard to the memory 118 are optionally stored by the control unit 104, the computing device 126, and/or another device.
  • an imaging device 214 ( Figure 2C) includes memory that stores one or more parameters described with regard to the memory 118.
  • the memory 118 stores a controlling system configuration that includes one or more system settings (e.g., as configured by a manufacturer, administrator, and/or user) .
  • identifying information for the movable object 102 is stored as a system setting of the system configuration.
  • the controlling system configuration includes a configuration for the imaging device 214.
  • the configuration for the imaging device 214 stores parameters such as position (e.g., relative to the image sensor 216) , a zoom level and/or focus parameters (e.g., amount of focus, selecting autofocus or manual focus, and/or adjusting an autofocus target in an image) .
  • Imaging property parameters stored by the imaging device configuration include, e.g., image resolution, image size (e.g., image width and/or height) , aspect ratio, pixel count, quality, focus distance, depth of field, exposure time, shutter speed, and/or white balance.
  • parameters stored by the imaging device configuration are updated in response to control instructions (e.g., generated by processor (s) 116 and/or received by the movable object 102 from the control unit 104 and/or the computing device 126) .
  • parameters stored by the imaging device configuration are updated in response to information received from the movable object sensing system 122 and/or the imaging device 214.
  • the carrier 108 is coupled to the movable object 102 and a payload 110 is coupled to the carrier 108.
  • the carrier 108 includes one or more mechanisms that enable the payload 110 to move relative to the movable object 102, as described further with respect to Figure 2B.
  • the payload 110 is rigidly coupled to the movable object 102 such that the payload 110 remains substantially stationary relative to the movable object 102.
  • the carrier 108 is coupled to the payload 110 such that the payload 110 is not movable relative to the movable object 102.
  • the payload 110 is coupled to the movable object 102 without requiring the use of the carrier 108.
  • the movable object 102 also includes the communication system 120, which enables communication between the movable object 102 and the control unit 104 and/or the computing device 126 (e.g., via wireless signals 124), and/or the electronic device 810 (Figure 8).
  • the communication system 120 includes transmitters, receivers, and/or transceivers for wireless communication.
  • the communication is a one-way communication, such that data is transmitted only from the movable object 102 to the control unit 104, or vice-versa.
  • communication is a two-way communication, such that data is transmitted from the movable object 102 to the control unit 104, as well as from the control unit 104 to the movable object 102.
  • the movable object 102 communicates with the computing device 126.
  • the movable object 102, the control unit 104, and/or the computing device 126 are connected to the Internet or other telecommunications network, e.g., such that data generated by the movable object 102, the control unit 104, and/or the computing device 126 is transmitted to a server for data storage and/or data retrieval (e.g., for display by a website) .
  • data generated by the movable object 102, the control unit 104, and/or the computing device 126 is stored locally on each of the respective devices.
  • the movable object 102 comprises a sensing system (e.g., the movable object sensing system 122) that includes one or more sensors, as described further with reference to Figure 3.
  • the movable object 102 and/or the control unit 104 use sensing data generated by sensors of the sensing system 122 to determine information such as a position of the movable object 102, an orientation of the movable object 102, movement characteristics of the movable object 102 (e.g., an angular velocity, an angular acceleration, a translational velocity, a translational acceleration, and/or a direction of motion along one or more axes), a distance between the movable object 102 and a target object, proximity (e.g., distance) of the movable object 102 to potential obstacles, weather conditions, locations of geographical features, and/or locations of man-made structures.
  • the movable object 102 comprises radio (s) 154.
  • the radio (s) 154 enable one or more communication networks, and allow the movable object 102 to communicate with other devices (e.g., electronic device 810, Figure 8) .
  • the radio(s) 154 are capable of data communications using any of a variety of custom or standard wireless protocols (e.g., IEEE 802.15.4, Wi-Fi, ZigBee, 6LoWPAN, Thread, Z-Wave, Bluetooth Smart, ISA100.5A, WirelessHART, MiWi, Ultrawide Band (UWB), software defined radio (SDR), etc.), custom or standard wired protocols (e.g., Ethernet, HomePlug, etc.), and/or any other suitable communication protocol, including communication protocols not yet developed as of the filing date of this document.
  • the movable object 102 includes a clock 152.
  • the clock 152 synchronizes (e.g., coordinates) time with a clock 922 of an electronic device 810 ( Figure 9) .
  • Figure 2B illustrates an exemplary carrier 108 according to some embodiments.
  • the carrier 108 couples the payload 110 to the movable object 102.
  • the carrier 108 includes a frame assembly having one or more frame members 202.
  • the frame member (s) 202 are coupled with the movable object 102 and the payload 110.
  • the frame member (s) 202 support the payload 110.
  • the carrier 108 includes one or more mechanisms, such as one or more actuators 204, to cause movement of the carrier 108 and/or the payload 110.
  • the actuator 204 is, e.g., a motor, such as a hydraulic, pneumatic, electric, thermal, magnetic, and/or mechanical motor.
  • the actuator 204 causes movement of the frame member (s) 202.
  • the actuator 204 rotates the payload 110 with respect to one or more axes, such as one or more of: an X axis ( “pitch axis” ) , a Z axis ( “roll axis” ) , and a Y axis ( “yaw axis” ) , relative to the movable object 102. In some embodiments, the actuator 204 translates the payload 110 along one or more axes relative to the movable object 102.
  • the carrier 108 includes a carrier sensing system 206 for determining a state of the carrier 108 or the payload 110.
  • the carrier sensing system 206 includes one or more of: motion sensors (e.g., accelerometers) , rotation sensors (e.g., gyroscopes) , potentiometers, and/or inertial sensors.
  • the carrier sensing system 206 includes one or more sensors of the movable object sensing system 122 as described below with respect to Figure 3.
  • Sensor data determined by the carrier sensing system 206 may include spatial disposition (e.g., position, orientation, or attitude) , movement information such as velocity (e.g., linear or angular velocity) and/or acceleration (e.g., linear or angular acceleration) of the carrier 108 and/or the payload 110.
  • the sensing data, as well as state information calculated from the sensing data, are used as feedback data to control the movement of one or more components (e.g., the frame member(s) 202, the actuator 204, and/or the damping element 208) of the carrier 108.
  • the carrier sensing system 206 is coupled to the frame member (s) 202, the actuator 204, the damping element 208, and/or the payload 110.
  • a sensor in the carrier sensing system 206 may measure movement of the actuator 204 (e.g., the relative positions of a motor rotor and a motor stator) and generate a position signal representative of the movement of the actuator 204 (e.g., a position signal representative of relative positions of the motor rotor and the motor stator) .
  • data generated by the sensors is received by processor (s) 116 and/or memory 118 of the movable object 102.
  • the coupling between the carrier 108 and the movable object 102 includes one or more damping elements 208.
  • the damping element (s) 208 are configured to reduce or eliminate movement of the load (e.g., the payload 110 and/or the carrier 108) caused by movement of the movable object 102.
  • the damping element (s) 208 may include active damping elements, passive damping elements, and/or hybrid damping elements having both active and passive damping characteristics.
  • the motion damped by the damping element (s) 208 may include vibrations, oscillations, shaking, and/or impacts. Such motions may originate from motions of the movable object 102, which are transmitted to the payload 110.
  • the motion may include vibrations caused by the operation of a propulsion system and/or other components of the movable object 102.
  • the damping element (s) 208 provide motion damping by isolating the payload 110 from the source of unwanted motion, by dissipating or reducing the amount of motion transmitted to the payload 110 (e.g., vibration isolation) .
  • the damping element(s) 208 reduce a magnitude (e.g., an amplitude) of the motion that would otherwise be experienced by the payload 110.
  • the motion damping applied by the damping element (s) 208 is used to stabilize the payload 110, thereby improving the quality of video and/or images captured by the payload 110 (e.g., using the imaging device 214, Figure 2C) .
  • the improved video and/or image quality reduces the computational complexity of processing steps required to generate an edited video based on the captured video, or to generate a panoramic image based on the captured images.
  • the damping element (s) 208 may be manufactured using any suitable material or combination of materials, including solid, liquid, or gaseous materials.
  • the materials used for the damping element (s) 208 may be compressible and/or deformable.
  • the damping element (s) 208 may be made of sponge, foam, rubber, gel, and the like.
  • the damping element (s) 208 may include rubber balls that are substantially spherical in shape.
  • the damping element (s) 208 may be substantially spherical, rectangular, and/or cylindrical in shape.
  • the damping element (s) 208 may include piezoelectric materials or shape memory materials.
  • the damping element (s) 208 may include one or more mechanical elements, such as springs, pistons, hydraulics, pneumatics, dashpots, shock absorbers, and/or isolators.
  • properties of the damping element (s) 208 are selected so as to provide a predetermined amount of motion damping.
  • the damping element (s) 208 have viscoelastic properties.
  • the properties of damping element (s) 208 may be isotropic or anisotropic.
  • the damping element (s) 208 provide motion damping equally along all directions of motion.
  • the damping element (s) 208 provide motion damping only along a subset of the directions of motion (e.g., along a single direction of motion) .
  • the damping element (s) 208 may provide damping primarily along the Y (yaw) axis. In this manner, the illustrated damping element (s) 208 reduce vertical motions.
  • the carrier 108 further includes a controller 210.
  • the controller 210 may include one or more controllers and/or processors.
  • the controller 210 receives instructions from the processor (s) 116 of the movable object 102.
  • the controller 210 may be connected to the processor (s) 116 via the control bus 112.
  • the controller 210 may control movement of the actuator 204, adjust one or more parameters of the carrier sensing system 206, receive data from carrier sensing system 206, and/or transmit data to the processor (s) 116.
  • Figure 2C illustrates an exemplary payload 110 according to some embodiments.
  • the payload 110 includes a payload sensing system 212 and a controller 218.
  • the payload sensing system 212 may include an imaging device 214 (e.g., a camera) having an image sensor 216 with a field of view.
  • the payload sensing system 212 includes one or more sensors of the movable object sensing system 122, as described below with respect to Figure 3.
  • the payload sensing system 212 generates static sensing data (e.g., a single image captured in response to a received instruction) and/or dynamic sensing data (e.g., a series of images captured at a periodic rate, such as a video) .
  • the image sensor 216 is, e.g., a sensor that detects light, such as visible light, infrared light, and/or ultraviolet light.
  • the image sensor 216 includes, e.g., semiconductor charge-coupled device (CCD) , active pixel sensors using complementary metal–oxide–semiconductor (CMOS) or N-type metal-oxide-semiconductor (NMOS, Live MOS) technologies, or any other types of sensors.
  • Adjustable parameters of imaging device 214 include, e.g., width, height, aspect ratio, pixel count, resolution, quality, imaging mode, focus distance, depth of field, exposure time, shutter speed and/or lens configuration.
  • the imaging device 214 may be configured to capture videos and/or images at different resolutions (e.g., low, medium, high, or ultra-high resolutions, and/or high-definition or ultra-high-definition videos such as 720p, 1080i, 1080p, 1440p, 2000p, 2160p, 2540p, 4000p, and 4320p).
  • the payload 110 includes the controller 218.
  • the controller 218 may include one or more controllers and/or processors.
  • the controller 218 receives instructions from the processor (s) 116 of the movable object 102.
  • the controller 218 is connected to the processor (s) 116 via the control bus 112.
  • the controller 218 may adjust one or more parameters of one or more sensors of the payload sensing system 212, receive data from one or more sensors of payload sensing system 212, and/or transmit data, such as image data from the image sensor 216, to the processor (s) 116, the memory 118, and/or the control unit 104.
  • data generated by one or more sensors of the payload sensor system 212 is stored, e.g., by the memory 118.
  • data generated by the payload sensor system 212 are transmitted to the control unit 104 (e.g., via communication system 120) .
  • video is streamed from the payload 110 (e.g., the imaging device 214) to the control unit 104.
  • the control unit 104 displays, e.g., real-time (or slightly delayed) video received from the imaging device 214.
  • an adjustment of the orientation, position, altitude, and/or one or more movement characteristics of the movable object 102, the carrier 108, and/or the payload 110 is generated (e.g., by the processor (s) 116) based at least in part on configurations (e.g., preset and/or user configured in system configuration 400, Figure 4) of the movable object 102, the carrier 108, and/or the payload 110.
  • an adjustment that involves a rotation with respect to two axes is achieved solely by corresponding rotation of the movable object 102 around the two axes if the payload 110 including the imaging device 214 is rigidly coupled to the movable object 102 (and hence not movable relative to the movable object 102) and/or the payload 110 is coupled to the movable object 102 via a carrier 108 that does not permit relative movement between the imaging device 214 and the movable object 102.
  • the same two-axis adjustment may be achieved by, e.g., combining adjustments of both the movable object 102 and the carrier 108 if the carrier 108 permits the imaging device 214 to rotate around at least one axis relative to the movable object 102.
  • the carrier 108 can be controlled to implement the rotation around one or two of the two axes required for the adjustment, and the movable object 102 can be controlled to implement the rotation around one or two of the two axes.
  • the carrier 108 may include a one-axis gimbal that allows the imaging device 214 to rotate around one of the two axes required for adjustment while the rotation around the remaining axis is achieved by the movable object 102.
  • the same two-axis adjustment is achieved by the carrier 108 alone when the carrier 108 permits the imaging device 214 to rotate around two or more axes relative to the movable object 102.
  • the carrier 108 may include a two-axis or three-axis gimbal that enables the imaging device 214 to rotate around two or all three axes.
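  • The split between the carrier 108 and the movable object 102 can be illustrated with a short sketch; the allocation rule below (give the carrier every axis its gimbal supports and leave the remainder to the movable object) is an assumption for illustration only:

```python
from typing import Dict, Set, Tuple


def split_adjustment(required: Dict[str, float],
                     gimbal_axes: Set[str]) -> Tuple[Dict[str, float], Dict[str, float]]:
    """Split a required multi-axis rotation between the carrier and the movable object."""
    carrier_cmd = {axis: angle for axis, angle in required.items() if axis in gimbal_axes}
    movable_cmd = {axis: angle for axis, angle in required.items() if axis not in gimbal_axes}
    return carrier_cmd, movable_cmd


# Example: with a one-axis (pitch) gimbal, the carrier handles the pitch rotation
# and the movable object implements the remaining yaw rotation.
carrier_cmd, movable_cmd = split_adjustment({"pitch": 15.0, "yaw": -30.0}, {"pitch"})
```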
  • Figure 3 illustrates an exemplary sensing system 122 of a movable object 102 according to some embodiments.
  • one or more sensors of the movable object sensing system 122 are mounted to an exterior, or located within, or otherwise coupled to the movable object 102.
  • one or more sensors of the movable object sensing system 122 are components of the carrier sensing system 206 and/or the payload sensing system 212. Where sensing operations are described herein as being performed by the movable object sensing system 122, it will be recognized that such operations are optionally performed by the carrier sensing system 206 and/or the payload sensing system 212.
  • the movable object sensing system 122 generates static sensing data (e.g., a single image captured in response to a received instruction) and/or dynamic sensing data (e.g., a series of images captured at a periodic rate, such as a video) .
  • the movable object sensing system 122 includes one or more image sensors 302, such as image sensor 308 (e.g., a left stereographic image sensor) and/or image sensor 310 (e.g., a right stereographic image sensor) .
  • the image sensors 302 capture, e.g., images, image streams (e.g., videos) , stereographic images, and/or stereographic image streams (e.g., stereographic videos) .
  • the image sensors 302 detect light, such as visible light, infrared light, and/or ultraviolet light.
  • the movable object sensing system 122 includes one or more optical devices (e.g., lenses) to focus or otherwise alter the light onto the one or more image sensors 302.
  • the image sensors 302 include, e.g., semiconductor charge-coupled devices (CCD) , active pixel sensors using complementary metal–oxide–semiconductor (CMOS) or N-type metal-oxide-semiconductor (NMOS, Live MOS) technologies, or any other types of sensors.
  • the movable object sensing system 122 includes one or more audio transducers 304.
  • the audio transducers 304 may include an audio output transducer 312 (e.g., a speaker) , and an audio input transducer 314 (e.g. a microphone, such as a parabolic microphone) .
  • the audio output transducer 312 and the audio input transducer 314 are used as components of a sonar system for tracking a target object (e.g., detecting location information of a target object) .
  • the movable object sensing system 122 includes one or more infrared sensors 306.
  • a distance measurement system includes a pair of infrared sensors, e.g., infrared sensor 316 (such as a left infrared sensor) and infrared sensor 318 (such as a right infrared sensor), or another sensor or sensor pair. The distance measurement system is used for measuring a distance between the movable object 102 and the target object 106.
  • the movable object sensing system 122 may include other sensors for sensing a distance between the movable object 102 and the target object 106, such as a Radio Detection and Ranging (RADAR) sensor, a Light Detection and Ranging (LiDAR) sensor, or any other distance sensor.
  • a system to produce a depth map includes one or more sensors or sensor pairs of the movable object sensing system 122 (such as a left stereographic image sensor 308 and a right stereographic image sensor 310; an audio output transducer 312 and an audio input transducer 314; and/or a left infrared sensor 316 and a right infrared sensor 318).
  • a pair of sensors in a stereo data system (e.g., a stereographic imaging system) simultaneously captures data.
  • a depth map is generated by a stereo data system using the simultaneously captured data.
  • a depth map is used for positioning and/or detection operations, such as detecting a target object 106, and/or detecting current location information of a target object 106.
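  • As an illustrative sketch of how a stereo pair can yield a depth map (using the standard triangulation relation depth = focal length x baseline / disparity; the focal length, baseline, and disparity values below are made up for the example):

```python
import numpy as np


def depth_from_disparity(disparity_px: np.ndarray,
                         focal_length_px: float,
                         baseline_m: float) -> np.ndarray:
    """Convert a per-pixel disparity map (pixels) into a depth map (meters)."""
    disparity = np.where(disparity_px > 0, disparity_px, np.nan)  # mask invalid pixels
    return focal_length_px * baseline_m / disparity


# Example: a 700 px focal length and 0.10 m baseline map a 35 px disparity to 2.0 m.
depth_map = depth_from_disparity(np.array([[35.0, 70.0]]), 700.0, 0.10)
```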
  • the movable object sensing system 122 includes one or more global positioning system (GPS) sensors, motion sensors (e.g., accelerometers) , rotation sensors (e.g., gyroscopes) , inertial sensors, proximity sensors (e.g., infrared sensors) and/or weather sensors (e.g., pressure sensor, temperature sensor, moisture sensor, and/or wind sensor) .
  • sensing data generated by one or more sensors of the movable object sensing system 122 and/or information determined using sensing data from one or more sensors of the movable object sensing system 122 are transmitted to the control unit 104 (e.g., via the communication system 120) .
  • data generated by one or more sensors of the movable object sensing system 122 and/or information determined using sensing data from one or more sensors of the movable object sensing system 122 is stored by the memory 118.
  • Figures 4A and 4B illustrate a block diagram of an exemplary memory 118 of a movable object 102 according to some embodiments.
  • one or more elements illustrated in Figures 4A and/or 4B may be located in the control unit 104, the computing device 126, and/or another device.
  • the memory 118 stores a system configuration 400.
  • the system configuration 400 includes one or more system settings (e.g., as configured by a manufacturer, administrator, and/or user of the movable object 102) .
  • a constraint on one or more of orientation, position, attitude, and/or one or more movement characteristics of the movable object 102, the carrier 108, and/or the payload 110 is stored as a system setting of the system configuration 400.
  • the memory 118 stores a radio communication module 401.
  • the radio communication module 401 connects to and communicates with other network devices (e.g., a local network, such as a router that provides Internet connectivity, networked storage devices, network routing devices, electronic device 810 etc. ) that are coupled to one or more communication networks (e.g., communication network (s) 810, Figure 8) via the communication system 120 (wired or wireless) .
  • the memory 118 stores a motion control module 402.
  • the motion control module 402 stores control instructions that are received from the control unit 104 and/or the computing device 126. The control instructions are used for controlling operation of the movement mechanisms 114, the carrier 108, and/or the payload 110.
  • memory 118 stores a tracking module 404.
  • the tracking module 404 generates tracking information for a target object 106 that is being tracked by the movable object 102.
  • the tracking information is generated based on images captured by the imaging device 214, based on output from a video analysis module 406 (e.g., after pre-processing and/or processing operations have been performed on one or more images), and/or based on input of a user.
  • the tracking information may be generated based on analysis of gestures of a human target, which are captured by the imaging device 214 and/or analyzed by a gesture analysis module 403.
  • the tracking information generated by the tracking module 404 may include a location, a size, and/or other characteristics of the target object 106 within one or more images.
  • the tracking information generated by the tracking module 404 is transmitted to the control unit 104 and/or the computing device 126 (e.g., augmenting or otherwise combined with images and/or output from the video analysis module 406) .
  • the tracking information may be transmitted to the control unit 104 in response to a request from the control unit 104 and/or on a periodic basis (e.g., every 2 seconds, 5 seconds, 10 seconds, or 30 seconds) .
  • the memory 118 includes a video analysis module 406.
  • the video analysis module 406 performs processing operations on videos and images, such as videos and images captured by the imaging device 214.
  • the video analysis module 406 performs pre-processing on raw video and/or image data, such as re-sampling to assure the correctness of the image coordinate system, noise reduction, contrast enhancement, and/or scale space representation.
  • the processing operations performed on video and image data include feature extraction, image segmentation, data verification, image recognition, image registration, and/or image matching.
  • the output from the video analysis module 406 (e.g., after the pre-processing and/or processing operations have been performed) is transmitted to the control unit 104 and/or the computing device 126.
  • feature extraction is performed by the control unit 104, the processor (s) 116 of the movable object 102, and/or the computing device 126.
  • the video analysis module 406 may use neural networks to perform image recognition and/or classify object (s) that are included in the videos and/or images.
  • the video analysis module 406 may extract frames that include the target object 106, analyze features of the target object 106, and compare the features with characteristics of one or more predetermined recognizable target object types, thereby enabling the target object 106 to be recognized at a certain confidence level.
  • the memory 118 includes a gesture analysis module 403.
  • the gesture analysis module 403 processes gestures of one or more human targets.
  • the gestures may be captured by the imaging device 214.
  • the gesture analysis results may be fed into the tracking module 404 and/or the motion control module 402 to generate, respectively, tracking information and/or control instructions that are used for controlling operations of the movement mechanisms 114, the carrier 108, and/or the payload 110 of the movable object 102.
  • a calibration process may be performed before using gestures of a human target to control the movable object 102.
  • the gesture analysis module 403 may capture certain features of human gestures associated with a certain control command and store the gesture features in the memory 118. When a human gesture is received, the gesture analysis module 403 may extract features of the human gesture and compare these features with the stored features to determine whether the certain command is being requested by the user.
  • the correlations between gestures and control commands associated with a certain human target may or may not be different from such correlations associated with another human target.
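  • One simple way to realize the comparison of extracted gesture features against stored, per-user calibration features is sketched below; the feature vectors, the cosine-similarity measure, and the 0.9 threshold are illustrative assumptions, not the claimed method:

```python
import numpy as np


def match_gesture(observed: np.ndarray, stored: dict, threshold: float = 0.9):
    """Return the command whose calibrated features best match the observed gesture,
    or None if no stored gesture is similar enough."""
    best_command, best_score = None, threshold
    for command, features in stored.items():
        score = float(np.dot(observed, features) /
                      (np.linalg.norm(observed) * np.linalg.norm(features)))
        if score > best_score:
            best_command, best_score = command, score
    return best_command


# Per-user calibration: another human target may store different feature vectors
# for the same command.
user_gestures = {"start_tracking": np.array([0.9, 0.1, 0.4])}
command = match_gesture(np.array([0.88, 0.12, 0.42]), user_gestures)
```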
  • the memory 118 includes a spatial relationship determination module 405.
  • the spatial relationship determination module 405 calculates one or more spatial relationships between the target object 106 and the movable object 102, such as a horizontal distance between the target object 106 and the movable object 102, and/or a pitch angle between the target object 106 and the movable object 102.
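  • As a minimal sketch of the two spatial relationships named above, assuming both positions are expressed in a common local coordinate frame (x and y horizontal, z up):

```python
import math
from typing import Tuple


def spatial_relationship(movable_xyz: Tuple[float, float, float],
                         target_xyz: Tuple[float, float, float]) -> Tuple[float, float]:
    """Return (horizontal distance in meters, pitch angle in degrees) from the
    movable object to the target object."""
    dx = target_xyz[0] - movable_xyz[0]
    dy = target_xyz[1] - movable_xyz[1]
    dz = target_xyz[2] - movable_xyz[2]
    horizontal_distance = math.hypot(dx, dy)
    pitch_angle = math.degrees(math.atan2(dz, horizontal_distance))
    return horizontal_distance, pitch_angle


# Example: a target 30 m away horizontally and 10 m below the movable object
# gives a pitch angle of about -18.4 degrees.
distance, pitch = spatial_relationship((0.0, 0.0, 50.0), (30.0, 0.0, 40.0))
```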
  • the memory 118 includes a signal processing module 407.
  • the signal processing module 407 processes signals (e.g., wireless signals) that are received by the movable object 102 (e.g., from an electronic device 810 of Figure 8, from the control unit 104, etc. ) .
  • the movable object 102 uses the signals to determine position (e.g., positional coordinates) of the target object 106.
  • the signals may include direction (s) of illumination, pattern (s) of illumination, wavelength (s) (e.g., color) of illumination, and/or temporal frequencies of illumination, and/or times of illumination, and/or intensities of illumination.
  • the signals may include a position of the electronic device 810.
  • the signals may include one or more parameters from the electronic device 810 to control a flight of the movable object 102.
  • the one or more parameters include a pitch, roll, and/or yaw value (s) for the movable object 102 from sensors 920, and/or adjusted pitch, roll, and/or yaw value (s) that are based on a combination of the sensors 920.
  • the one or more parameters include raw data from the sensors 920 of the electronic device 810.
  • the signal processing module 407 processes the raw sensor data to determine a flight of the movable object 102.
  • the memory 118 stores target information 408.
  • the target information 408 is received by the movable object 102 (e.g., via communication system 120) from the control unit 104, the computing device 126, the target object 106, and/or another movable object.
  • the target information 408 includes a time value (e.g., a time duration) and/or an expiration time indicating a period of time during which the target object 106 is to be tracked.
  • the target information 408 includes a flag (e.g., a label) indicating whether a target information entry includes specific tracked target information 412 and/or target type information 410.
  • the target information 408 includes target type information 410 such as color, texture, pattern, size, shape, and/or dimension.
  • the target type information 410 includes, but is not limited to, a predetermined recognizable object type and a general object type as identified by the video analysis module 406.
  • the target type information 410 includes features or characteristics for each type of target and is preset and stored in the memory 118.
  • the target type information 410 is provided to a user input device (e.g., the control unit 104) via user input.
  • the user may select a pre-existing target pattern or type (e.g., an object or a round object with a radius greater or less than a certain value) .
  • the target information 408 includes tracked target information 412 for a specific target object 106 being tracked.
  • the target information 408 may be identified by the video analysis module 406 by analyzing the target in a captured image.
  • the tracked target information 412 includes, e.g., an image of the target object 106, an initial position (e.g., location coordinates, such as pixel coordinates within an image) of the target object 106, and/or a size of the target object 106 within one or more images (e.g., images captured by the imaging device 214) .
  • a size of the target object 106 is stored, e.g., as a length (e.g., mm or other length unit) , an area (e.g., mm² or other area unit) , a number of pixels in a line (e.g., indicating a length, width, and/or diameter) , a ratio of a length of a representation of the target in an image relative to a total image length (e.g., a percentage) , a ratio of an area of a representation of the target in an image relative to a total image area (e.g., a percentage) , a number of pixels indicating an area of the target object 106, and/or a corresponding spatial relationship (e.g., a vertical distance and/or a horizontal distance) between the target object 106 and the movable object 102 (e.g., an area of the target object 106 changes based on a distance of the target object 106 from the movable object 102) .
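Several of the size representations listed above (pixel counts, length ratios, area ratios) can be derived from a bounding box in a captured frame, as in the following illustrative sketch; the bounding-box format (x, y, width, height in pixels) and the key names are assumptions.

```python
def target_size_metrics(bbox, image_width, image_height):
    """bbox = (x, y, w, h) in pixels; returns several of the size
    representations that may be stored as tracked target information."""
    x, y, w, h = bbox
    return {
        "pixel_length": w,                                    # length in pixels
        "pixel_area": w * h,                                  # area in pixels
        "length_ratio": w / image_width,                      # fraction of image width
        "area_ratio": (w * h) / (image_width * image_height), # fraction of image area
    }

# Example: a 120 x 200-pixel person in a 1920 x 1080 frame
print(target_size_metrics((900, 400, 120, 200), 1920, 1080))
```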
  • one or more features (e.g., characteristics) of the target object 106 are determined from an image of the target object 106 (e.g., using image analysis techniques on images captured by the imaging device 112) .
  • one or more features of the target object 106 are determined from an orientation and/or part or all of identified boundaries of the target object 106.
  • the tracked target information 412 includes pixel coordinates and/or a number of pixel counts to indicate, e.g., a size parameter, position, and/or shape of the target object 106.
  • one or more features of the tracked target information 412 are to be maintained as the movable object 102 tracks the target object 106 (e.g., the tracked target information 412 are to be maintained as images of the target object 106 are captured by the imaging device 214) .
  • the tracked target information 412 is used to adjust the movable object 102, the carrier 108, and/or the imaging device 214, such that specific features of the target object 106 are substantially maintained.
  • the tracked target information 412 is determined based on one or more of the target types 410.
  • the memory 118 also includes predetermined recognizable target type information 414.
  • the predetermined recognizable target type information 414 specifies one or more characteristics of certain predetermined recognizable target types (e.g., target type 1, target type 2, ..., target type n) .
  • Each predetermined recognizable target type may include one or more characteristics such as a size parameter (e.g., area, diameter, height, length and/or width) , position (e.g., relative to an image center and/or image boundary) , movement (e.g., speed, acceleration, altitude) and/or shape.
  • target type 1 may be a human target.
  • One or more characteristics associated with a human target may include a height in a range from about 1.4 meters to about 2 meters, a pattern comprising a head, shoulders, a torso, joints and/or limbs, and/or a moving speed having a range from about 2 kilometers/hour to about 25 kilometers/hour.
  • target type 2 may be a car target.
  • One or more characteristics associated with a car target may include a height in a range from about 1.4 meters to about 4.5 meters, a length having a range from about 3 meters to about 10 meters, a moving speed having a range from about 5 kilometers/hour to about 140 kilometers/hour, and/or a pattern of a sedan, an SUV, a truck, or a bus.
  • target type 3 may be a ship target.
  • Other types of predetermined recognizable target object may also include: an airplane target, an animal target, other moving targets, and stationary (e.g., non-moving) targets such as a building and a statue.
  • Each predetermined target type may further include one or more subtypes, each of the subtypes having more specific characteristics thereby providing more accurate target classification results.
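A minimal sketch of classification against such predetermined target types is shown below, using the example characteristic ranges given above for human and car targets. The "all ranges must match" rule, the attribute names, and the numeric thresholds are assumptions for illustration, not the disclosed classification method.

```python
# Characteristic ranges taken from the example target types above.
TARGET_TYPES = {
    "human": {"height_m": (1.4, 2.0), "speed_kmh": (2, 25)},
    "car":   {"height_m": (1.4, 4.5), "length_m": (3, 10), "speed_kmh": (5, 140)},
}

def classify(observed):
    """Return the first predetermined recognizable target type whose
    characteristic ranges all contain the observed measurements."""
    for type_name, ranges in TARGET_TYPES.items():
        matches = all(
            lo <= observed[key] <= hi
            for key, (lo, hi) in ranges.items()
            if key in observed
        )
        if matches and ranges.keys() & observed.keys():
            return type_name
    return "unknown"

print(classify({"height_m": 1.75, "speed_kmh": 6}))                      # "human"
print(classify({"height_m": 1.6, "length_m": 4.3, "speed_kmh": 60}))     # "car"
```

Subtypes, as mentioned above, could be handled the same way with narrower ranges nested under each type.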
  • the target information 408 (including, e.g., the target type information 410 and the tracked target information 412) , and/or predetermined recognizable target information 414 is generated based on user input, such as a user input received at user input device 506 ( Figure 5) of the control unit 104. Additionally or alternatively, the target information 408 may be generated based on data from sources other than the control unit 104.
  • the target type information 410 may be based on previously stored images of the target object 106 (e.g., images captured by the imaging device 214 and stored by the memory 118) , other data stored by the memory 118, and/or data from data stores that are remote from the control unit 104 and/or the movable object 102.
  • the target type information 410 is generated using a computer-generated image of the target object 106.
  • the target information 408 is used by the movable object 102 (e.g., the tracking module 404) to track the target object 106.
  • the target information 408 is used by a video analysis module 406 to identify and/or classify the target object 106.
  • target identification involves image recognition and/or matching algorithms based on, e.g., CAD-like object models, appearance-based methods, feature-based methods, and/or genetic algorithms.
  • target identification includes comparing two or more images to determine, extract, and/or match features contained therein.
  • the memory 118 also includes flight routes 416 (e.g., predefined flight routes) of the movable object 102, such as a portrait flight route 418 (e.g., when the target object 106 is a person) , a long range flight route 420, and a normal flight route 422.
  • the flight routes 416 include one or more flight paths, each of the one or more paths having a corresponding trajectory mode.
  • the movable object 102 automatically selects one of the predefined flight routes according to a target type of the target object 106 and executes an autonomous flight according to the predefined flight route.
  • after automatically selecting a flight route 416 for the movable object 102, the movable object 102 further performs an automatic customization of the flight route, taking into consideration factors such as a distance between the movable object 102 and the target object 106, the presence of potential obstacle (s) and/or other structures (e.g., buildings and trees) , or weather conditions.
  • customization of the flight route includes modifying a rate of ascent of the movable object 102, an initial velocity of the movable object 102, and/or an acceleration of the movable object 102.
  • the customization is provided in part by a user.
  • the movable object 102 may cause the computing device 126 to display a library of trajectories that can be selected by the user.
  • the movable object 102 then automatically generates the paths of the flight route based on the user selections.
  • the flight routes 416 also include user defined flight route (s) 424, which are routes that are defined and customized by the user.
  • the user may define a flight route using the control unit 104 (e.g., by identifying two or more points of interests on a map that is displayed on the control unit 104) .
  • the control unit 104 may transmit to the movable object 102 a user defined flight route 424 that includes the identified points of interest and/or positional coordinates of the identified points of interest.
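One way to represent a user-defined flight route assembled from points of interest picked on the map, as described above, is sketched below. The container class, its field names, and the default altitude are assumptions for illustration; they are not the disclosed message format.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class UserDefinedFlightRoute:
    """Illustrative container for a user-defined flight route (424):
    an ordered list of points of interest selected on the map."""
    name: str
    waypoints: List[Tuple[float, float, float]] = field(default_factory=list)  # (lat, lon, alt_m)

    def add_point_of_interest(self, lat, lon, alt_m=30.0):
        self.waypoints.append((lat, lon, alt_m))

# Example: two points of interest selected on the control unit's map
route = UserDefinedFlightRoute("lakeside_loop")
route.add_point_of_interest(37.4220, -122.0841)
route.add_point_of_interest(37.4230, -122.0820, alt_m=45.0)
# The control unit would then transmit the route (or its coordinates) to the movable object.
```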
  • the memory 118 stores data 426 that are captured by the image sensor 216 during an autonomous flight, including video data 428 and image (s) 430.
  • the data 426 also includes audio data 432 that are captured by a microphone of the movable object 102 (e.g., the audio input transducer 314) .
  • the data 426 is simultaneously stored on the movable object 102 as it is being captured.
  • the memory 118 further stores metadata information with the data 426.
  • the video data 428 may include tag information (e.g., metadata) that identifies the flight path and trajectory mode corresponding to a respective segment of the video data 428.
  • the data 426 further includes mapping data 434.
  • the mapping data comprises mapping relationships between user movements (e.g., movements detected by the electronic device 810) and corresponding pitch, roll, and/or yaw values for the movable object 102.
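A mapping relationship of the kind described above, between a detected user movement and a commanded pitch, roll, or yaw value, could take many forms. The linear gain with a dead zone and a clamp below is one plausible sketch; the specific numbers and names are assumptions, not values from the disclosure.

```python
# Illustrative mapping: detected user tilt (degrees) -> commanded angle for the movable object.
MAPPING = {"gain": 0.8, "dead_zone_deg": 2.0, "max_cmd_deg": 25.0}

def map_user_angle_to_command(user_angle_deg, mapping=MAPPING):
    """Translate a measured user movement into a commanded pitch/roll/yaw value."""
    if abs(user_angle_deg) < mapping["dead_zone_deg"]:
        return 0.0                                    # ignore tiny unintended motion
    cmd = mapping["gain"] * user_angle_deg
    return max(-mapping["max_cmd_deg"], min(mapping["max_cmd_deg"], cmd))

print(map_user_angle_to_command(1.0))    # 0.0  (inside dead zone)
print(map_user_angle_to_command(15.0))   # 12.0
print(map_user_angle_to_command(60.0))   # 25.0 (clamped)
```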
  • the memory 118 may store a subset of the modules and data structures identified above. Furthermore, the memory 118 may store additional modules and data structures not described above.
  • the programs, modules, and data structures stored in the memory 118, or a non-transitory computer readable storage medium of the memory 118 provide instructions for implementing respective operations in the methods described below. In some embodiments, some or all of these modules may be implemented with specialized hardware circuits that subsume part or all of the module functionality.
  • One or more of the above identified elements may be executed by the one or more processors 116 of the movable object 102. In some embodiments, one or more of the above identified elements is executed by one or more processors of a device remote from the movable object 102, such as the control unit 104 and/or the computing device 126.
  • Figure 5 illustrates an exemplary control unit 104 of the target identification and tracking system 100, in accordance with some embodiments.
  • the control unit 104 communicates with the movable object 102 via the communication system 120, e.g., to provide control instructions to the movable object 102.
  • although the control unit 104 is typically a portable (e.g., handheld) device, the control unit 104 need not be portable.
  • control unit 104 is a dedicated control device (e.g., dedicated to operation of movable object 102) , a laptop computer, a desktop computer, a tablet computer, a gaming system, a wearable device (e.g., watches, glasses, gloves, and/or helmet) , a microphone, and/or a combination thereof.
  • the control unit 104 typically includes one or more processor (s) 502, a communication system 510 (e.g., including one or more network or other communications interfaces) , memory 504, one or more input/output (I/O) interfaces (e.g., an input device 506 and/or a display 508) , and one or more communication buses 512 for interconnecting these components.
  • the input device 506 and/or the display 508 comprises a touchscreen display.
  • the touchscreen display optionally uses LCD (liquid crystal display) technology, LPD (light emitting polymer display) technology, or LED (light emitting diode) technology, although other display technologies are used in other embodiments.
  • the touchscreen display and the processor (s) 502 optionally detect contact and any movement or breaking thereof using any of a plurality of touch sensing technologies now known or later developed, including but not limited to capacitive, resistive, infrared, and surface acoustic wave technologies, as well as other proximity sensor arrays or other elements for determining one or more points of contact with the touchscreen display.
  • the input device 506 includes one or more: joysticks, switches, knobs, slide switches, buttons, dials, keypads, keyboards, mice, audio transducers (e.g., microphones for voice control systems) , motion sensors, and/or gesture controls.
  • an I/O interface of the control unit 104 includes sensors (e.g., GPS sensors, and/or accelerometers) , audio output transducers (e.g., speakers) , and/or one or more tactile output generators for generating tactile outputs.
  • the input device 506 receives user input to control aspects of the movable object 102, the carrier 108, the payload 110, or a component thereof. Such aspects include, e.g., attitude (e.g., aviation) , position, orientation, velocity, acceleration, navigation, and/or tracking.
  • the input device 506 is manually set to one or more positions by a user. Each of the positions may correspond to a predetermined input for controlling the movable object 102.
  • the input device 506 is manipulated by a user to input control instructions for controlling the navigation of the movable object 102.
  • the input device 506 is used to input a flight mode for the movable object 102, such as auto pilot or navigation according to a predetermined navigation path.
  • the input device 506 is used to input a target tracking mode for the movable object 102, such as a manual tracking mode or an automatic tracking mode.
  • the user controls the movable object 102, e.g., the position, attitude, and/or orientation of the movable object 102, by changing a position of the control unit 104 (e.g., by tilting or otherwise moving the control unit 104) .
  • a change in a position of the control unit 104 may be detected by one or more inertial sensors, and the output of the one or more inertial sensors may be used to generate command data.
  • the input device 506 is used to adjust an operational parameter of the payload, such as a parameter of the payload sensing system 212 (e.g., to adjust a zoom parameter of the imaging device 214) and/or an attitude of the payload 110 relative to the carrier 108 and/or the movable object 102.
  • the input device 506 is used to indicate information about the target object 106, e.g., to select a target object 106 to track and/or to indicate the target type information 412.
  • the input device 506 is used for interaction with augmented image data.
  • an image displayed by the display 508 includes representations of one or more target objects 106.
  • representations of the one or more target objects 106 are augmented to indicate identified objects for potential tracking and/or a target object 106 that is currently being tracked. Augmentation includes, for example, a graphical tracking indicator (e.g., a box) adjacent to or surrounding a respective target object 106.
  • the input device 506 is used to select a target object 106 to track or to change the target object being tracked.
  • a target object 106 is selected when an area corresponding to a representation of the target object 106 is selected by e.g., a finger, stylus, mouse, joystick, or other component of the input device 506.
  • the specific target information 412 is generated when a user selects a target object 106 to track.
  • the control unit 104 may also be configured to allow a user to enter target information using any suitable method.
  • the input device 506 receives a selection of a target object 106 from one or more images (e.g., video or snapshot) displayed by the display 508.
  • the input device 506 receives input including a selection performed by a gesture around the target object 106 and/or a contact at a location corresponding to the target object 106 in an image.
  • computer vision or other techniques are used to determine a boundary of the target object 106.
  • input received at the input device 506 defines a boundary of the target object 106.
  • multiple targets are simultaneously selected.
  • a selected target is displayed with a selection indicator (e.g., a bounding box) to indicate that the target is selected for tracking.
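A graphical selection indicator of the kind described above (a bounding box around the selected target) could be rendered on the displayed frame with OpenCV, assuming that library is available; the function name and color choice below are assumptions for illustration.

```python
import cv2

def draw_selection_indicator(frame, bbox, selected=True):
    """Draw a bounding box around a target; green when selected for tracking."""
    x, y, w, h = bbox
    color = (0, 255, 0) if selected else (255, 255, 255)   # BGR
    cv2.rectangle(frame, (x, y), (x + w, y + h), color, 2)
    return frame
```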
  • the input device 506 receives input indicating information such as color, texture, shape, dimension, and/or other characteristics associated with a target object 106.
  • the input device 506 includes a keyboard to receive typed input indicating the target information 408.
  • the control unit 104 provides an interface that enables a user to select (e.g., using the input device 506) between a manual tracking mode and an automatic tracking mode.
  • the interface enables the user to select a target object 106 to track.
  • a user is enabled to manually select a representation of a target object 106 from an image displayed by the display 508 of the control unit 104.
  • Specific target information 412 associated with the selected target object 106 is transmitted to the movable object 102, e.g., as initial expected target information.
  • the input device 506 receives target type information 410 from a user input.
  • the movable object 102 uses the target type information 410, e.g., to automatically identify the target object 106 to be tracked and/or to track the identified target object 106.
  • manual tracking requires more user control of the tracking of the target and less automated processing or computation (e.g., image or target recognition) by the processor (s) 116 of the movable object 102, while automatic tracking requires less user control of the tracking process but more computation performed by the processor (s) 116 of the movable object 102 (e.g., by the video analysis module 406) .
  • allocation of control over the tracking process between the user and the onboard processing system is adjusted, e.g., depending on factors such as the surroundings of movable object 102, motion of the movable object 102, altitude of the movable object 102, the system configuration 400 (e.g., user preferences) , and/or available computing resources (e.g., CPU or memory) of the movable object 102, the control unit 104, and/or the computing device 126.
  • relatively more control is allocated to the user when movable object is navigating in a relatively complex environment (e.g., with numerous buildings or obstacles or indoor) than when movable object is navigating in a relatively simple environment (e.g., wide open space or outdoor) .
  • control is allocated to the user when the movable object 102 is at a lower altitude than when the movable object 102 is at a higher altitude.
  • more control is allocated to the movable object 102 if the movable object 102 is equipped with a high-speed processor adapted to perform complex computations relatively quickly.
  • the allocation of control over the tracking process between the user and the movable object 102 is dynamically adjusted based on one or more of the factors described herein.
  • the control unit 104 includes an electronic device (e.g., a portable electronic device) and an input device 506 that is a peripheral device that is communicatively coupled (e.g., via a wireless and/or wired connection) and/or mechanically coupled to the electronic device.
  • the control unit 104 includes a portable electronic device (e.g., a cellphone or a smart phone) and a remote control device (e.g., a standard remote control with a joystick) coupled to the portable electronic device.
  • the display device 508 displays information about the movable object 102, the carrier 108, and/or the payload 110, such as position, attitude, orientation, movement characteristics of the movable object 102, and/or distance between the movable object 102 and another object (e.g., the target object 106 and/or an obstacle) .
  • information displayed by the display device 508 includes images captured by the imaging device 214, tracking data (e.g., a graphical tracking indicator applied to a representation of the target object 106, such as a box or other shape around the target object 106 shown to indicate that target object 106 is currently being tracked) , and/or indications of control data transmitted to the movable object 102.
  • the images including the representation of the target object 106 and the graphical tracking indicator are displayed in substantially real-time as the image data and tracking information are received from the movable object 102 and/or as the image data is acquired.
  • the communication system 510 enables communication with the communication system 120 of the movable object 102, the communication system 610 ( Figure 6) of the computing device 126, and/or a base station (e.g., computing device 126) via a wired or wireless communication connection.
  • the communication system 510 transmits control instructions (e.g., navigation control instructions, target information, and/or tracking instructions) .
  • the communication system 510 receives data (e.g., tracking data from the payload imaging device 214, and/or data from movable object sensing system 122) .
  • the control unit 104 receives tracking data (e.g., via the wireless communications 124) from the movable object 102.
  • Tracking data is used by the control unit 104 to, e.g., display the target object 106 as the target is being tracked.
  • data received by the control unit 104 includes raw data (e.g., raw sensing data as acquired by one or more sensors) and/or processed data (e.g., raw data as processed by, e.g., the tracking module 404) .
  • the memory 504 stores instructions for generating control instructions automatically and/or based on input received via the input device 506.
  • the control instructions may include control instructions for operating the movement mechanisms 114 of the movable object 102 (e.g., to adjust the position, attitude, orientation, and/or movement characteristics of the movable object 102, such as by providing control instructions to the actuators 132) .
  • the control instructions adjust movement of the movable object 102 with up to six degrees of freedom.
  • the control instructions are generated to initialize and/or maintain tracking of the target object 106.
  • control instructions include instructions for adjusting the carrier 108 (e.g., instructions for adjusting the damping element 208, the actuator 204, and/or one or more sensors of the carrier sensing system 206) .
  • control instructions include instructions for adjusting the payload 110 (e.g., instructions for adjusting one or more sensors of the payload sensing system 212) .
  • control instructions include control instructions for adjusting the operations of one or more sensors of the movable object sensing system 122.
  • the memory 504 also stores instructions for performing image recognition, target classification, spatial relationship determination, and/or gesture analysis that are similar to the corresponding functionalities discussed with reference to Figure 4.
  • the memory 504 may also store target information, such as tracked target information and/or predetermined recognizable target type information, as discussed in Figure 4.
  • the input device 506 receives user input to control one aspect of the movable object 102 (e.g., the zoom of the imaging device 214) while a control application generates the control instructions for adjusting another aspect of the movable object 102 (e.g., to control one or more movement characteristics of the movable object 102) .
  • the control application includes, e.g., control module 402, tracking module 404 and/or a control application of control unit 104 and/or computing device 126.
  • input device 506 receives user input to control one or more movement characteristics of movable object 102 while the control application generates the control instructions for adjusting a parameter of imaging device 214. In this manner, a user is enabled to focus on controlling the navigation of movable object without having to provide input for tracking the target (e.g., tracking is performed automatically by the control application) .
  • allocation of tracking control between user input received at the input device 506 and the control application varies depending on factors such as, e.g., surroundings of the movable object 102, motion of the movable object 102, altitude of the movable object 102, system configuration (e.g., user preferences) , and/or available computing resources (e.g., CPU or memory) of the movable object 102, the control unit 104, and/or the computing device 126.
  • relatively more control is allocated to the user when movable object is navigating in a relatively complex environment (e.g., with numerous buildings or obstacles or indoor) than when movable object is navigating in a relatively simple environment (e.g., wide open space or outdoor) .
  • control is allocated to the user when the movable object 102 is at a lower altitude than when the movable object 102 is at a higher altitude.
  • more control is allocated to the movable object 102 if movable object 102 is equipped with a high-speed processor adapted to perform complex computations relatively quickly.
  • the allocation of control over the tracking process between the user and the movable object is dynamically adjusted based on one or more of the factors described herein.
  • FIG. 6 illustrates an exemplary computing device 126 for controlling movable object 102 according to some embodiments.
  • the computing device 126 may be a server computer, a laptop computer, a desktop computer, a tablet, or a phone.
  • the computing device 126 typically includes one or more processor (s) 602 (e.g., processing units) , memory 604, a communication system 610 and one or more communication buses 612 for interconnecting these components.
  • the computing device 126 includes input/output (I/O) interfaces 606, such as a display 614 and/or an input device 616.
  • the computing device 126 is a base station that communicates (e.g., wirelessly) with the movable object 102 and/or the control unit 104.
  • the computing device 126 provides data storage, data retrieval, and/or data processing operations, e.g., to reduce the processing power and/or data storage requirements of movable object 102 and/or control unit 104.
  • the computing device 126 is communicatively connected to a database (e.g., via the communication system 610) and/or the computing device 126 includes a database (e.g., a database connected to the communication bus 612) .
  • the communication system 610 includes one or more network or other communications interfaces.
  • the computing device 126 receives data from the movable object 102 (e.g., from one or more sensors of the movable object sensing system 122) and/or the control unit 104.
  • the computing device 126 transmits data to the movable object 102 and/or the control unit 104.
  • the computing device 126 provides control instructions to the movable object 102.
  • the memory 604 stores instructions for performing image recognition, target classification, spatial relationship determination, and/or gesture analysis that are similar to the corresponding functionalities discussed with respect to Figure 4.
  • the memory 604 may also store target information, such as the tracked target information 412 and/or the predetermined recognizable target type information 414, as discussed in Figure 4.
  • the memory 604 or a non-transitory computer-readable storage medium of the memory 604 stores an application 620, which enables interactions with and control over the movable object 102, and which enables data (e.g., audio, video and/or image data) captured by the movable object 102 to be displayed, downloaded, and/or post-processed.
  • the application 620 may include a user interface 630, which enables interactions between a user of the computing device 126 and the movable object 102.
  • the application 620 may include a video editing module 640, which enables a user of the computing device 126 to edit videos and/or images that have been captured by the movable object 102 during a flight associated with a target object 106, e.g., captured using the image sensor 216.
  • the memory 604 also stores templates 650, which may be used for generating edited videos.
  • the memory 604 also stores data 660 that have been captured by the movable object 102 during flights associated with a target object 106.
  • the data 660 may be organized according to flights 661 (e.g., for each flight route) by the movable object 102.
  • the data for each of the flights 661 may include video data 662, images 663, and/or audio data 664.
  • the memory 604 further stores tag information 666 (e.g., metadata information) with the video data 662, the images 663, and the audio data 664.
  • the video data 662-1 corresponding to flight 1 661-1 may include tag information (e.g., metadata) that identifies the flight path and trajectory mode corresponding to the flight 661-1.
  • the memory 604 also stores a web browser 670 (or other application capable of displaying web pages) , which enables a user to communicate over a network with remote computers or devices.
  • Each of the above identified executable modules, applications, or sets of procedures may be stored in one or more of the memory devices, and corresponds to a set of instructions for performing a function described above.
  • the above identified modules or programs (i.e., sets of instructions) need not be implemented as separate software programs, procedures, or modules, and thus various subsets of these modules may be combined or otherwise re-arranged in various embodiments.
  • the memory 604 stores a subset of the modules and data structures identified above.
  • the memory 604 may store additional modules or data structures not described above.
  • Figure 7A illustrates an exemplary configuration 700 of a movable object 102, a carrier 108, and a payload 110 according to some embodiments.
  • the configuration 700 is used to illustrate exemplary adjustments to an orientation, position, attitude, and/or one or more movement characteristics of the movable object 102, the carrier 108, and/or the payload 110, e.g., as used to perform initialization of target tracking and/or to track a target object 106.
  • the movable object 102 rotates around up to three orthogonal axes, such as X1 (pitch) 710, Y1 (yaw) 708 and Z1 (roll) 712 axes.
  • the rotations around the three axes are referred to herein as a pitch rotation 722, a yaw rotation 720, and a roll rotation 724, respectively.
  • Angular velocities of the movable object 102 around the X1, Y1, and Z1 axes are referred to herein as ωX1, ωY1, and ωZ1, respectively.
  • the movable object 102 engages in translational movements 728, 726, and 730 along the X1, Y1, and Z1 axes, respectively.
  • Linear velocities of the movable object 102 along the X1, Y1, and Z1 axes are referred to herein as VX1, VY1, and VZ1, respectively.
  • the payload 110 is coupled to the movable object 102 via the carrier 108. In some embodiments, the payload 110 moves relative to the movable object 102 (e.g., the payload 110 is caused by the actuator 204 of the carrier 108 to move relative to the movable object 102) .
  • the payload 110 moves around and/or along up to three orthogonal axes, e.g., an X2 (pitch) axis 716, a Y2 (yaw) axis 714, and a Z2 (roll) axis 718.
  • the X2, Y2, and Z2 axes are parallel to the X1, Y1, and Z1 axes respectively.
  • the payload 110 includes the imaging device 214 (e.g., an optical module 702)
  • the roll axis Z2 718 is substantially parallel to an optical path or optical axis for the optical module 702.
  • the optical module 702 is optically coupled to the image sensor 216 (and/or one or more sensors of the movable object sensing system 122) .
  • the carrier 108 causes the payload 110 to rotate around up to three orthogonal axes, X2 (pitch) 716, Y2 (yaw) 714 and Z2 (roll) 718, e.g., based on control instructions provided to the actuator 204 of the carrier 108.
  • the rotations around the three axes are referred to herein as the pitch rotation 734, yaw rotation 732, and roll rotation 736, respectively.
  • the angular velocities of the payload 110 around the X2, Y2, and Z2 axes are referred to herein as ωX2, ωY2, and ωZ2, respectively.
  • the carrier 108 causes the payload 110 to engage in translational movements 740, 738, and 742, along the X2, Y2, and Z2 axes, respectively, relative to the movable object 102.
  • the linear velocities of the payload 110 along the X2, Y2, and Z2 axes are referred to herein as VX2, VY2, and VZ2, respectively.
  • the movement of the payload 110 may be restricted (e.g., the carrier 108 restricts movement of the payload 110, e.g., by constricting movement of the actuator 204 and/or by lacking an actuator capable of causing a particular movement) .
  • the movement of the payload 110 may be restricted to movement around and/or along a subset of the three axes X2, Y2, and Z2 relative to the movable object 102.
  • the payload 110 is rotatable around the X2, Y2, and Z2 axes (e.g., the movements 732, 734, 736) , or any combination thereof, while the payload 110 is not movable along any of the axes (e.g., the carrier 108 does not permit the payload 110 to engage in the movements 738, 740, 742) .
  • the payload 110 is restricted to rotation around one of the X2, Y2, and Z2 axes.
  • the payload 110 is only rotatable about the Y2 axis (e.g., rotation 732) .
  • the payload 110 is restricted to rotation around only two of the X2, Y2, and Z2 axes.
  • the payload 110 is rotatable around all three of the X2, Y2, and Z2 axes.
  • the payload 110 is restricted to movement along the X2, Y2, or Z2 axis (e.g., the movements 738, 740, or 742) , or any combination thereof, and the payload 110 is not rotatable around any of the axes (e.g., the carrier 108 does not permit the payload 110 to engage in the movements 732, 734, or 736) .
  • the payload 110 is restricted to movement along only one of the X2, Y2, and Z2 axes. For example, movement of the payload 110 is restricted to the movement 740 along the X2 axis.
  • the payload 110 is restricted to movement along only two of the X2, Y2, and Z2 axes. In some embodiments, the payload 110 is movable along all three of the X2, Y2, and Z2 axes.
  • the payload 110 is able to perform both rotational and translational movement relative to the movable object 102.
  • the payload 110 is able to move along and/or rotate around one, two, or three of the X2, Y2, and Z2 axes.
  • the payload 110 is coupled to the movable object 102 directly without the carrier 108, or the carrier 108 does not permit the payload 110 to move relative to the movable object 102. In some embodiments, the attitude, position and/or orientation of the payload 110 is fixed relative to the movable object 102 in such cases.
  • adjustment of attitude, orientation, and/or position of the payload 110 is performed by adjustment of the movable object 102, the carrier 108, and/or the payload 110, such as an adjustment of a combination of two or more of the movable object 102, the carrier 108, and/or the payload 110.
  • a rotation of 60 degrees around a given axis for the payload 110 is achieved by a 60-degree rotation by the movable object 102 alone, a 60-degree rotation by the payload 110 relative to the movable object 102 as effectuated by the carrier 108, or a combination of a 40-degree rotation by the movable object 102 and a 20-degree rotation by the payload 110 relative to the movable object 102.
  • a translational movement for the payload 110 is achieved via adjustment of the movable object 102, the carrier 108, and/or the payload 110 such as an adjustment of a combination of two or more of the movable object 102, carrier 108, and/or the payload 110.
  • a desired adjustment is achieved by adjustment of an operational parameter of the payload 110, such as an adjustment of a zoom level or a focal length of the imaging device 214.
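The 60-degree example above amounts to splitting a desired payload rotation about one axis between the movable object's body and the carrier (gimbal). The split policy below (use the body up to a limit, then the carrier for the remainder) and the limit value are assumptions chosen so the sketch reproduces the 40/20 split mentioned above.

```python
def split_rotation(desired_deg, body_limit_deg=40.0):
    """Split a desired payload rotation about one axis between the movable
    object's body and the carrier, e.g. 60 deg -> (40 deg body, 20 deg carrier)."""
    body = max(-body_limit_deg, min(body_limit_deg, desired_deg))
    gimbal = desired_deg - body
    return body, gimbal

print(split_rotation(60.0))   # (40.0, 20.0)
print(split_rotation(25.0))   # (25.0, 0.0)
```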
  • Figure 7B illustrates movement of a movable object 102 with respect to a pitch axis 760, a roll axis 762, and/or a yaw axis 764 according to some embodiments.
  • the movable object 102 comprises a front 754 (e.g., a front end) , a top 752, and a side 756.
  • the front 754 of the movable object 102 is also known as the nose of the movable object 102.
  • the front 754 (e.g., a front end) corresponds to the portion of the movable object 102 that faces the target object 106 during a flight route (e.g., as the movable object 102 travels toward the target object 106) .
  • the movable object 102 resembles the shape of an aircraft, and the side 756 corresponds to the position of one of the wings of the aircraft (e.g., movable object 102) , with the other wing positioned on the other side of the aircraft that is opposite to the side 756.
  • the pitch axis 760 corresponds to the axis that runs from one wing to the other wing.
  • Figure 7B (ii) illustrates movement (e.g., rotation) of the movable object 102 about the pitch axis 760.
  • the direction of the pitch axis 760 is pointing out of the plane of the paper.
  • Figure 7B (ii) illustrates that when the movable object 102 rotates about the pitch axis 760, the front 754 of the movable object 102 (e.g., a nose of the movable object 102) moves (e.g., rotates) up or down about the pitch axis 760.
  • the pitch axis 760 is also known as a transverse axis.
  • Figure 7B (iii) illustrates movement (e.g., rotation) of the movable object 102 about the roll axis 762.
  • the roll axis 762 runs from the back to the front 754 of the movable object 102.
  • the direction of the roll axis 762 is pointing out of the plane of the paper.
  • Figure 7B (iii) illustrates that when the movable object 102 rotates about the roll axis 762, the body of the movable object 102 rotates side to side, about the roll axis 762.
  • the roll axis 762 is also known as a longitudinal axis.
  • Figure 7B (iv) illustrates movement (e.g., rotation) of the movable object 102 about the yaw axis 764.
  • the yaw axis 764 runs from the bottom to the top 752 of the movable object 102.
  • the direction of the yaw axis 764 is pointing out of the plane of the paper.
  • Figure 7B (iv) illustrates that when the movable object 102 rotates about the yaw axis 764, the front 754 (e.g., the nose) of the movable object 102 moves (e.g., rotates) from side to side (e.g., towards the side 756 or away from the side 756) with respect to the yaw axis 764.
  • the yaw axis 764 is also known as a vertical axis.
  • Figure 8 illustrates an exemplary operating environment 800 according to some embodiments.
  • the operating environment comprises a movable object 102, an electronic device 810, and a target object 106.
  • the movable object 102 is communicatively connected to the electronic device 810 via communication network (s) 802.
  • the electronic device 810 is a user-operated device.
  • the electronic device 810 is a controller device (e.g., the control unit 104, and/or any remote controller device that is used operatively with a drone) for controlling a flight (e.g., a flight path, a flight route, etc. ) of the movable object 102, such as a speed, trajectory, elevation, attitude, direction, and/or rotation, etc. of the movable object 102.
  • the electronic device 810 includes a first component 820 and a second component 830.
  • the electronic device 810 (e.g., the first component 820) comprises a hand-held device (e.g., a hand-held component) that is held (e.g., attached to, coupled to) using a hand, a palm, and/or fingers of the user.
  • the first component 820 includes an input interface (e.g., input interface 910, Figure 9) that enables input of user instructions.
  • the input interface can include input buttons (e.g., buttons 912, Figure 9) , a display interface (e.g., touchscreen interface 914, Figure 9) , a joystick, control knobs, and/or an audio input interface etc.
  • the second component 830 comprises a wearable component.
  • the second component 830 comprises a wristband-like structure that is worn on a wrist or an arm (e.g., a forearm) of the user, for detecting arm/wrist movement as the user is controlling the first component.
  • the second component 830 is worn on the same arm that is used to hold the first component 820.
  • the second component 830 is worn on an arm that is different from the arm used to hold the first component 820.
  • the first component 820 and the second component 830 are communicatively connected, for example via wireless signals such as Bluetooth, WiFi, and/or other wireless signals. In some embodiments, the first component 820 and the second component 830 are not physically connected to each other (e.g., they are physically decoupled from each other) . In some embodiments, the first component 820 and the second component 830 are communicatively connected via signals that are transmitted using a hard-wired cable. In some embodiments, the first component 820 and the second component 830 are physically connected to each other (e.g., via a cable) . In some embodiments, the first component 820 and the second component 830 are components of two distinct electronic devices that are communicatively connected to each other (e.g., via Bluetooth, WiFi, other cellular connections, or a wired cable connection etc. ) .
  • the first component 820 includes a first sensor 840.
  • the second component 830 includes a second sensor 850.
  • the first sensor 840 detects (e.g., senses and measures) a first user movement from the fingers, palm, hand (e.g., fingers, thumb and/or palm) etc. of the user.
  • the first user movement may correspond to a user instruction to control a flight of the movable object 102 (e.g., a speed, trajectory, elevation, direction, rotation, and/or attitude etc. of the movable object 102) .
  • the first user movement may correspond to a user instruction to control a flight path or a flight route of the movable object 102.
  • the first user movement can comprise user activation of one or more input controls (e.g., buttons, knobs, joystick etc. ) on the input interface of the electronic device 810 using one or more fingers of the user, which are detected by the first sensor 840.
  • the first user movement comprises user movement (e.g., user rotation) of the first component 820 (e.g., waving the first component 820 or gesturing using the first component 820) in a certain direction and/or with a certain speed, which are detected by the first sensor 840.
  • the second sensor 850 detects a second user movement.
  • the second user movement comprises movement from a wrist, forearm, elbow, arm etc. of the user.
  • the second user movement comprises movements from the user due to (e.g., because of) the first user movement.
  • the second user movement may comprise natural movements (e.g., unexpected or inevitable movements) from the wrist and/or forearm of the user when the user activates the input controls on the first component 820 using the user’s fingers.
  • the second user movement may also comprise natural movements (e.g., unexpected or inevitable movements) from the wrist, forearm and/or elbow of the user when the user moves (e.g., waves) the first component to control the flight of the movable object 102.
  • the second user movement causes the user instruction (e.g., from the first user movement) to be over-amplified.
  • the second user movement causes the user instruction (e.g., from the first user movement) to be understated.
  • the electronic device 810 determines one or more parameters associated with the user instruction to control the flight of the movable object 102 based on an interaction between the first user movement and the second user movement. Details of the interaction between the first user movement and the second user movement, and the user instruction are described in more detail in Figures 10, 11, and 13.
  • the one or more parameters include a velocity (e.g., speed) of the movable object 102 (e.g., having units of meters per second, miles per hour, kilometers per hour etc. ) , or a speed setting for the movable object 102 (e.g., a low speed, a medium speed, or a high-speed setting) .
  • each of the speed settings corresponds to an actual speed (or a range of speeds) of the movable object 102, and is predetermined by a manufacturer of the movable object 102.
  • the one or more parameters include a trajectory, an attitude, an elevation, a flight direction, and/or a rotation (e.g., an angular rotation) of the movable object 102.
  • the one or more parameters comprise a pitch, yaw, and/or roll value (e.g., pitch, roll and/or yaw angles) .
  • the one or more parameters comprise sensor values that are measured by the first sensor and/or the second sensor.
  • the one or more parameters include a pitch /roll and/or yaw of a gimbal (e.g., the carrier 108) that is attached to the movable object 102.
  • the electronic device 810 transmits to the movable object 102 a wireless signal 860 that is based on (e.g., that includes) the one or more parameters.
  • the movable object 102 is configured to adjust the flight of the movable object 102 in accordance with the one or more parameters.
  • Figure 9 is a block diagram illustrating a representative electronic device 810 according to some embodiments.
  • the electronic device 810 includes one or more processor (s) 902, one or more communication interface (s) 904 (e.g., network interface (s) ) , memory 906, and one or more communication buses 908 for interconnecting these components (sometimes called a chipset) .
  • the electronic device 810 includes an input interface 910 that facilitates user input and/or audio input.
  • the input interface 910 includes microphones, button (s) 912, and/or a touchscreen interface 914.
  • the electronic device 810 includes output device (s) 916 that facilitate visual output and/or audio output.
  • the output device (s) 916 include speaker (s) and/or a display 918.
  • the electronic device 810 includes one or more sensors 920, such as the first sensor 840 and the second sensor 850 that are shown in Figure 8.
  • the sensors 920 include one or more movement sensors (e.g., accelerometers) , light sensors, time-of-flight (ToF) sensors, positioning sensors (e.g., GPS) , inertial sensors (e.g., an inertial measurement unit (IMU) , a magnetometer etc. ) , and/or audio sensors.
  • the positioning sensors include one or more location sensors (e.g., passive infrared (PIR) sensors) and/or one or more orientation sensors (e.g., gyroscopes) .
  • the sensors 920 include an inertial measurement unit (IMU) .
  • the IMU uses a combination of sensors (e.g., an accelerometer, a gyroscope, and/or a magnetometer) to measure orientation of the electronic device 810 or orientation of a component of the electronic device (e.g., the first component 820 and/or the second component 830) .
  • the IMU uses a combination of an accelerometer, a gyroscope, and a magnetometer.
  • the accelerometer measures the amount of force (e.g., acceleration) it is experiencing in X, Y and Z directions.
  • the IMU determines a roll value and a pitch value based on the measured acceleration.
  • the gyroscope measures an angular velocity along the X, Y and Z axes.
  • the IMU determines an angle by integrating the angular velocity over time, which is used to measure the change in roll, pitch and/or yaw of the electronic device 810.
  • the magnetometer measures magnetism.
  • the magnetometer determines an orientation using the earth’s magnetic field.
  • the X, Y and Z magnetometer readings are used to calculate yaw.
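The IMU behavior summarized above follows standard formulas: roll and pitch from the gravity vector measured by the accelerometer, an angle update from integrating the gyroscope's angular velocity, and a heading from the horizontal magnetometer components. The sketch below uses a common axis convention and assumes the device is level for the magnetometer yaw; the conventions and function names are assumptions.

```python
import math

def roll_pitch_from_accel(ax, ay, az):
    """Roll and pitch (degrees) from a static accelerometer reading of gravity."""
    roll = math.degrees(math.atan2(ay, az))
    pitch = math.degrees(math.atan2(-ax, math.sqrt(ay * ay + az * az)))
    return roll, pitch

def integrate_gyro(angle_deg, rate_deg_per_s, dt_s):
    """Update an angle estimate by integrating gyroscope angular velocity over time."""
    return angle_deg + rate_deg_per_s * dt_s

def yaw_from_magnetometer(mx, my):
    """Yaw/heading (degrees) from horizontal magnetometer components,
    assuming the device is level (no tilt compensation shown)."""
    return math.degrees(math.atan2(my, mx))

# A level device reads roll and pitch near zero:
print(roll_pitch_from_accel(0.0, 0.0, 9.81))
```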
  • the electronic device 810 includes radios 930.
  • the radios 930 enable one or more communication networks, and allow the electronic device 810 to communicate with other devices, such as the movable object 102.
  • the radios 930 are capable of data communications using any of a variety of custom or standard wireless protocols (e.g., IEEE 802.15.4, Wi-Fi, ZigBee, 6LoWPAN, Thread, Z-Wave, Bluetooth Smart, ISA100.5A, WirelessHART, MiWi, Ultrawide Band (UWB) , software defined radio (SDR) , etc. ) , custom or standard wired protocols (e.g., Ethernet, HomePlug, etc. ) , and/or any other suitable communication protocol, including communication protocols not yet developed as of the filing date of this document.
  • the electronic device 810 includes a clock 922.
  • the clock 922 synchronizes (e.g., coordinates) time with the clock 152 of the movable object 102.
  • the memory 906 includes high-speed random access memory, such as DRAM, SRAM, DDR RAM, or other random access solid state memory devices; and, optionally, includes non-volatile memory, such as one or more magnetic disk storage devices, one or more optical disk storage devices, one or more flash memory devices, or one or more other non-volatile solid state storage devices.
  • the memory 906, optionally, includes one or more storage devices remotely located from one or more processor (s) 902.
  • the memory 906, or alternatively the non-volatile memory within the memory 906, includes a non-transitory computer-readable storage medium.
  • the memory 906, or the non-transitory computer-readable storage medium of the memory 906, stores the following programs, modules, and data structures, or a subset or superset thereof:
  • operating logic 932 including procedures for handling various basic system services and for performing hardware dependent tasks
  • a radio communication module 934 for connecting to and communicating with other network devices (e.g., a local network, such as a router that provides Internet connectivity, networked storage devices, network routing devices, server systems, the movable object 102, etc. ) coupled to one or more communication networks 802 via one or more communication interfaces 904 (wired or wireless) ;
  • a positioning module 936 for determining a position (e.g., positional coordinates) of the electronic device 810;
  • device data 938 for the electronic device 810, including but not limited to:
  • device settings 940 for the electronic device 810, such as default options and preferred user settings;
  • user settings 942, such as a proficiency level of the user (e.g., beginner user /low proficiency, medium proficiency, and/or expert user /high proficiency) ;
  • sensor data 944 that includes data from the first sensor 840 and data from the second sensor 850;
  • mapping data 948 that comprises mapping relationships between user movements (e.g., movements detected by the electronic device 810) and corresponding pitch, roll, and/or yaw values for the movable object 102.
  • a computation module 950 for translating the user movements into corresponding flight instructions (e.g., instructions to the movable object 102) .
  • the computation module 950 computes one or more parameters for controlling the movable object 102 based on the sensor data 944, such as an adjusted pitch value, an adjusted yaw value, and/or an adjusted roll value that is based on a combination of the measurements from the sensors 920, including the first sensor 840 and the second sensor 850.
  • each of the above identified modules is optionally stored in one or more of the memory devices described herein, and corresponds to a set of instructions for performing the functions described above.
  • the above identified modules or programs need not be implemented as separate software programs, procedures, modules or data structures, and thus various subsets of these modules may be combined or otherwise re-arranged in various implementations.
  • the memory 906 stores a subset of the modules and data structures identified above.
  • the memory 906, optionally, stores additional modules and data structures not described above (e.g., a microphone module for obtaining and/or analyzing audio signals in conjunction with microphone input devices, a module for voice detection and/or speech recognition in a voice-enabled smart speaker) .
  • a subset of the programs, modules, and/or data stored in the memory 906 are stored on and/or executed by a server system (e.g., computing device 126) .
  • Figure 10 illustrates an electronic device according to some embodiments.
  • Figure 10A illustrates the electronic device 810 that includes the first component 820 and the second component 830.
  • Figure 10B illustrates a first component 820 of the electronic device 810 according to some embodiments.
  • the first component 820 includes the first sensor 840 (e.g., sensors 920, Figure 9) .
  • the first sensor 840 is an orientation sensor.
  • the first sensor 840 is an IMU that comprises a combination of one or more of: an accelerometer, a gyroscope and a magnetometer.
  • the first sensor 840 determines a first roll value 1002, a first pitch value 1012, and/or a first yaw value 1022 from the measured sensor data.
  • the first sensor 840 detects user interaction (s) (e.g., movements and/or rotations) with the first component 820 and maps the user interaction (s) to a corresponding movement and/or rotation of the movable object 102.
  • the first component 820 is a handheld component and includes a front and a back.
  • the front and the back of the first component 820 directly map to the front and the back of the movable object 102 (see, e.g., Figure 7B) .
  • a movement of the first component 820 is directly mapped to a corresponding movement (e.g., rotation) of the movable object 102.
  • a user moves the first component 820 in a forward or backward direction (e.g., by flexing her wrist toward the front or back of the first component 820) .
  • the forward or backward movement corresponds to a first pitch value 1012 (e.g., a first pitch angle) .
  • the user moves the first component 820 from side to side (e.g., by rotating her wrist with respect to a long axis that is formed by her arm) .
  • the side-to-side movement corresponds to a first roll value 1002 (e.g., a first roll angle) .
  • the user may also rotate the first component 820 with respect to a vertical axis of the first component 820. In this instance, the movement corresponds to the first yaw value 1022.
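  • As an illustrative aside (not taken from the disclosure) , the direct mapping described above can be sketched in Python as follows; the field names and the small dead-band threshold are assumptions added only to make the example concrete.

```python
# Illustrative sketch: map an orientation reading from the handheld first
# component to candidate roll/pitch/yaw control values. The dead band that
# ignores very small tilts is an assumption, not taken from the disclosure.
from dataclasses import dataclass

@dataclass
class Orientation:
    roll_deg: float   # side-to-side rotation about the long axis of the arm
    pitch_deg: float  # forward/backward flex of the wrist
    yaw_deg: float    # rotation about the vertical axis of the component

def map_first_component(o: Orientation, dead_band_deg: float = 2.0) -> dict:
    """Directly map the handheld component's attitude to control values."""
    def clip(value: float) -> float:
        return 0.0 if abs(value) < dead_band_deg else value
    return {
        "roll": clip(o.roll_deg),
        "pitch": clip(o.pitch_deg),
        "yaw": clip(o.yaw_deg),
    }
```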
  • Figure 10C illustrates a second component 830 of the electronic device 810 according to some embodiments.
  • the second component 830 is a wearable component that is configured to be worn by a user (e.g., on a wrist or a forearm of the user) .
  • the second component 830 includes the second sensor 850 (e.g., sensors 920, Figure 9) .
  • the second sensor 850 is an orientation sensor.
  • the second sensor 850 is an IMU that comprises a combination of one or more of: an accelerometer, a gyroscope and a magnetometer.
  • the second sensor 850 determines a second roll value 1004, a second pitch value 1014, and/or a second yaw value 1024 from the second sensor data.
  • the second sensor 850 detects user movements that may arise due to (e.g., because of) the user interaction (s) with the first component 820. For example, when the user moves the first component 820 in a forward or backward direction (e.g., by flexing her wrist) , the user may also naturally move her arm. In some embodiments, the arm movement is detected by the second sensor 850.
  • the electronic device 810 uses a combination of sensor data from the first sensor 840 and the second sensor 850 (e.g., a combination of the first roll value 1002, the first pitch value 1012, the first yaw value 1022, the second roll value 1004, the second pitch value 1014, and/or the second yaw value 1024) to control the movable object 102.
  • the electronic device 810 selects either the first roll value 1002 or the second roll value 1004 as the roll value of the movable object 102.
  • the electronic device 810 selects (e.g., uses) either the first pitch value 1012 or the second pitch value 1014 as the pitch value of the movable object 102.
  • the electronic device 810 also selects either the first yaw value 1022 or the second yaw value 1024 as the yaw value of the movable object 102.
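  • The per-axis selection described in the preceding items can be sketched in Python as follows (an illustrative sketch, not part of the disclosure) ; the configuration flag that picks the source per axis is an assumption added for the example.

```python
# Hypothetical sketch: for each of roll, pitch, and yaw, use either the value
# from the first sensor 840 or the value from the second sensor 850.
def select_axis_values(first: dict, second: dict, use_first: dict) -> dict:
    """first/second hold {'roll', 'pitch', 'yaw'} values from the two sensors;
    use_first says, per axis, whether the first-sensor value is selected."""
    return {axis: first[axis] if use_first[axis] else second[axis]
            for axis in ("roll", "pitch", "yaw")}

# Example: roll and pitch come from the handheld component, yaw from the wearable.
params = select_axis_values(
    first={"roll": 5.0, "pitch": 10.0, "yaw": 3.0},
    second={"roll": 1.0, "pitch": 4.0, "yaw": 2.0},
    use_first={"roll": True, "pitch": True, "yaw": False},
)
```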
  • the electronic device 810 determines an adjusted roll value (e.g., a total roll value) that is based on a combination (e.g., addition or subtraction) of the first roll value 1002 and the second roll value 1004.
  • the electronic device 810 determines an adjusted pitch value (e.g., a total pitch value) that is based on a combination (e.g., addition or subtraction) of the first pitch value 1012 and the second pitch value 1014.
  • the electronic device 810 determines an adjusted yaw value (e.g., a total yaw value) that is based on a combination (e.g., addition or subtraction) of the first yaw value 1022 and the second yaw value 1024.
  • the electronic device 810 transmits to the movable object 102 one or more parameters that includes one or more of the adjusted roll value, the adjusted pitch value, and the adjusted yaw value.
  • the first example depicts a first scenario whereby a user generally raises or lowers her arm naturally when controlling the up and down movement of the movable object 102.
  • the first pitch value (e.g., pitch 1012) that is measured by the first sensor 840 may be bigger than an intended pitch control value for the movable object 102 because it includes the effects of the user arm movement.
  • the adjusted pitch value (e.g., Pitch_total) more accurately reflects the true pitch value that is intended for the movable object 102.
  • the flight of the movable object 102 can be more accurately determined. As a result, the user experience is enhanced.
  • the second example depicts a second scenario whereby a user raises or lowers her arm naturally (e.g., unintentionally) when controlling an upward or downward movement (e.g., a pitch) of the movable object 102, and therefore the first pitch value 1012 may be over-amplified.
  • the first yaw value (e.g., Yaw 1022) may be larger than the actual yaw value that is intended for the movable object 102 because it includes the effects of user body movement when the user is manipulating the remote controller (e.g., the first component 820) .
  • the electronic device 810 uses a weighted combination (e.g., weighted sum, weighted aggregation) of sensor values from the first sensor 840 and the second sensor 850 (e.g., a weighted combination of the first roll value 1002, the first pitch value 1012, the first yaw value 1022, the second roll value 1004, the second pitch value 1014, and/or the second yaw value 1024) to control the movable object 102.
  • the electronic device 810 determines an adjusted roll value (e.g., a total roll value) that is based on a weighted combination of the first roll value 1002 and the second roll value 1004, for example Roll_total = W_A × Roll 1002 + W_B × Roll 1004, or Roll_total = W_C × Roll 1002 − W_D × Roll 1004, where W_A, W_B, W_C and W_D are respective weights assigned to the first roll value 1002 and the second roll value 1004, and each of the weights W_A, W_B, W_C and W_D has a respective value between 0% and 100%, inclusive.
  • the electronic device 810 determines an adjusted pitch value (e.g., a total pitch value) that is based on a weighted combination of the first pitch value 1012 and the second pitch value 1014.
  • Pitch_total = W_E × Pitch 1012 + W_F × Pitch 1014
  • Pitch_total = W_G × Pitch 1012 − W_H × Pitch 1014
  • W_E, W_F, W_G and W_H are respective weights assigned to the first pitch value 1012 and the second pitch value 1014, and each of the weights W_E, W_F, W_G and W_H has a respective value between 0% and 100%, inclusive.
  • the electronic device 810 determines an adjusted yaw value (e.g., a total yaw value) that is based on a weighted combination of the first yaw value 1022 and the second yaw value 1024.
  • Yaw_total = W_I × Yaw 1022 + W_J × Yaw 1024
  • Yaw_total = W_K × Yaw 1022 − W_L × Yaw 1024
  • W_I, W_J, W_K and W_L are respective weights assigned to the first yaw value 1022 and the second yaw value 1024, and each of the weights W_I, W_J, W_K and W_L has a respective value between 0% and 100%, inclusive.
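  • The weighted combinations above (e.g., Pitch_total = W_E × Pitch 1012 + W_F × Pitch 1014) can be illustrated with the following minimal Python sketch; the concrete weight values and the sign convention in the example calls are assumptions for illustration only.

```python
# Sketch of an adjusted (total) value from a weighted combination of a
# first-sensor value and a second-sensor value. sign=+1 gives the additive
# form; sign=-1 subtracts the weighted second value (e.g., to remove the
# contribution of unintended arm movement).
def adjusted_value(first: float, second: float,
                   w_first: float, w_second: float, sign: int = -1) -> float:
    assert 0.0 <= w_first <= 1.0 and 0.0 <= w_second <= 1.0  # weights are 0%..100%
    return w_first * first + sign * w_second * second

pitch_total = adjusted_value(first=12.0, second=4.0, w_first=1.0, w_second=0.8)
roll_total = adjusted_value(first=6.0, second=1.5, w_first=1.0, w_second=0.0)  # zero weight ignores the second roll
yaw_total = adjusted_value(first=9.0, second=3.0, w_first=1.0, w_second=0.5)
```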
  • the electronic device 810 transmits to the movable object 102 one or more parameters that includes one or more of the adjusted roll value, the adjusted pitch value, and the adjusted yaw value.
  • the one or more parameters include an adjusted pitch value based on a weighted combination of the first pitch value 1012 and the second pitch value 1014.
  • the one or more parameters also include an adjusted roll value that is based on a weighted combination of the first roll value 1002 and the second roll value 1004.
  • the electronic device 810 determines (e.g., assigns) relative weights (e.g., weights 946, Figure 9) in the weighted combination according to a level of proficiency of the user (e.g., familiarity of the user) in using the electronic device 810 to control the movable object 102. For example, in some embodiments, before a user starts to interact with the electronic device 810, the user may be asked (e.g., via the display 918) to input a level of proficiency of controlling the movable object 102.
  • the electronic device 810 can assign lower weights to values from the second sensor 850 and assign higher weights to the values from the first sensor 840.
  • the adjusted pitch value comprises a subtraction of the weighted second pitch value from the weighted first pitch value because the wrist of the user may move in the same direction as the direction of the Pitch 1012, thus causing the pitch value to be amplified.
  • when the roll is determined primarily by the hand movement (e.g., movement of the first component 820) , the electronic device 810 may assign a weight of zero to the Roll 1004 value.
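  • A hypothetical illustration of proficiency-dependent weights (e.g., weights 946) is sketched below; the specific numeric values are invented for the example and are not part of the disclosure.

```python
# Assumed mapping from the user's proficiency level to the relative weights
# applied to the first sensor 840 and the second sensor 850: a more proficient
# user is assumed to produce less unintended arm movement, so less of the
# second-sensor signal is combined.
PROFICIENCY_WEIGHTS = {
    "beginner": (1.0, 0.8),  # (first-sensor weight, second-sensor weight)
    "medium": (1.0, 0.5),
    "expert": (1.0, 0.2),
}

def weights_for(proficiency: str) -> tuple:
    """Return (w_first, w_second) for a proficiency level, defaulting to medium."""
    return PROFICIENCY_WEIGHTS.get(proficiency, PROFICIENCY_WEIGHTS["medium"])
```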
  • Figure 11 illustrates an electronic device 810 according to some embodiments.
  • the electronic device includes a first component 820 and a second component 830.
  • the first component 820 comprises a main body of the electronic device 810.
  • the first component 820 is a handheld component and includes a front region and a back region, as illustrated in Figure 11 (and explained previously in Figure 10) .
  • the first component 820 can include an input interface (e.g., input interface 910) .
  • the first component 820 may also include input button (s) (e.g., button (s) 912) and/or a touchscreen interface (e.g., touchscreen interface 914) in some embodiments.
  • the first component includes the first sensor 840.
  • the second component 830 comprises a wearable component.
  • the second component 830 can be attached to (e.g., worn on) a wrist or a forearm of the user.
  • Figure 11 shows that the first component 820 and the second component 830 are mechanically coupled to each other via a link 1110.
  • the link 1110 comprises a first end 1112 that is rotatably coupled to the first component 820 via a first rotation node 1142.
  • the link 1110 also includes a second end 1114 that is rotatably coupled to the second component 830 via a second rotation node 1144.
  • each of the first rotation node 1142 and the second rotation node 1144 has rotational degrees of freedom in a respective pitch axis and a respective yaw axis.
  • the first rotational node 1142 and the second rotational node 1144 comprise rotational sensors that detect (e.g., sense and measure) rotation in a respective pitch, yaw, and/or roll direction (e.g., axis of rotation) .
  • the first rotational node 1142 measures a third pitch value 1122 and a third yaw value 1132.
  • the second rotational node 1144 measures a fourth pitch value 1124 and a fourth yaw value 1134.
  • Figures 12A and 12B illustrate representative views of the electronic device 810 according to some embodiments.
  • Figures 13A-13C illustrate a flowchart for a method 1300 performed (1302) at an electronic device (e.g., the electronic device 810 as described in Figures 8, 9, 10, 11, and 12) according to some embodiments.
  • the electronic device comprises a controller device for controlling a movable object (e.g., a UAV, a movable object 102 etc. ) .
  • the electronic device is communicatively connected (1304) (e.g., wirelessly connected, through the Internet, other cellular networks such as 4G and 5G networks, Bluetooth etc. ) with an unmanned aerial vehicle (UAV) (e.g., movable object 102) .
  • the electronic device includes (1306) an input interface (e.g., input interface 910, Figure 9) , a first sensor (e.g., first sensor 840 in Figures 8 and 10 or sensors 920, Figure 9) , a second sensor (e.g., second sensor 850 in Figures 8 and 10 or sensors 920 in Figure 9) , one or more processors (e.g., processor (s) 902, Figure 9) , and memory (e.g., memory 906, Figure 9) .
  • the input interface may also include input control button (s) (e.g., button (s) 912, Figure 9) and/or a touchscreen interface (e.g., touchscreen interface 914, Figure 9) .
  • the first sensor is (1308) an inertial measurement unit (IMU) that is attached to (e.g., mounted on and/or embedded in) the electronic device or to a component of the electronic device.
  • the first sensor measures and reports a force, angular rate, and/or orientation of the electronic device to which it is attached.
  • the first sensor comprises a combination of one or more of: an accelerometer, a gyroscope, and a magnetometer.
  • the second sensor is an inertial measurement unit (IMU) that is attached to (e.g., mounted on and/or embedded in) the electronic device or to a component of the electronic device.
  • the second sensor measures and reports a force, angular rate, and/or orientation of the electronic device to which it is attached.
  • the second sensor comprises a combination of one or more of: an accelerometer, a gyroscope, and a magnetometer.
  • the memory stores one or more programs and/or instructions that are executed by the one or more processors.
  • the electronic device includes (1310) a first component (e.g., first component 820) and a second component (e.g., second component 830) .
  • the first component 820 and the second component 830 are communicatively connected to each other (e.g., via Bluetooth, WiFi, other wireless signals, or via a hard-wired cable etc. ) .
  • the first component comprises a handheld component.
  • the first component includes the input interface.
  • the second component comprises a wearable component, such as a wristband structure that is worn on (e.g., attached to) a wrist or a forearm of the user.
  • the second component includes a sensor (e.g., the second sensor) for detecting movement (e.g., wrist and/or forearm movement, or elbow movement) as the user interacts with the first component.
  • the first component comprises a handheld component and the second component comprises a wearable component that is worn on the same arm that is used to hold the first component.
  • the first component comprises a handheld component and the second component is a wearable component that is worn on an arm different from the arm used to hold the first component.
  • the first component is utilized by a first user and the second component is utilized by a second user, distinct from the first user.
  • the second component is worn on a finger, the head, a leg, or a foot of the first user or the second user.
  • the first component and the second component are components of two distinct electronic devices that are communicatively connected to each other (e.g., via Bluetooth, WiFi, other cellular connections, or a wired cable connection etc. ) .
  • the electronic device detects (1312) a first user movement from the first sensor attached to a first body portion (e.g., fingers, palm, hand (e.g., fingers, thumb, and palm) ) of a user.
  • the electronic device also detects a second user movement from the second sensor attached to a second body portion (e.g., wrist, forearm, arm, elbow etc. ) of the user.
  • the first user movement represents (1314) a user instruction to control a flight of the UAV (e.g., movable object 102) .
  • the first user movement represents a user instruction to control a speed, trajectory, elevation, attitude, direction, and/or rotation etc. of the UAV.
  • the UAV flies (e.g., autonomously and/or by executing a flight path) in accordance with the user instruction.
  • the first user movement represents a user instruction to control a flight path of the UAV, such as a starting point (e.g., position and/or location of the UAV) , and/or an ending point (e.g., position and/or location of the UAV) , and/or a flight route of the UAV.
  • the second body portion is connected (1316) to the first body portion.
  • the electronic device determines (1318) one or more parameters associated with the user instruction to control the flight of the UAV based on an interaction between the first user movement and the second user movement.
  • the one or more parameters include a velocity (e.g., speed) of the UAV (e.g., having units of meters per second, miles per hour, kilometers per hour etc. ) , or a speed setting for the UAV (e.g., a low speed, a medium speed, or a high speed setting) .
  • each of the speed settings corresponds to an actual speed (or a range of speeds) of the UAV that are predetermined by a manufacturer of the UAV.
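  • A small, hypothetical sketch of such a mapping from a named speed setting to a manufacturer-defined speed range is shown below; the numeric ranges are invented for illustration.

```python
# Assumed speed ranges, in meters per second, for the named speed settings.
SPEED_SETTINGS = {
    "low": (0.0, 4.0),
    "medium": (4.0, 10.0),
    "high": (10.0, 18.0),
}

def speed_range(setting: str) -> tuple:
    """Return the (min, max) speed in m/s that corresponds to a speed setting."""
    return SPEED_SETTINGS[setting]
```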
  • the one or more parameters include a trajectory, an attitude, an elevation, a flight direction, and/or a rotation (e.g., an angular rotation) of the UAV.
  • the one or more parameters comprise a pitch, yaw, and/or roll value (e.g., pitch, roll and/or yaw angles) .
  • the one or more parameters comprise sensor values that are measured by the first sensor and/or the second sensor.
  • the one or more parameters include a pitch, roll, and/or yaw of a gimbal (e.g., a carrier 108) that is attached to the UAV (e.g., movable object 102) .
  • the electronic device transmits (1320) to the UAV a wireless signal that is based on (e.g., includes) the one or more parameters.
  • the UAV is configured to adjust (1322) the flight of the UAV in accordance with the one or more parameters.
  • the electronic device detects the first user movement and the second user movement simultaneously. In some embodiments, the electronic device detects the first user movement and the second user movement over a predefined time window (e.g., within 5 seconds, 10 seconds, 20 seconds, 30 seconds etc. of each other) .
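  • The time-window behavior can be sketched as follows; treating timestamps as seconds and the 5-second default are assumptions chosen only to mirror the example values above.

```python
# Hypothetical check that the first and second user movements belong to the
# same interaction because they were detected within a predefined time window.
def movements_are_paired(t_first_s: float, t_second_s: float,
                         window_s: float = 5.0) -> bool:
    return abs(t_first_s - t_second_s) <= window_s
```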
  • the second user movement comprises movements from the user due to (e.g., because of) the first user movement.
  • the second user movements comprise natural movements (e.g., unexpected, inevitable, unintentional movements) of the second body portion during the first user movement.
  • the second user movement over-amplifies (e.g., exaggerates) the user instruction to control the flight of the UAV. In some embodiments, the second user movement counteracts (e.g., reduces) the user instruction to control the flight of the UAV.
  • the one or more parameters comprise sensor parameters that are detected by the first and second sensors.
  • the electronic device transmits the sensor parameters to the UAV.
  • the movement comprises user movement of the hand (e.g., the first body portion) .
  • after the electronic device determines the one or more parameters associated with the user instruction to control the flight of the UAV, the electronic device generates a command in accordance with the one or more parameters.
  • the electronic device then transmits to the UAV a wireless signal that includes the command.
  • the UAV is configured to adjust the flight of the UAV in accordance with the one or more parameters.
  • the electronic device is used in conjunction with another electronic device (e.g., a second electronic device, such as another controller device of the UAV, a head-mounted display, and/or a combination thereof etc. ) .
  • after determining the one or more parameters, the electronic device generates a command in accordance with the one or more parameters.
  • the electronic device transmits the command to a second electronic device, which in turn transmits the command to the UAV.
  • the one or more parameters comprise sensor parameters (e.g., sensor values) that are detected (e.g., measured) by the first and second sensors.
  • the electronic device transmits the sensor parameters to another electronic device, which in turn transmits the sensor parameters to the UAV.
  • the UAV generates a command based on the sensor parameters, and adjusts the flight of the UAV in accordance with the command.
  • the one or more parameters comprise sensor parameters that are detected by the first and second sensors.
  • the electronic device transmits the sensor parameters to a second electronic device (e.g., another controller device of the UAV, a head-mounted display, and/or a combination thereof etc. ) , which in turn transmits the sensor parameters to the UAV.
  • the UAV generates a command based on the sensor parameters, and adjusts the flight of the UAV in accordance with the command.
  • the second electronic device transmits the sensor parameters of the electronic device as well as its own detected measurements.
  • the UAV generates a command based on the sensor parameters and the detected measurements from the second electronic device.
  • the UAV adjusts (1324) the flight of the UAV by adjusting a pitch, roll, and/or yaw of the UAV.
  • the user instruction to control a flight of the UAV comprises (1326) a first pitch value (e.g., pitch 1012) , a first roll value (e.g., roll 1002) , and/or a first yaw value (e.g., yaw 1022) .
  • the electronic device further determines (1328) a second pitch value (e.g., pitch 1014) , a second roll value (e.g., roll 1004) , and/or a second yaw value (e.g., yaw 1024) based on the second user movement from the second sensor.
  • the electronic device transmits to the UAV (e.g., either directly or indirectly via an intermediate device) one or more parameters that comprises the first pitch value, the first roll value, the first yaw value, the second pitch value, the second roll value, and/or the second yaw value for processing at the UAV.
  • the electronic device processes the first pitch value, the first roll value, the first yaw value, the second pitch value, the second roll value, and/or the second yaw value on the electronic device (e.g., via the computation module 950) .
  • the electronic device may compute combinations (e.g., a summation or a subtraction) or a weighted combination of one or more of: the first pitch value, the first roll value, the first yaw value, the second pitch value, the second roll value, and/or the second yaw value.
  • the electronic device transmits the processed values to the UAV.
  • the UAV adjusts a flight according to a combination of the first pitch value, the first roll value, and/or the first yaw value and the second pitch value, second roll value, and/or second yaw value.
  • the second pitch value, the second roll value, and/or the second yaw value can be used to adjust a corresponding pitch, roll, and/or yaw of a payload of the UAV (e.g., payload 110, Figure 2) .
  • the payload 110 can include an imaging device (e.g., imaging device 214, Figure 2C) . Therefore, by adjusting a corresponding pitch, roll, and/or yaw of the payload 110, the field of view of the imaging device (e.g., image sensor 216) is also modified (e.g., adjusted) .
  • the first example depicts a first scenario whereby a user generally raises or lowers her arm naturally when controlling the up and down movement of the movable object 102.
  • the first pitch value (e.g., pitch 1012) that is measured by the first sensor 840 may be bigger than an intended pitch control value for the movable object 102 because it includes the effects of the user arm movement.
  • the adjusted pitch value (e.g., Pitch_total) more accurately reflects the true pitch value that is intended for the movable object 102.
  • the flight of the movable object 102 can be more accurately determined. As a result, the user experience is enhanced.
  • the second example depicts a second scenario whereby a user raises or lowers her arm naturally (e.g., unintentionally) when controlling an upward or downward movement (e.g., a pitch) of the movable object 102, and therefore the first pitch value 1012 may be over-amplified.
  • the first yaw value (e.g., Yaw 1022) may be larger than the actual yaw value that is intended for the movable object 102 because it includes the effects of user body movement when the user is manipulating the remote controller (e.g., the first component 820) .
  • the electronic device determines the one or more parameters by determining (1332) a weighted combination (e.g., weighted sum, weighted aggregation) that includes a plurality of the first pitch value, the first roll value, the first yaw value, the second pitch value, the second roll value, and the second yaw value.
  • the weighted combination comprises (1334) a weighted combination of the first pitch value (e.g., pitch 1012) and the second pitch value (e.g., pitch 1014) .
  • the weighted combination comprises (1336) a weighted combination of the first roll value (e.g., roll 1002) and the second roll value (e.g., roll 1004) .
  • the weighted combination comprises (1338) a weighted combination of the first yaw value (e.g., yaw 1022) and the second yaw value (e.g., yaw 1024) .
  • the electronic device assigns (1342) respective first weights to the first pitch value, the first roll value, and/or the first yaw value.
  • the electronic device also assigns (1346) respective second weights to the second pitch value, the second roll value, and/or the second yaw value.
  • the weighted combination is further determined (1350) based on the respective assigned first weights and the respective assigned second weights.
  • the electronic device 810 determines (e.g., assigns) relative weights (e.g., weights 946, Figure 9) in the weighted combination according to a level of proficiency of the user (e.g., familiarity of the user) in using the electronic device 810 to control the movable object 102. For example, in some embodiments, before a user starts to interact with the electronic device 810, the user may be asked (e.g., via the display 918) to input a level of proficiency of controlling the movable object 102.
  • the electronic device 810 can assign lower weights to values from the second sensor 850 and assign higher weights to the values from the first sensor 840.
  • the adjusted pitch value comprises a subtraction of the weighted second pitch value from the weighted first pitch value because the wrist of the user may move in the same direction as the direction of the Pitch 1012, thus causing the pitch value to be amplified.
  • when the roll is determined primarily by the hand movement (e.g., movement of the first component 820) , the electronic device 810 may assign a weight of zero to the second roll (e.g., Roll 1004) value.
  • At least one of the first weights has (1344) a value of zero.
  • At least one of the second weights has (1348) a value of zero.
  • the electronic device includes (1352) a first component (e.g., first component 820, Figure 11) and a second component (e.g., second component 830, Figure 11) that is mechanically coupled to the first component via a link.
  • the first component 820 comprises a main body of the electronic device 810.
  • the first component includes the input interface (e.g., input interface 910) .
  • the first component 820 may also include input button (s) (e.g., button (s) 912) and/or a touchscreen interface (e.g., touchscreen interface 914) in some embodiments.
  • the second component 830 is attached to (e.g., worn by) a user.
  • the second component can be worn on the wrist or a forearm of the user.
  • the link (e.g., link 1110, Figure 11) comprises (1354) a first end (e.g., first end 1112, Figure 11) that is rotatably coupled to the first component and a second end (e.g., second end 1114, Figure 11) that is rotatably coupled to the second component.
  • the first end 1112 is rotatably coupled to the first component 820 via a first rotation node 1142.
  • the second end 1114 is rotatably coupled to the second component 830 via a second rotation node 1144.
  • the first rotational node 1142 and the second rotational node 1144 comprise rotational sensors that detect (e.g., sense and measure) rotation in a respective pitch, yaw, and/or roll direction (e.g., axis of rotation) .
  • the first component includes the input interface.
  • the electronic device further includes a third sensor.
  • the first sensor 840 is positioned on the first component.
  • the second sensor 850 (e.g., a rotational sensor, such as a pitch/yaw/roll sensor, or an IMU) is positioned on the second end of the link (e.g., at the second rotational node 1144) .
  • the third sensor (e.g., a rotational sensor, such as a pitch/yaw/roll sensor) is positioned on the first end of the link (e.g., at the first rotational node 1142) .
  • the second sensor (e.g., a rotational sensor, such as a pitch/yaw/roll sensor) is configured to measure respective rotation angles at the second end.
  • the second sensor is a pitch/yaw/roll angle sensor and is configured to measure the respective rotation angles with respect to each axis of rotation at the second end.
  • the axis of rotation at the second end includes a pitch axis of rotation, a yaw axis of rotation, and/or a roll axis of rotation.
  • the third sensor is configured to measure respective rotation angles at the first end.
  • the third sensor is a pitch/yaw/roll angle sensor and is configured to measure the respective rotation angles with respect to each axis of rotation at the first end.
  • the axis of rotation at the first end includes a pitch axis of rotation, a yaw axis of rotation, and/or a roll axis of rotation.
  • the respective rotation angles at the first end include two or more of: a first pitch angle, a first roll angle, and/or a first yaw angle.
  • the respective rotation angles at the second end include two or more of: a second pitch angle, a second roll angle, and/or a second yaw angle.
  • the electronic device determines the one or more parameters of a command by determining (1356) a combined rotation angle based on a combination (e.g., an addition, a subtraction, a weighted sum, a weighted subtraction etc. ) of one or more of: the first pitch angle and the second pitch angle; the first roll angle and the second roll angle; and the first yaw angle and the second yaw angle.
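  • A minimal sketch of such a combined rotation angle is given below; the equal default weights and the boolean that switches between the additive and subtractive forms are assumptions added for illustration.

```python
# Combine, per axis, a rotation angle measured at the first end of the link
# (e.g., at the first rotational node 1142) with the corresponding angle
# measured at the second end (e.g., at the second rotational node 1144).
def combined_rotation_angle(angle_first_end: float, angle_second_end: float,
                            w_first: float = 1.0, w_second: float = 1.0,
                            subtract: bool = False) -> float:
    second = -angle_second_end if subtract else angle_second_end
    return w_first * angle_first_end + w_second * second

combined_pitch = combined_rotation_angle(15.0, 5.0, subtract=True)  # pitch axis example
```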
  • Exemplary processing systems include, without limitation, one or more general purpose microprocessors (for example, single or multi-core processors) , application-specific integrated circuits, application-specific instruction-set processors, field-programmable gate arrays, graphics processing units, physics processing units, digital signal processing units, coprocessors, network processing units, audio processing units, encryption processing units, and the like.
  • memory (e.g., memory 118, 504, 604) can include, but is not limited to, any type of disk including floppy disks, optical discs, DVD, CD-ROMs, microdrive, and magneto-optical disks, ROMs, RAMs, EPROMs, EEPROMs, DRAMs, VRAMs, DDR RAMs, flash memory devices, magnetic or optical cards, nanosystems (including molecular memory ICs) , or any type of media or device suitable for storing instructions and/or data.
  • features of the present invention can be incorporated in software and/or firmware for controlling the hardware of a processing system, and for enabling a processing system to interact with other mechanisms utilizing the results of the present invention.
  • software or firmware may include, but is not limited to, application code, device drivers, operating systems and execution environments/containers.
  • Communication systems as referred to herein optionally communicate via wired and/or wireless communication connections.
  • communication systems optionally receive and send RF signals, also called electromagnetic signals.
  • RF circuitry of the communication systems convert electrical signals to/from electromagnetic signals and communicate with communications networks and other communications devices via the electromagnetic signals.
  • RF circuitry optionally includes well-known circuitry for performing these functions, including but not limited to an antenna system, an RF transceiver, one or more amplifiers, a tuner, one or more oscillators, a digital signal processor, a CODEC chipset, a subscriber identity module (SIM) card, memory, and so forth.
  • Communication systems optionally communicate with networks, such as the Internet, also referred to as the World Wide Web (WWW) , an intranet and/or a wireless network, such as a cellular telephone network, a wireless local area network (LAN) and/or a metropolitan area network (MAN) , and other devices by wireless communication.
  • Wireless communication connections optionally use any of a plurality of communications standards, protocols and technologies, including but not limited to Global System for Mobile Communications (GSM) , Enhanced Data GSM Environment (EDGE) , high-speed downlink packet access (HSDPA) , high-speed uplink packet access (HSUPA) , Evolution-Data Only (EV-DO) , HSPA, HSPA+, Dual-Cell HSPA (DC-HSPDA) , long term evolution (LTE) , near field communication (NFC) , wideband code division multiple access (W-CDMA) , code division multiple access (CDMA) , time division multiple access (TDMA) , Bluetooth, Wireless Fidelity (Wi-Fi) (e.g., IEEE 802.11a, IEEE 802.11ac, IEEE 802.11ax, IEEE 802.11b, IEEE 802.11g and/or IEEE 802.11n) , voice over Internet Protocol (VoIP) , Wi-MAX, a protocol for e-mail
  • the term “if” may be construed to mean “when” or “upon” or “in response to determining” or “in accordance with a determination” or “in response to detecting” that a stated condition precedent is true, depending on the context.
  • the phrase “if it is determined [that a stated condition precedent is true] ” or “if [a stated condition precedent is true] ” or “when [a stated condition precedent is true] ” may be construed to mean “upon determining” or “in response to determining” or “in accordance with a determination” or “upon detecting” or “in response to detecting” that the stated condition precedent is true, depending on the context.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Computing Systems (AREA)
  • Mathematical Physics (AREA)
  • Human Computer Interaction (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

An electronic device is communicatively connected with an unmanned aerial vehicle (UAV). The device includes an input interface, a first sensor, a second sensor, one or more processors, and memory. The device detects a first user movement from the first sensor attached to a first body portion of a user. It also detects a second user movement from the second sensor attached to a second body portion of the user. The first user movement represents a user instruction to control a flight of the UAV. The device determines one or more parameters associated with the user instruction to control the flight of the UAV based on an interaction between the first user movement and the second user movement. The device transmits to the UAV a wireless signal that is based on the parameters. The UAV adjusts the flight of the UAV in accordance with the parameters.

Description

Systems and Methods for Controlling an Unmanned Aerial Vehicle Using a Body-Attached Remote Control
TECHNICAL FIELD
The disclosed embodiments relate generally to unmanned aerial vehicle (UAV) technology, and more specifically, to systems and methods for controlling a UAV using a body-attached controller device.
BACKGROUND
Movable objects can be used for performing surveillance, reconnaissance, and exploration tasks for military and civilian applications. An unmanned aerial vehicle (UAV) (e.g., a drone) is an example of a movable object. A movable object may carry a payload for performing specific functions such as capturing images and video of a surrounding environment of the movable object or for tracking a target object. For example, a movable object may track a target object that is stationary, or moving on the ground or in the air. Movement control information for controlling a movable object is typically received by the movable object via a remote device (e.g., a controller device) and/or determined by the movable object.
SUMMARY
The advancement of UAV (e.g., drone) technology has enabled UAV aerial photography and videography. Generally, a user who intends to capture images and/or video of a specific target object using UAV aerial photography and videography technology will control the UAV to fly toward the target object, and/or provide to the UAV instructions such as positional information of the target object, so that the UAV can execute a flight toward the target object. In some embodiments, a user uses a remote controller device (e.g., a drone controller device) that includes input button (s) , joystick (s) , or a combination thereof, to input commands to control a flight of the UAV, such as to adjust a pitch, roll, yaw, and/or throttle  of the UAV. In some embodiments, the controller device is a handheld device that requires input from both hands of a user. For example, the controller device may include a left input (e.g., a left joystick) that enables the user to control a roll and pitch of the UAV. The controller device may also include a right input (e.g., a right joystick) that enables the user to control a yaw and throttle of the UAV. Furthermore, the controller device may also include additional buttons and/or knobs that provide additional features and/or options to adjust a sensitivity of the pitch, roll, yaw, and/or throttle inputs. In some embodiments, the UAV is also coupled to a camera (e.g., an imaging device) via a gimbal (e.g., a carrier) . The user also controls a corresponding pitch /roll /yaw of the gimbal to adjust the camera to an optimal position for aerial photography and videography. In some embodiments, the numerous input controls of the UAV, and the combination of flight, gimbal, and camera parameters that need to be adjusted, make the operation of the UAV extremely cumbersome.
In some embodiments, a controller device comprises a handheld device that requires inputs from one hand of the user to control a UAV. For example, in some embodiments, a user can control a UAV by pointing the controller device toward the UAV, and/or by holding the controller device in the user’s hand and moving (e.g., gesturing) with the controller device in-hand. In some embodiments, the controller device detects (e.g., senses and measures) the movement (e.g., user movement) in three dimensions (e.g., in the x-, y-, and z-axes) , and maps (e.g., directly maps) the movement and/or rotation of the controller device to a corresponding movement and/or rotation of the UAV. However, such mapping techniques can lead to errors and inaccuracies in some circumstances. For example, the controller device will detect all movements of the user as long as the user is holding the controller device, regardless of whether the user movements are actually user instructions that are directed to controlling the UAV. In some situations, a user may want to control the UAV via the controller device by moving one or more fingers of the user and/or rotating a wrist of the user. During the user interaction with the controller device, the user may also (e.g., naturally, inherently, unintentionally or inadvertently etc. ) move other portions of the user’s body, such as move (e.g., swing, rotate etc. ) an elbow and/or an arm. In some instances, the controller device may detect a combined movement from the user (e.g., from both fingers/wrist and elbow/arm) and map the combined user movement to a corresponding movement and/or rotation of the UAV. Thus, natural (e.g., unintentional) movements from other portions of the user’s body can sometimes lead to an exaggeration (e.g., over-amplification) of the actual user instruction to control the UAV. In some instances, the movements from other portions of the user’s body can also counteract (e.g., reduce) the actual user instruction.
Accordingly, there is a need for improved systems, devices, and methods that enable easier and more convenient control of a UAV. There is also a need for improved controller devices and systems that detect user instructions to control the UAV (through user movement) with greater accuracy, and to take into account possible errors (e.g., systematic errors) and inaccuracies that may arise due to user movements that are not actual instructions to control the UAV.
Aspects of the present disclosure describe an electronic device that comprises a first component and a second component. In some embodiments, the first component is a handheld component. The second component is a wearable component that is worn on a wrist, forearm, arm, etc. of a user. In some embodiments, the first component includes a first sensor. The second component includes a second sensor. The first sensor detects user interaction (s) (e.g., movements and/or rotations) with the first component and maps the user interaction (s) to a corresponding movement and/or rotation of a UAV. For example, in some embodiments, the user interactions are mapped to (e.g., translated into) a first roll value, a first pitch value, and/or a first yaw value. The second sensor detects user movements that may arise due to (e.g., because of) the user interaction (s) with the first component. For example, in some embodiments, the user movements are mapped to (e.g., translated into) a second roll value, a second pitch value, and/or a second yaw value. In some embodiments, the electronic device determines an adjusted roll, pitch, and/or yaw value that is based on a combination of the first roll value, the first pitch value, the first yaw value, the second roll value, the second pitch value, and/or the second yaw value, and transmits to the UAV one or more parameters that include the adjusted roll, pitch, and/or yaw value (s) , to control a flight of the UAV. Thus, the disclosed systems, devices, and methods take into account the  effects of unintended user movement. The disclosed systems, devices, and methods also transmit to the UAV adjusted roll, pitch, and/or yaw value (s) that more accurately reflect the user instructions. Accordingly, the user experience is enhanced.
In accordance with some embodiments of the present disclosure, a method is performed at an electronic device that is communicatively connected with an unmanned aerial vehicle (UAV) . The electronic device includes an input interface, a first sensor, a second sensor, one or more processors, and memory. The memory stores one or more instructions for execution by the one or more processors. The electronic device detects a first user movement from the first sensor attached to a first body portion of a user. The electronic device detects a second user movement from the second sensor attached to a second body portion of the user. The first user movement represents a user instruction to control a flight of the UAV. The second body portion is connected to the first body portion. The electronic device determines one or more parameters associated with the user instruction to control the flight of the UAV based on an interaction between the first user movement and the second user movement. The electronic device transmits to the UAV a wireless signal that is based on the one or more parameters. The UAV is configured to adjust the flight of the UAV in accordance with the one or more parameters.
In some embodiments, adjusting the flight of the UAV comprises adjusting a pitch, roll, and/or yaw of the UAV.
In some embodiments, the electronic device includes a first component and a second component that is communicatively connected with the first component.
In some embodiments, the user instruction to control a flight of the UAV comprises a first pitch value, a first roll value, and/or a first yaw value. The method further comprises determining a second pitch value, a second roll value, and/or a second yaw value based on the second user movement from the second sensor.
In some embodiments, determining the one or more parameters comprises determining one or more of: an adjusted pitch value based on a combination of the first pitch value and the second pitch value; an adjusted roll value based on a combination of the first roll value and  the second roll value; and an adjusted yaw value based on a combination of the first yaw value and the second yaw value.
In some embodiments, determining the one or more parameters comprises determining a weighted combination that includes a plurality of the first pitch value, the first roll value, the first yaw value, the second pitch value, the second roll value, and the second yaw value.
In some embodiments, the weighted combination comprises a weighted combination of the first pitch value and the second pitch value.
In some embodiments, the weighted combination comprises a weighted combination of the first roll value and the second roll value.
In some embodiments, the weighted combination comprises a weighted combination of the first yaw value and the second yaw value.
In some embodiments, the method further comprises: prior to determining the weighted combination: assigning respective first weights to the first pitch value, the first roll value, and/or the first yaw value; and assigning respective second weights to the second pitch value, the second roll value, and/or the second yaw value. The weighted combination is further determined based on the respective assigned first weights and the respective assigned second weights.
In some embodiments, at least one of the first weights has a value of zero.
In some embodiments, at least one of the second weights has a value of zero.
In some embodiments, the electronic device includes a first component and a second component that is mechanically coupled to the first component via a link. The link comprises a first end that is rotatably coupled to the first component. The link also comprises a second end that is rotatably coupled to the second component.
In some embodiments, the first component includes the input interface.
In some embodiments, the electronic device further includes a third sensor. The first sensor is positioned on the first component. The second sensor is positioned on the second end of the link. The third sensor is positioned on the first end of the link.
In some embodiments, the second sensor is configured to measure respective rotation angles at the second end. The third sensor is configured to measure respective rotation angles at the first end.
In some embodiments, the respective rotation angles at the first end include two or more of: a first pitch angle, a first roll angle, and/or a first yaw angle. The respective rotation angles at the second end include two or more of: a second pitch angle, a second roll angle, and/or a second yaw angle.
In some embodiments, determining one or more parameters of a command comprises determining a combined rotation angle based on a combination of one or more of: the first pitch angle and the second pitch angle; the first roll angle and the second roll angle; and the first yaw angle and the second yaw angle.
In some embodiments, the first sensor is an inertial measurement unit sensor.
In some embodiments, an electronic device comprises an input interface, a first sensor, a second sensor, one or more processors, and memory. The memory stores one or more programs configured for execution by the one or more processors. The one or more programs include instructions for performing any of the methods described herein.
In some embodiments, a non-transitory computer-readable storage medium stores one or more programs configured for execution by an electronic device having an input interface, a first sensor, a second sensor, one or more processors, and memory. The one or more programs include instructions for performing any of the methods described herein.
Thus, methods, systems, and devices are disclosed that enable easier, more accurate, and more convenient control of a UAV, thereby facilitating aerial photography and videography using a UAV.
BRIEF DESCRIPTION OF THE DRAWINGS
For a better understanding of the aforementioned systems, methods, and graphical user interfaces, as well as additional systems, methods, and graphical user interfaces that provide UAV video capture and video editing, reference should be made to the Description of  Embodiments below, in conjunction with the following drawings in which like reference numerals refer to corresponding parts throughout the figures.
Figure 1 illustrates an exemplary target identification and tracking system according to some embodiments.
Figures 2A to 2C illustrate respectively, an exemplary movable object, an exemplary carrier of a movable object, and an exemplary payload of a movable object according to some embodiments.
Figure 3 illustrates an exemplary sensing system of a movable object according to some embodiments.
Figures 4A and 4B illustrate a block diagram of an exemplary memory of a movable object according to some embodiments.
Figure 5 illustrates an exemplary control unit of a target tracking system according to some embodiments.
Figure 6 illustrates an exemplary computing device for controlling a movable object according to some embodiments.
Figures 7A and 7B illustrate an exemplary configuration of a movable object, a carrier, and a payload according to some embodiments.
Figure 8 illustrates an exemplary operating environment according to some embodiments.
Figure 9 is a block diagram illustrating a representative electronic device according to some embodiments.
Figure 10 illustrates an electronic device according to some embodiments.
Figure 11 illustrates an electronic device according to some embodiments.
Figures 12A and 12B illustrate representative views of an electronic device according to some embodiments.
Figures 13A-13C illustrate a flowchart for a method performed at an electronic device according to some embodiments.
DESCRIPTION OF EMBODIMENTS
Reference will now be made in detail to embodiments, examples of which are illustrated in the accompanying drawings. In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present invention. However, it will be apparent to one of ordinary skill in the art that the present invention may be practiced without requiring these specific details. In other instances, well-known methods, procedures, components, circuits, and networks have not been described in detail so as not to unnecessarily obscure aspects of the embodiments.
The following description uses an unmanned aerial vehicle (UAV) (e.g., a drone) as an example of a movable object. UAVs may include, for example, fixed-wing aircrafts and/or rotary-wing aircrafts such as helicopters, quadcopters, and aircraft having other numbers and/or configurations of rotors. It will be apparent to those skilled in the art that other types of movable objects may be substituted for UAVs as described below in accordance with embodiments of the invention.
Figure 1 illustrates an exemplary target identification and tracking system 100 according to some embodiments. The target identification and tracking system 100 includes a movable object 102 (e.g., a UAV) and a control unit 104. In some embodiments, the movable object 102 is also referred to as a movable device (e.g., a movable electronic device) . In some embodiments, the target identification and tracking system 100 is used for identifying a target object 106 and/or for initiating tracking of the target object 106.
In some embodiments, the target object 106 includes natural and/or man-made objects, such as geographical landscapes (e.g., mountains, vegetation, valleys, lakes, and/or rivers) , buildings, and/or vehicles (e.g., aircrafts, ships, cars, trucks, buses, vans, and/or motorcycles) . In some embodiments, the target object 106 includes live subjects such as people and/or animals. In some embodiments, the target object 106 is a moving object, e.g., moving relative to a reference frame (such as the Earth and/or movable object 102) . In some embodiments, the target object 106 is static. In some embodiments, the target object 106 includes an active positioning and navigational system (e.g., a GPS system) that transmits information (e.g., location, positioning, and/or velocity information) about the target object 106 to the movable object 102, a control unit 104, and/or a computing device 126. For example, information may be transmitted to the movable object 102 via wireless communication from a communication unit of the target object 106 to a communication system 120 of the movable object 102, as illustrated in Figure 2A.
In some embodiments, the movable object 102 includes a carrier 108 and/or a payload 110. The carrier 108 is used to couple the payload 110 to the movable object 102. In some embodiments, the carrier 108 includes an element (e.g., a gimbal and/or damping element) to isolate the payload 110 from movement of the movable object 102. In some embodiments, the carrier 108 includes an element for controlling movement of the payload 110 relative to the movable object 102.
In some embodiments, the payload 110 is coupled (e.g., rigidly coupled) to the movable object 102 (e.g., coupled via the carrier 108) such that the payload 110 remains substantially stationary relative to movable object 102. For example, the carrier 108 may be coupled to the payload 110 such that the payload is not movable relative to the movable object 102. In some embodiments, the payload 110 is mounted directly to the movable object 102 without requiring the carrier 108. In some embodiments, the payload 110 is located partially or fully within the movable object 102.
In some embodiments, the movable object 102 is configured to communicate with the control unit 104, e.g., via wireless communications 124. For example, the movable object 102 may receive control instructions from the control unit 104 (e.g., via a user of the movable object 102) and/or send data (e.g., data from a movable object sensing system 122, Figure 2A) to the control unit 104.
In some embodiments, the control instructions may include, e.g., navigation instructions for controlling one or more navigational parameters of the movable object 102 such as a position, an orientation, an altitude, an attitude (e.g., aviation) and/or one or more movement characteristics of the movable object 102. In some embodiments, the control instructions may include instructions for controlling one or more parameters of a carrier 108 and/or a payload 110. In some embodiments, the control instructions include instructions for directing  movement of one or more of movement mechanisms 114 (Figure 2A) of the movable object 102. For example, the control instructions may be used to control a flight of the movable object 102. In some embodiments, the control instructions may include information for controlling operations (e.g., movement) of the carrier 108. For example, the control instructions may be used to control an actuation mechanism of the carrier 108 so as to cause angular and/or linear movement of the payload 110 relative to the movable object 102. In some embodiments, the control instructions are used to adjust one or more operational parameters for the payload 110, such as instructions for capturing one or more images, capturing video, adjusting a zoom level, powering on or off a component of the payload, adjusting an imaging mode (e.g., capturing still images or capturing video) , adjusting an image resolution, adjusting a focus, adjusting a viewing angle, adjusting a field of view, adjusting a depth of field, adjusting an exposure time, adjusting a shutter speed, adjusting a lens speed, adjusting an ISO, changing a lens and/or moving the payload 110 (and/or a part of payload 110, such as imaging device 214 (shown in Figure 2C) ) . In some embodiments, the control instructions are used to control the communication system 120, the sensing system 122, and/or another component of the movable object 102.
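As a purely illustrative aid, and not as part of the disclosure, the kinds of payload operational parameters enumerated above could be represented in a simple structured instruction such as the following Python sketch; every field name is invented for the example, and a real implementation would follow the message format used by the communication system 120.

```python
# Hypothetical container for a payload-control instruction; only the categories
# of parameters mirror the description above, and the field names are assumptions.
from dataclasses import dataclass
from typing import Optional

@dataclass
class PayloadControlInstruction:
    capture_mode: Optional[str] = None      # e.g., "still" or "video"
    zoom_level: Optional[float] = None
    image_resolution: Optional[str] = None
    exposure_time_s: Optional[float] = None
    shutter_speed_s: Optional[float] = None
    iso: Optional[int] = None
    field_of_view_deg: Optional[float] = None
    power_on: Optional[bool] = None
```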
In some embodiments, the control instructions from the control unit 104 may include instructions to initiate tracking of a target object 106. For example, the control instructions may include information about the target object 106, such as identification of the target object 106, a location of the target object 106, a time duration during which the target object 106 is to be tracked, and/or other information. The movable object 102 identifies and initiates tracking in accordance with the instructions. In some embodiments, after tracking of the target object has been initiated, the movable object 102 may receive another set of instructions from the control unit 104 (e.g., via the user) to stop tracking the target object 106. In some circumstances, the movable object 102 may pause or stop tracking the target object 106 when the target object 106 is no longer present (or visible) in the field of view of the movable object 102 after a certain time period (e.g., 5 minutes or 10 minutes) . In some embodiments, after the tracking has been paused or stopped, the movable object 102 may receive further instructions to resume tracking the target object 106.
In some embodiments, as illustrated in Figure 1, the movable object 102 is configured to communicate with a computing device 126 (e.g., an electronic device, a computing system, and/or a server system) . For example, the movable object 102 receives control instructions from the computing device 126 and/or sends data (e.g., data from the movable object sensing system 122) to the computing device 126. In some embodiments, communications from the computing device 126 to the movable object 102 are transmitted from the computing device 126 to a cell tower 130 (e.g., via the Internet 128 and/or cellular networks such as 4G and 5G networks) and from the cell tower 130 to the movable object 102 (e.g., via RF signals) . In some embodiments, a satellite is used in lieu of or in addition to the cell tower 130.
In some embodiments, the target identification and tracking system 100 includes additional control units 104 and/or computing devices 126 that are configured to communicate with the movable object 102.
Figure 2A illustrates an exemplary movable object 102 according to some embodiments. In some embodiments, the movable object 102 includes processor (s) 116, memory 118, a communication system 120, a sensing system 122, a clock 152, and radio (s) 154, which are connected by data connections such as a control bus 112. The control bus 112 optionally includes circuitry (sometimes called a chipset) that interconnects and controls communications between system components.
In some embodiments, the movable object 102 is a UAV and includes components to enable flight and/or flight control. Although the movable object 102 is depicted as an aircraft in this example, this depiction is not intended to be limiting, and any suitable type of movable object may be used.
In some embodiments, the movable object 102 includes movement mechanisms 114 (e.g., propulsion mechanisms) . Although the plural term “movement mechanisms” is used herein for convenience of reference, “movement mechanisms 114” may refer to a single movement mechanism (e.g., a single propeller) or multiple movement mechanisms (e.g., multiple rotors) . The movement mechanisms 114 may include one or more movement mechanism types such as rotors, propellers, blades, engines, motors, wheels, axles, magnets, and nozzles. The movement mechanisms 114 are coupled to the movable object 102 at, e.g., the top, bottom, front, back, and/or sides. In some embodiments, the movement mechanisms 114 of a single movable object 102 may include multiple movement mechanisms each having the same type. In some embodiments, the movement mechanisms 114 of a single movable object 102 include multiple movement mechanisms with different movement mechanism types. The movement mechanisms 114 are coupled to the movable object 102 using any suitable means, such as support elements (e.g., drive shafts) or other actuating elements (e.g., one or more actuators 132) . For example, the actuator 132 (e.g., movable object actuator) receives control signals from processor (s) 116 (e.g., via control bus 112) that activate the actuator 132 to cause movement of a movement mechanism 114. For example, the processor (s) 116 include an electronic speed controller that provides control signals to the actuators 132.
In some embodiments, the movement mechanisms 114 enable the movable object 102 to take off vertically from a surface or land vertically on a surface without requiring any horizontal movement of the movable object 102 (e.g., without traveling down a runway) . In some embodiments, the movement mechanisms 114 are operable to permit the movable object 102 to hover in the air at a specified position and/or orientation. In some embodiments, one or more of the movement mechanisms 114 are controllable independently of one or more of the other movement mechanisms 114. For example, when the movable object 102 is a quadcopter, each rotor of the quadcopter is controllable independently of the other rotors of the quadcopter. In some embodiments, multiple movement mechanisms 114 are configured for simultaneous movement.
In some embodiments, the movement mechanisms 114 include multiple rotors that provide lift and/or thrust to the movable object 102. The multiple rotors are actuated to provide, e.g., vertical takeoff, vertical landing, and hovering capabilities to the movable object 102. In some embodiments, one or more of the rotors spin in a clockwise direction, while one or more of the rotors spin in a counterclockwise direction. For example, the number of clockwise rotors is equal to the number of counterclockwise rotors. In some embodiments, the rotation rate of each of the rotors is independently variable, e.g., for controlling the lift and/or thrust produced by each rotor, and thereby adjusting the spatial  disposition, velocity, and/or acceleration of the movable object 102 (e.g., with respect to up to three degrees of translation and/or up to three degrees of rotation) .
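To make the idea of independently variable rotor rates concrete, a common approach is a "motor mixer" that converts throttle and attitude commands into per-rotor outputs. The sketch below is a generic illustration for an X-configuration quadcopter, not a description of the disclosed system; the sign conventions, rotor ordering, and clamping range are assumptions.

def mix_quadcopter(throttle, roll, pitch, yaw):
    """Map normalized throttle/roll/pitch/yaw commands to four rotor outputs.

    Rotors are ordered front-left, front-right, rear-right, rear-left.
    The roll/pitch terms raise thrust on one side of the airframe, and the
    yaw term exploits the counter-torque difference between clockwise and
    counterclockwise rotors (front-left and rear-right assumed clockwise).
    """
    m_fl = throttle + roll + pitch - yaw
    m_fr = throttle - roll + pitch + yaw
    m_rr = throttle - roll - pitch - yaw
    m_rl = throttle + roll - pitch + yaw
    # Clamp each output to the assumed normalized actuator range 0..1.
    return [min(max(m, 0.0), 1.0) for m in (m_fl, m_fr, m_rr, m_rl)]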
In some embodiments, the memory 118 stores one or more instructions, programs (e.g., sets of instructions) , modules, controlling systems and/or data structures, collectively referred to as “elements” herein. One or more elements described with regard to the memory 118 are optionally stored by the control unit 104, the computing device 126, and/or another device. In some embodiments, an imaging device 214 (Figure 2C) includes memory that stores one or more parameters described with regard to the memory 118.
In some embodiments, the memory 118 stores a controlling system configuration that includes one or more system settings (e.g., as configured by a manufacturer, administrator, and/or user) . For example, identifying information for the movable object 102 is stored as a system setting of the system configuration. In some embodiments, the controlling system configuration includes a configuration for the imaging device 214. The configuration for the imaging device 214 stores parameters such as position (e.g., relative to the image sensor 216) , a zoom level and/or focus parameters (e.g., amount of focus, selecting autofocus or manual focus, and/or adjusting an autofocus target in an image) . Imaging property parameters stored by the imaging device configuration include, e.g., image resolution, image size (e.g., image width and/or height) , aspect ratio, pixel count, quality, focus distance, depth of field, exposure time, shutter speed, and/or white balance. In some embodiments, parameters stored by the imaging device configuration are updated in response to control instructions (e.g., generated by processor (s) 116 and/or received by the movable object 102 from the control unit 104 and/or the computing device 126) . In some embodiments, parameters stored by the imaging device configuration are updated in response to information received from the movable object sensing system 122 and/or the imaging device 214.
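As a purely illustrative example of how an imaging device configuration of this kind might be stored and updated in response to control instructions, consider the following Python sketch; the keys, default values, and the helper name apply_config_update are hypothetical rather than taken from the disclosure.

imaging_device_config = {
    "resolution": (1920, 1080),   # image width x height in pixels
    "aspect_ratio": 16 / 9,
    "zoom_level": 1.0,
    "autofocus": True,
    "exposure_time_s": 1 / 60,
    "shutter_speed_s": 1 / 500,
    "white_balance": "auto",
}

def apply_config_update(config, update):
    """Apply a partial update (e.g., derived from a received control
    instruction) to the stored configuration, ignoring unknown keys."""
    for key, value in update.items():
        if key in config:
            config[key] = value
    return config

# Example: a control instruction that adjusts the zoom level and shutter speed.
apply_config_update(imaging_device_config, {"zoom_level": 2.0, "shutter_speed_s": 1 / 1000})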
In some embodiments, the carrier 108 is coupled to the movable object 102 and a payload 110 is coupled to the carrier 108. In some embodiments, the carrier 108 includes one or more mechanisms that enable the payload 110 to move relative to the movable object 102, as described further with respect to Figure 2B. In some embodiments, the payload 110 is rigidly coupled to the movable object 102 such that the payload 110 remains substantially  stationary relative to the movable object 102. For example, the carrier 108 is coupled to the payload 110 such that the payload 110 is not movable relative to the movable object 102. In some embodiments, the payload 110 is coupled to the movable object 102 without requiring the use of the carrier 108.
As further depicted in Figure 2A, the movable object 102 also includes the communication system 120, which enables communication between the movable object 102 and the control unit 104, the computing device 126 (e.g., via wireless signals 124) , and/or the electronic device 810 (Figure 8) . In some embodiments, the communication system 120 includes transmitters, receivers, and/or transceivers for wireless communication. In some embodiments, the communication is a one-way communication, such that data is transmitted only from the movable object 102 to the control unit 104, or vice-versa. In some embodiments, communication is a two-way communication, such that data is transmitted from the movable object 102 to the control unit 104, as well as from the control unit 104 to the movable object 102.
In some embodiments, the movable object 102 communicates with the computing device 126. In some embodiments, the movable object 102, the control unit 104, and/or the computing device 126 are connected to the Internet or other telecommunications network, e.g., such that data generated by the movable object 102, the control unit 104, and/or the computing device 126 is transmitted to a server for data storage and/or data retrieval (e.g., for display by a website) . In some embodiments, data generated by the movable object 102, the control unit 104, and/or the computing device 126 is stored locally on each of the respective devices.
In some embodiments, the movable object 102 comprises a sensing system (e.g., the movable object sensing system 122) that includes one or more sensors, as described further with reference to Figure 3. In some embodiments, the movable object 102 and/or the control unit 104 use sensing data generated by sensors of sensing system 122 to determine information such as a position of the movable object 102, an orientation of the movable object 102, movement characteristics of the movable object 102 (e.g., an angular velocity, an angular acceleration, a translational velocity, a translational acceleration and/or a direction of motion along one or more axes) , a distance between the movable object 102 and a target object, proximity (e.g., distance) of the movable object 102 to potential obstacles, weather conditions, locations of geographical features and/or locations of manmade structures.
In some embodiments, the movable object 102 comprises radio (s) 154. The radio (s) 154 enable connection to one or more communication networks, and allow the movable object 102 to communicate with other devices (e.g., electronic device 810, Figure 8) . In some embodiments, the radio (s) 154 are capable of data communications using any of a variety of custom or standard wireless protocols (e.g., IEEE 802.15.4, Wi-Fi, ZigBee, 6LoWPAN, Thread, Z-Wave, Bluetooth Smart, ISA100.5A, WirelessHART, MiWi, Ultrawide Band (UWB) , software defined radio (SDR) , etc. ) , custom or standard wired protocols (e.g., Ethernet, HomePlug, etc. ) , and/or any other suitable communication protocol, including communication protocols not yet developed as of the filing date of this document.
In some embodiments, the movable object 102 includes a clock 152. In some embodiments, the clock 152 synchronizes (e.g., coordinates) time with a clock 922 of an electronic device 810 (Figure 9) .
Figure 2B illustrates an exemplary carrier 108 according to some embodiments. In some embodiments, the carrier 108 couples the payload 110 to the movable object 102.
In some embodiments, the carrier 108 includes a frame assembly having one or more frame members 202. In some embodiments, the frame member (s) 202 are coupled with the movable object 102 and the payload 110. In some embodiments, the frame member (s) 202 support the payload 110.
In some embodiments, the carrier 108 includes one or more mechanisms, such as one or more actuators 204, to cause movement of the carrier 108 and/or the payload 110. In some embodiments, the actuator 204 is, e.g., a motor, such as a hydraulic, pneumatic, electric, thermal, magnetic, and/or mechanical motor. In some embodiments, the actuator 204 causes movement of the frame member (s) 202. In some embodiments, the actuator 204 rotates the payload 110 with respect to one or more axes, such as one or more of: an X axis ( “pitch axis” ) , a Z axis ( “roll axis” ) , and a Y axis ( “yaw axis” ) , relative to the movable object 102. In  some embodiments, the actuator 204 translates the payload 110 along one or more axes relative to the movable object 102.
In some embodiments, the carrier 108 includes a carrier sensing system 206 for determining a state of the carrier 108 or the payload 110. The carrier sensing system 206 includes one or more of: motion sensors (e.g., accelerometers) , rotation sensors (e.g., gyroscopes) , potentiometers, and/or inertial sensors. In some embodiments, the carrier sensing system 206 includes one or more sensors of the movable object sensing system 122 as described below with respect to Figure 3. Sensor data determined by the carrier sensing system 206 may include spatial disposition (e.g., position, orientation, or attitude) , movement information such as velocity (e.g., linear or angular velocity) and/or acceleration (e.g., linear or angular acceleration) of the carrier 108 and/or the payload 110. In some embodiments, the sensing data as well as state information calculated from the sensing data are used as feedback data to control the movement of one or more components (e.g., the frame member (s) 202, the actuator 204, and/or the damping element 208) of the carrier 108. In some embodiments, the carrier sensing system 206 is coupled to the frame member (s) 202, the actuator 204, the damping element 208, and/or the payload 110. In some instances, a sensor in the carrier sensing system 206 (e.g., a potentiometer) may measure movement of the actuator 204 (e.g., the relative positions of a motor rotor and a motor stator) and generate a position signal representative of the movement of the actuator 204 (e.g., a position signal representative of relative positions of the motor rotor and the motor stator) . In some embodiments, data generated by the sensors is received by processor (s) 116 and/or memory 118 of the movable object 102.
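As one way to picture how sensing data from the carrier sensing system 206 could serve as feedback for controlling the actuator 204, the sketch below shows a single proportional-derivative control step. This is an editorial illustration only; the function name, gain values, and saturation range are assumptions rather than the disclosed control law.

def gimbal_feedback_step(target_angle_deg, measured_angle_deg, measured_rate_dps,
                         kp=4.0, kd=0.6):
    """One step of a simple proportional-derivative feedback law.

    measured_angle_deg could come from a potentiometer reading the relative
    positions of the motor rotor and stator, and measured_rate_dps from a
    gyroscope of the carrier sensing system. Returns a normalized actuator
    command, saturated to the assumed range -1..1.
    """
    error = target_angle_deg - measured_angle_deg
    command = kp * error - kd * measured_rate_dps
    return max(-1.0, min(1.0, command))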
In some embodiments, the coupling between the carrier 108 and the movable object 102 includes one or more damping elements 208. The damping element (s) 208 are configured to reduce or eliminate movement of the load (e.g., the payload 110 and/or the carrier 108) caused by movement of the movable object 102. The damping element (s) 208 may include active damping elements, passive damping elements, and/or hybrid damping elements having both active and passive damping characteristics. The motion damped by the damping element (s) 208 may include vibrations, oscillations, shaking, and/or impacts. Such motions  may originate from motions of the movable object 102, which are transmitted to the payload 110. For example, the motion may include vibrations caused by the operation of a propulsion system and/or other components of the movable object 102.
In some embodiments, the damping element (s) 208 provide motion damping by isolating the payload 110 from the source of unwanted motion, by dissipating or reducing the amount of motion transmitted to the payload 110 (e.g., vibration isolation) . In some embodiments, the damping element (s) 208 reduce a magnitude (e.g., an amplitude) of the motion that would otherwise be experienced by the payload 110. In some embodiments, the motion damping applied by the damping element (s) 208 is used to stabilize the payload 110, thereby improving the quality of video and/or images captured by the payload 110 (e.g., using the imaging device 214, Figure 2C) . In some embodiments, the improved video and/or image quality reduces the computational complexity of processing steps required to generate an edited video based on the captured video, or to generate a panoramic image based on the captured images.
In some embodiments, the damping element (s) 208 may be manufactured using any suitable material or combination of materials, including solid, liquid, or gaseous materials. The materials used for the damping element (s) 208 may be compressible and/or deformable. In one example, the damping element (s) 208 may be made of sponge, foam, rubber, gel, and the like. In another example, the damping element (s) 208 may include rubber balls that are substantially spherical in shape. In other instances, the damping element (s) 208 may be substantially spherical, rectangular, and/or cylindrical in shape. In some embodiments, the damping element (s) 208 may include piezoelectric materials or shape memory materials. In some embodiments, the damping element (s) 208 may include one or more mechanical elements, such as springs, pistons, hydraulics, pneumatics, dashpots, shock absorbers, and/or isolators. In some embodiments, properties of the damping element (s) 208 are selected so as to provide a predetermined amount of motion damping. In some instances, the damping element (s) 208 have viscoelastic properties. The properties of damping element (s) 208 may be isotropic or anisotropic. In some embodiments, the damping element (s) 208 provide motion damping equally along all directions of motion. In some embodiments, the damping  element (s) 208 provide motion damping only along a subset of the directions of motion (e.g., along a single direction of motion) . For example, the damping element (s) 208 may provide damping primarily along the Y (yaw) axis. In this manner, the illustrated damping element (s) 208 reduce vertical motions.
In some embodiments, the carrier 108 further includes a controller 210. The controller 210 may include one or more controllers and/or processors. In some embodiments, the controller 210 receives instructions from the processor (s) 116 of the movable object 102. For example, the controller 210 may be connected to the processor (s) 116 via the control bus 112. In some embodiments, the controller 210 may control movement of the actuator 204, adjust one or more parameters of the carrier sensing system 206, receive data from carrier sensing system 206, and/or transmit data to the processor (s) 116.
Figure 2C illustrates an exemplary payload 110 according to some embodiments. In some embodiments, the payload 110 includes a payload sensing system 212 and a controller 218. The payload sensing system 212 may include an imaging device 214 (e.g., a camera) having an image sensor 216 with a field of view. In some embodiments, the payload sensing system 212 includes one or more sensors of the movable object sensing system 122, as described below with respect to Figure 3.
The payload sensing system 212 generates static sensing data (e.g., a single image captured in response to a received instruction) and/or dynamic sensing data (e.g., a series of images captured at a periodic rate, such as a video) .
The image sensor 216 is, e.g., a sensor that detects light, such as visible light, infrared light, and/or ultraviolet light. In some embodiments, the image sensor 216 includes, e.g., a semiconductor charge-coupled device (CCD) , active pixel sensors using complementary metal–oxide–semiconductor (CMOS) or N-type metal-oxide-semiconductor (NMOS, Live MOS) technologies, or any other types of sensors. The image sensor 216 and/or imaging device 214 captures images or image streams (e.g., videos) . Adjustable parameters of imaging device 214 include, e.g., width, height, aspect ratio, pixel count, resolution, quality, imaging mode, focus distance, depth of field, exposure time, shutter speed and/or lens configuration. In some embodiments, the imaging device 214 may be configured to capture videos and/or images at different resolutions (e.g., low, medium, high, or ultra-high resolutions, and/or high-definition or ultra-high-definition videos such as 720p, 1080i, 1080p, 1440p, 2000p, 2160p, 2540p, 4000p, and 4320p) .
In some embodiments, the payload 110 includes the controller 218. The controller 218 may include one or more controllers and/or processors. In some embodiments, the controller 218 receives instructions from the processor (s) 116 of the movable object 102. For example, the controller 218 is connected to the processor (s) 116 via the control bus 112. In some embodiments, the controller 218 may adjust one or more parameters of one or more sensors of the payload sensing system 212, receive data from one or more sensors of payload sensing system 212, and/or transmit data, such as image data from the image sensor 216, to the processor (s) 116, the memory 118, and/or the control unit 104.
In some embodiments, data generated by one or more sensors of the payload sensing system 212 is stored, e.g., by the memory 118. In some embodiments, data generated by the payload sensing system 212 is transmitted to the control unit 104 (e.g., via communication system 120) . For example, video is streamed from the payload 110 (e.g., the imaging device 214) to the control unit 104. In this manner, the control unit 104 displays, e.g., real-time (or slightly delayed) video received from the imaging device 214.
In some embodiments, an adjustment of the orientation, position, altitude, and/or one or more movement characteristics of the movable object 102, the carrier 108, and/or the payload 110 is generated (e.g., by the processor (s) 116) based at least in part on configurations (e.g., preset and/or user configured in system configuration 400, Figure 4) of the movable object 102, the carrier 108, and/or the payload 110. For example, an adjustment that involves a rotation with respect to two axes (e.g., yaw and pitch) is achieved solely by corresponding rotation of the movable object 102 around the two axes if the payload 110 including the imaging device 214 is rigidly coupled to the movable object 102 (and hence not movable relative to the movable object 102) and/or the payload 110 is coupled to the movable object 102 via a carrier 108 that does not permit relative movement between the imaging device 214 and the movable object 102. The same two-axis adjustment may be achieved by, e.g., combining adjustments of both the movable object 102 and the carrier 108 if the carrier 108 permits the imaging device 214 to rotate around at least one axis relative to the movable object 102. In this case, the carrier 108 can be controlled to implement the rotation around one or both of the two axes required for the adjustment, and the movable object 102 can be controlled to implement the rotation around whichever of the two axes remains. In some embodiments, the carrier 108 may include a one-axis gimbal that allows the imaging device 214 to rotate around one of the two axes required for adjustment while the rotation around the remaining axis is achieved by the movable object 102. In some embodiments, the same two-axis adjustment is achieved by the carrier 108 alone when the carrier 108 permits the imaging device 214 to rotate around two or more axes relative to the movable object 102. In some embodiments, the carrier 108 may include a two-axis or three-axis gimbal that enables the imaging device 214 to rotate around two or all three axes.
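The allocation described above can be summarized with a small sketch that splits a required rotation between the carrier and the movable object according to the axes the carrier can actuate. This is an illustrative simplification, not the disclosed algorithm; the function name and data shapes are assumptions.

def allocate_adjustment(required_rotation, carrier_axes):
    """Split a required rotation between the carrier and the movable object.

    required_rotation maps axis names ('yaw', 'pitch') to rotations in degrees;
    carrier_axes is the set of axes the carrier (e.g., a one-, two-, or
    three-axis gimbal) can actuate. Axes the carrier supports are assigned to
    it; the remainder is achieved by rotating the movable object itself.
    """
    carrier_cmd = {axis: angle for axis, angle in required_rotation.items() if axis in carrier_axes}
    vehicle_cmd = {axis: angle for axis, angle in required_rotation.items() if axis not in carrier_axes}
    return carrier_cmd, vehicle_cmd

# Example: a one-axis (pitch) gimbal handles pitch, while yaw falls to the movable object.
carrier_cmd, vehicle_cmd = allocate_adjustment({"yaw": 15.0, "pitch": -5.0}, {"pitch"})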
Figure 3 illustrates an exemplary sensing system 122 of a movable object 102 according to some embodiments. In some embodiments, one or more sensors of the movable object sensing system 122 are mounted to an exterior of, located within, or otherwise coupled to the movable object 102. In some embodiments, one or more sensors of the movable object sensing system 122 are components of the carrier sensing system 206 and/or the payload sensing system 212. Where sensing operations are described as being performed by the movable object sensing system 122 herein, it will be recognized that such operations are optionally performed by the carrier sensing system 206 and/or the payload sensing system 212.
In some embodiments, the movable object sensing system 122 generates static sensing data (e.g., a single image captured in response to a received instruction) and/or dynamic sensing data (e.g., a series of images captured at a periodic rate, such as a video) .
In some embodiments, the movable object sensing system 122 includes one or more image sensors 302, such as image sensor 308 (e.g., a left stereographic image sensor) and/or image sensor 310 (e.g., a right stereographic image sensor) . The image sensors 302 capture, e.g., images, image streams (e.g., videos) , stereographic images, and/or stereographic image streams (e.g., stereographic videos) . The image sensors 302 detect light, such as visible light, infrared light, and/or ultraviolet light. In some embodiments, the movable object sensing system 122 includes one or more optical devices (e.g., lenses) to focus or otherwise  alter the light onto the one or more image sensors 302. In some embodiments, the image sensors 302 include, e.g., semiconductor charge-coupled devices (CCD) , active pixel sensors using complementary metal–oxide–semiconductor (CMOS) or N-type metal-oxide-semiconductor (NMOS, Live MOS) technologies, or any other types of sensors.
In some embodiments, the movable object sensing system 122 includes one or more audio transducers 304. The audio transducers 304 may include an audio output transducer 312 (e.g., a speaker) , and an audio input transducer 314 (e.g. a microphone, such as a parabolic microphone) . In some embodiments, the audio output transducer 312 and the audio input transducer 314 are used as components of a sonar system for tracking a target object (e.g., detecting location information of a target object) .
In some embodiments, the movable object sensing system 122 includes one or more infrared sensors 306. In some embodiments, a distance measurement system includes a pair of infrared sensors, e.g., infrared sensor 316 (such as a left infrared sensor) and infrared sensor 318 (such as a right infrared sensor) , or another sensor or sensor pair. The distance measurement system is used for measuring a distance between the movable object 102 and the target object 106.
In some embodiments, the movable object sensing system 122 may include other sensors for sensing a distance between the movable object 102 and the target object 106, such as a Radio Detection and Ranging (RADAR) sensor, a Light Detection and Ranging (LiDAR) sensor, or any other distance sensor.
In some embodiments, a system to produce a depth map includes one or more sensors or sensor pairs of the movable object sensing system 122 (such as a left stereographic image sensor 308 and a right stereographic image sensor 310; an audio output transducer 312 and an audio input transducer 314; and/or a left infrared sensor 316 and a right infrared sensor 318) . In some embodiments, a pair of sensors in a stereo data system (e.g., a stereographic imaging system) simultaneously captures data from different positions. In some embodiments, a depth map is generated by a stereo data system using the simultaneously captured data. In some embodiments, a depth map is used for positioning and/or detection operations, such as detecting a target object 106, and/or detecting current location information of a target object 106.
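For a rectified stereo pair of the kind described above, the standard relationship between disparity and depth is Z = f * B / d, where f is the focal length in pixels, B the baseline between the two sensors, and d the measured disparity. The sketch below illustrates that relationship; the numeric values are placeholders and are not taken from the disclosure.

def depth_from_disparity(disparity_px, focal_length_px, baseline_m):
    """Estimate depth in meters from stereo disparity (Z = f * B / d)."""
    if disparity_px <= 0:
        return float("inf")   # no measurable disparity: point is effectively at infinity
    return focal_length_px * baseline_m / disparity_px

# Example: a 12-pixel disparity with a 700-pixel focal length and an 8 cm
# baseline corresponds to a depth of roughly 4.7 meters.
distance_m = depth_from_disparity(12.0, 700.0, 0.08)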
In some embodiments, the movable object sensing system 122 includes one or more global positioning system (GPS) sensors, motion sensors (e.g., accelerometers) , rotation sensors (e.g., gyroscopes) , inertial sensors, proximity sensors (e.g., infrared sensors) and/or weather sensors (e.g., pressure sensor, temperature sensor, moisture sensor, and/or wind sensor) .
In some embodiments, sensing data generated by one or more sensors of the movable object sensing system 122 and/or information determined using sensing data from one or more sensors of the movable object sensing system 122 are transmitted to the control unit 104 (e.g., via the communication system 120) . In some embodiments, data generated by one or more sensors of the movable object sensing system 122 and/or information determined using sensing data from one or more sensors of the movable object sensing system 122 is stored by the memory 118.
Figures 4A and 4B illustrate a block diagram of an exemplary memory 118 of a movable object 102 according to some embodiments. In some embodiments, one or more elements illustrated in Figures 4A and/or 4B may be located in the control unit 104, the computing device 126, and/or another device.
In some embodiments, the memory 118 stores a system configuration 400. The system configuration 400 includes one or more system settings (e.g., as configured by a manufacturer, administrator, and/or user of the movable object 102) . For example, a constraint on one or more of orientation, position, attitude, and/or one or more movement characteristics of the movable object 102, the carrier 108, and/or the payload 110 is stored as a system setting of the system configuration 400.
In some embodiments, the memory 118 stores a radio communication module 401. The radio communication module 401 connects to and communicates with other network devices (e.g., a local network, such as a router that provides Internet connectivity, networked storage devices, network routing devices, electronic device 810 etc. ) that are coupled to one  or more communication networks (e.g., communication network (s) 810, Figure 8) via the communication system 120 (wired or wireless) .
In some embodiments, the memory 118 stores a motion control module 402. The motion control module 402 stores control instructions that are received from the control unit 104 and/or the computing device 126. The control instructions are used for controlling operation of the movement mechanisms 114, the carrier 108, and/or the payload 110.
In some embodiments, the memory 118 stores a tracking module 404. In some embodiments, the tracking module 404 generates tracking information for a target object 106 that is being tracked by the movable object 102. In some embodiments, the tracking information is generated based on images captured by the imaging device 214 and/or based on output from the video analysis module 406 (e.g., after pre-processing and/or processing operations have been performed on one or more images) and/or based on input from a user. Alternatively or in combination, the tracking information may be generated based on analysis of gestures of a human target, which are captured by the imaging device 214 and/or analyzed by a gesture analysis module 403. The tracking information generated by the tracking module 404 may include a location, a size, and/or other characteristics of the target object 106 within one or more images. In some embodiments, the tracking information generated by the tracking module 404 is transmitted to the control unit 104 and/or the computing device 126 (e.g., augmenting or otherwise combined with images and/or output from the video analysis module 406) . For example, the tracking information may be transmitted to the control unit 104 in response to a request from the control unit 104 and/or on a periodic basis (e.g., every 2 seconds, 5 seconds, 10 seconds, or 30 seconds) .
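To give a concrete, if simplified, picture of the tracking information the tracking module 404 might produce and how it might be sent on a periodic basis, consider the following sketch; the record fields and the 2-second default period are illustrative assumptions, not the disclosed data format.

from dataclasses import dataclass

@dataclass
class TrackingInfo:
    frame_id: int
    center_px: tuple    # (x, y) pixel coordinates of the target within the image
    size_px: tuple      # (width, height) of the target's bounding box in pixels
    confidence: float   # 0..1 score from the analysis that produced the box

def should_transmit(frame_time_s, last_sent_s, period_s=2.0):
    """Return True when tracking information should be sent to the control
    unit, e.g., on a periodic basis such as every 2 seconds."""
    return frame_time_s - last_sent_s >= period_s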
In some embodiments, the memory 118 includes a video analysis module 406. The video analysis module 406 performs processing operations on videos and images, such as videos and images captured by the imaging device 214. In some embodiments, the video analysis module 406 performs pre-processing on raw video and/or image data, such as re-sampling to assure the correctness of the image coordinate system, noise reduction, contrast enhancement, and/or scale space representation. In some embodiments, the processing operations performed on video and image data (including data of videos and/or  images that has been pre-processed) include feature extraction, image segmentation, data verification, image recognition, image registration, and/or image matching. In some embodiments, the output from the video analysis module 406 (e.g., after the pre-processing and/or processing operations have been performed) is transmitted to the control unit 104 and/or the computing device 126. In some embodiments, feature extraction is performed by the control unit 104, the processor (s) 116 of the movable object 102, and/or the computing device 126. In some embodiments, the video analysis module 406 may use neural networks to perform image recognition and/or classify object (s) that are included in the videos and/or images. For example, the video analysis module 406 may extract frames that include the target object 106, analyze features of the target object 106, and compare the features with characteristics of one or more predetermined recognizable target object types, thereby enabling the target object 106 to be recognized at a certain confidence level.
In some embodiments, the memory 118 includes a gesture analysis module 403. The gesture analysis module 403 processes gestures of one or more human targets. In some embodiments, the gestures may be captured by the imaging device 214. In some embodiments, after processing the gestures, the gesture analysis results may be fed into the tracking module 404 and/or the motion control module 402 to generate, respectively, tracking information and/or control instructions that are used for controlling operations of the movement mechanisms 114, the carrier 108, and/or the payload 110 of the movable object 102.
In some embodiments, a calibration process may be performed before using gestures of a human target to control the movable object 102. For example, during the calibration process, the gesture analysis module 403 may capture certain features of human gestures associated with a certain control command and store the gesture features in the memory 118. When a human gesture is received, the gesture analysis module 403 may extract features of the human gesture and compare these features with the stored features to determine whether the user has issued the corresponding command. In some embodiments, the correlations between gestures and control commands associated with a certain human target may or may not be different from such correlations associated with another human target.
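A minimal way to picture the calibration-then-matching flow described above is to store a feature vector per command during calibration and later compare an observed gesture against those stored vectors. The sketch below uses Euclidean distance and made-up feature values purely for illustration; it is not asserted to be the disclosed gesture analysis method.

import math

# Calibration result: stored gesture features per control command (hypothetical values).
calibrated_gestures = {
    "start_tracking": [0.9, 0.1, 0.4],
    "stop_tracking":  [0.1, 0.8, 0.3],
}

def match_gesture(observed_features, threshold=0.35):
    """Return the best-matching command for the observed gesture features,
    or None if no stored gesture is close enough."""
    best_command, best_distance = None, float("inf")
    for command, stored in calibrated_gestures.items():
        distance = math.dist(observed_features, stored)
        if distance < best_distance:
            best_command, best_distance = command, distance
    return best_command if best_distance <= threshold else None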
In some embodiments, the memory 118 includes a spatial relationship determination module 405. The spatial relationship determination module 405 calculates one or more spatial relationships between the target object 106 and the movable object 102, such as a horizontal distance between the target object 106 and the movable object 102, and/or a pitch angle between the target object 106 and the movable object 102.
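The two spatial relationships named above, a horizontal distance and a pitch angle, follow directly from the relative positions of the movable object and the target. A short illustrative sketch, assuming positions expressed as (x, y, z) coordinates in meters in a shared local frame (the coordinate conventions are assumptions):

import math

def spatial_relationship(movable_object_pos, target_pos):
    """Return the horizontal distance (meters) and the pitch angle (degrees)
    from the movable object to the target object."""
    dx = target_pos[0] - movable_object_pos[0]
    dy = target_pos[1] - movable_object_pos[1]
    dz = target_pos[2] - movable_object_pos[2]
    horizontal_distance = math.hypot(dx, dy)
    pitch_angle_deg = math.degrees(math.atan2(dz, horizontal_distance))
    return horizontal_distance, pitch_angle_deg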
In some embodiments, the memory 118 includes a signal processing module 407. The signal processing module 407 processes signals (e.g., wireless signals) that are received by the movable object 102 (e.g., from an electronic device 810 of Figure 8, from the control unit 104, etc. ) . In some embodiments, the movable object 102 uses the signals to determine position (e.g., positional coordinates) of the target object 106. In some embodiments, the signals may include direction (s) , pattern (s) , wavelength (s) (e.g., color) , temporal frequencies, times, and/or intensities of illumination. In some embodiments, the signals may include a position of the electronic device 810. In some embodiments, the signals may include one or more parameters from the electronic device 810 to control a flight of the movable object 102. In some embodiments, the one or more parameters include a pitch, roll, and/or yaw value (s) for the movable object 102 from sensors 920, and/or adjusted pitch, roll, and/or yaw value (s) that are based on a combination of the sensors 920. In some embodiments, the one or more parameters include raw data from the sensors 920 of the electronic device 810. In some embodiments, the signal processing module 407 processes the raw sensor data to determine a flight of the movable object 102.
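As an illustration of how raw sensor data from an electronic device might be combined into adjusted pitch and roll values, the sketch below blends an integrated gyroscope rate with an accelerometer-derived angle (a complementary filter). This is a common technique offered only as an example; it is not asserted to be the processing performed by the signal processing module 407, and the blending factor is an assumption.

import math

def accel_pitch_roll(ax, ay, az):
    """Derive pitch and roll (degrees) from a 3-axis accelerometer reading,
    assuming gravity dominates the measurement."""
    pitch = math.degrees(math.atan2(-ax, math.hypot(ay, az)))
    roll = math.degrees(math.atan2(ay, az))
    return pitch, roll

def complementary_filter(prev_angle_deg, gyro_rate_dps, accel_angle_deg, dt_s, alpha=0.98):
    """Blend short-term gyroscope integration with the long-term stable
    accelerometer angle to produce an adjusted angle estimate."""
    return alpha * (prev_angle_deg + gyro_rate_dps * dt_s) + (1.0 - alpha) * accel_angle_deg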
In some embodiments, the memory 118 stores target information 408. In some embodiments, the target information 408 is received by the movable object 102 (e.g., via communication system 120) from the control unit 104, the computing device 126, the target object 106, and/or another movable object.
In some embodiments, the target information 408 includes a time value (e.g., a time duration) and/or an expiration time indicating a period of time during which the target object 106 is to be tracked. In some embodiments, the target information 408 includes a flag (e.g.,  a label) indicating whether a target information entry includes specific tracked target information 412 and/or target type information 410.
In some embodiments, the target information 408 includes target type information 410 such as color, texture, pattern, size, shape, and/or dimension. In some embodiments, the target type information 410 includes, but is not limited to, a predetermined recognizable object type and a general object type as identified by the video analysis module 406. In some embodiments, the target type information 410 includes features or characteristics for each type of target and is preset and stored in the memory 118. In some embodiments, the target type information 410 is provided to a user input device (e.g., the control unit 104) via user input. In some embodiments, the user may select a pre-existing target pattern or type (e.g., an object or a round object with a radius greater than or less than a certain value) .
In some embodiments, the target information 408 includes tracked target information 412 for a specific target object 106 being tracked. The tracked target information 412 may be identified by the video analysis module 406 by analyzing the target in a captured image. The tracked target information 412 includes, e.g., an image of the target object 106, an initial position (e.g., location coordinates, such as pixel coordinates within an image) of the target object 106, and/or a size of the target object 106 within one or more images (e.g., images captured by the imaging device 214) . A size of the target object 106 is stored, e.g., as a length (e.g., mm or other length unit) , an area (e.g., mm² or other area unit) , a number of pixels in a line (e.g., indicating a length, width, and/or diameter) , a ratio of a length of a representation of the target in an image relative to a total image length (e.g., a percentage) , a ratio of an area of a representation of the target in an image relative to a total image area (e.g., a percentage) , a number of pixels indicating an area of the target object 106, and/or a corresponding spatial relationship (e.g., a vertical distance and/or a horizontal distance) between the target object 106 and the movable object 102 (e.g., an area of the target object 106 changes based on a distance of the target object 106 from the movable object 102) .
In some embodiments, one or more features (e.g., characteristics) of the target object 106 are determined from an image of the target object 106 (e.g., using image analysis techniques on images captured by the imaging device 214) . For example, one or more features of the target object 106 are determined from an orientation and/or part or all of identified boundaries of the target object 106. In some embodiments, the tracked target information 412 includes pixel coordinates and/or a number of pixel counts to indicate, e.g., a size parameter, position, and/or shape of the target object 106. In some embodiments, one or more features of the tracked target information 412 are to be maintained as the movable object 102 tracks the target object 106 (e.g., the tracked target information 412 is to be maintained as images of the target object 106 are captured by the imaging device 214) . In some embodiments, the tracked target information 412 is used to adjust the movable object 102, the carrier 108, and/or the imaging device 214, such that specific features of the target object 106 are substantially maintained. In some embodiments, the tracked target information 412 is determined based on one or more of the target types 410.
In some embodiments, the memory 118 also includes predetermined recognizable target type information 414. The predetermined recognizable target type information 414 specifies one or more characteristics of certain predetermined recognizable target types (e.g., target type 1, target type 2, …, target type n) . Each predetermined recognizable target type may include one or more characteristics such as a size parameter (e.g., area, diameter, height, length and/or width) , position (e.g., relative to an image center and/or image boundary) , movement (e.g., speed, acceleration, altitude) and/or shape. For example, target type 1 may be a human target. One or more characteristics associated with a human target may include a height in a range from about 1.4 meters to about 2 meters, a pattern comprising a head, shoulders, a torso, joints and/or limbs, and/or a moving speed having a range from about 2 kilometers/hour to about 25 kilometers/hour. In another example, target type 2 may be a car target. One or more characteristics associated with a car target may include a height in a range from about 1.4 meters to about 4.5 meters, a length having a range from about 3 meters to about 10 meters, a moving speed having a range from about 5 kilometers/hour to about 140 kilometers/hour, and/or a pattern of a sedan, an SUV, a truck, or a bus. In yet another example, target type 3 may be a ship target. Other types of predetermined recognizable target objects may also include: an airplane target, an animal target, other moving targets, and stationary (e.g., non-moving) targets such as a building and a statue. Each predetermined target type may further include one or more subtypes, each of the subtypes having more specific characteristics, thereby providing more accurate target classification results.
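A compact way to picture the predetermined recognizable target type information 414 is as a lookup of characteristic ranges per type, against which measured features can be checked. The sketch below encodes the human and car examples given above; the dictionary layout and the simple range check are editorial assumptions, not the disclosed classification method.

predetermined_target_types = {
    "human": {
        "height_m": (1.4, 2.0),
        "speed_kmh": (2.0, 25.0),
        "pattern": ["head", "shoulders", "torso", "joints", "limbs"],
    },
    "car": {
        "height_m": (1.4, 4.5),
        "length_m": (3.0, 10.0),
        "speed_kmh": (5.0, 140.0),
        "pattern": ["sedan", "SUV", "truck", "bus"],
    },
}

def classify_target(height_m, speed_kmh):
    """Return the target types whose height and speed ranges contain the
    measured values; a real classifier would weigh many more features."""
    matches = []
    for name, spec in predetermined_target_types.items():
        h_lo, h_hi = spec["height_m"]
        s_lo, s_hi = spec["speed_kmh"]
        if h_lo <= height_m <= h_hi and s_lo <= speed_kmh <= s_hi:
            matches.append(name)
    return matches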
In some embodiments, the target information 408 (including, e.g., the target type information 410 and the tracked target information 412) , and/or the predetermined recognizable target type information 414 is generated based on user input, such as a user input received at the input device 506 (Figure 5) of the control unit 104. Additionally or alternatively, the target information 408 may be generated based on data from sources other than the control unit 104. For example, the target type information 410 may be based on previously stored images of the target object 106 (e.g., images captured by the imaging device 214 and stored by the memory 118) , other data stored by the memory 118, and/or data from data stores that are remote from the control unit 104 and/or the movable object 102. In some embodiments, the target information 408 is generated using a computer-generated image of the target object 106.
In some embodiments, the target information 408 is used by the movable object 102 (e.g., the tracking module 404) to track the target object 106. In some embodiments, the target information 408 is used by a video analysis module 406 to identify and/or classify the target object 106. In some cases, target identification involves image recognition and/or matching algorithms based on, e.g., CAD-like object models, appearance-based methods, feature-based methods, and/or genetic algorithms. In some embodiments, target identification includes comparing two or more images to determine, extract, and/or match features contained therein.
In some embodiments, the memory 118 also includes flight routes 416 (e.g., predefined flight routes) of the movable object 102, such as a portrait flight route 418 (e.g., when the target object 106 is a person) , a long range flight route 420, and a normal flight route 422. Each of the flight routes 416 includes one or more flight paths, each of the one or more paths having a corresponding trajectory mode. In some embodiments, the movable object 102 automatically selects one of the predefined flight routes according to a target type of the target object 106 and executes an autonomous flight according to the predefined flight route. In some embodiments, after automatically selecting a flight route 416 for the movable object  102, the movable object 102 further performs an automatic customization of the flight route taking into consideration factors such as a distance between the movable object 102 and the target object 106, presence of potential obstacle (s) and/or other structures (e.g., buildings and trees) , or weather conditions. In some embodiments, customization of the flight route includes modifying a rate of ascent of the movable object 102, an initial velocity of the movable object 102, and/or an acceleration of the movable object 102. In some embodiments, the customization is provided in part by a user. For example, depending on the target type and the distance, the movable object 102 may cause the computing device 126 to display a library of trajectories that can be selected by the user. The movable object 102 then automatically generates the paths of the flight route based on the user selections.
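The automatic selection and customization of a flight route 416 can be pictured, in greatly simplified form, as a lookup by target type followed by parameter adjustments based on context such as distance. The route names below mirror those listed above, but the mapping, thresholds, and velocity values are illustrative assumptions rather than the disclosed behavior.

route_by_target_type = {
    "human": "portrait_flight_route_418",
    "ship": "long_range_flight_route_420",
}

def select_flight_route(target_type, distance_to_target_m):
    """Pick a predefined flight route from the target type, then apply a
    simple distance-based customization of the initial velocity."""
    route = route_by_target_type.get(target_type, "normal_flight_route_422")
    initial_velocity_mps = 2.0 if distance_to_target_m < 10.0 else 5.0
    return {"route": route, "initial_velocity_mps": initial_velocity_mps}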
In some embodiments, the flight routes 416 also include user defined flight route (s) 424, which are routes that are defined and customized by the user. For example, in some embodiments, the user may define a flight route using the control unit 104 (e.g., by identifying two or more points of interests on a map that is displayed on the control unit 104) . The control unit 104 may transmit to the movable object 102 a user defined flight route 424 that includes the identified points of interest and/or positional coordinates of the identified points of interest.
In some embodiments, the memory 118 stores data 426 that are captured by the image sensor 216 during an autonomous flight, including video data 428 and image (s) 430. In some embodiments, the data 426 also includes audio data 432 that are captured by a microphone of the movable object 102 (e.g., the audio input transducer 314) . In some embodiments, the data 426 is simultaneously stored on the movable object 102 as it is being captured. In some embodiments, the memory 118 further stores metadata information with the data 426. For example, the video data 428 may include tag information (e.g., metadata) that identifies the flight path and trajectory mode corresponding to a respective segment of the video data 428.
In some embodiments, the data 426 further includes mapping data 434. The mapping data comprises mapping relationships between user movements (e.g., movements detected by the electronic device 810) and corresponding pitch, roll, and/or yaw values for the movable object 102.
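One simple form such a mapping relationship could take is a linear scaling from the tilt of the body-attached device to a pitch or roll value for the movable object, with a small dead zone so that unintentional motion is ignored. The sketch below is an editorial illustration; the dead zone, maximum angles, and function name are assumptions rather than values from the disclosure.

def map_user_tilt_to_command(user_angle_deg, dead_zone_deg=3.0,
                             max_user_angle_deg=45.0, max_command_deg=30.0):
    """Map a tilt of the body-attached device (degrees) to a pitch/roll
    command (degrees) for the movable object."""
    if abs(user_angle_deg) < dead_zone_deg:
        return 0.0
    sign = 1.0 if user_angle_deg > 0 else -1.0
    usable = min(abs(user_angle_deg), max_user_angle_deg) - dead_zone_deg
    scale = max_command_deg / (max_user_angle_deg - dead_zone_deg)
    return sign * usable * scale

# Example: a 20-degree wrist tilt maps to a pitch command of roughly 12 degrees.
pitch_command_deg = map_user_tilt_to_command(20.0)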
The above identified modules or programs (i.e., sets of instructions) need not be implemented as separate software programs, procedures or modules, and thus various subsets of these modules may be combined or otherwise re-arranged in various embodiments. In some embodiments, the memory 118 may store a subset of the modules and data structures identified above. Furthermore, the memory 118 may store additional modules and data structures not described above. In some embodiments, the programs, modules, and data structures stored in the memory 118, or a non-transitory computer readable storage medium of the memory 118, provide instructions for implementing respective operations in the methods described below. In some embodiments, some or all of these modules may be implemented with specialized hardware circuits that subsume part or all of the module functionality. One or more of the above identified elements may be executed by the one or more processors 116 of the movable object 102. In some embodiments, one or more of the above identified elements is executed by one or more processors of a device remote from the movable object 102, such as the control unit 104 and/or the computing device 126.
Figure 5 illustrates an exemplary control unit 104 of the target identification and tracking system 100, in accordance with some embodiments. In some embodiments, the control unit 104 communicates with the movable object 102 via the communication system 120, e.g., to provide control instructions to the movable object 102. Although the control unit 104 is typically a portable (e.g., handheld) device, the control unit 104 need not be portable. In some embodiments, the control unit 104 is a dedicated control device (e.g., dedicated to operation of movable object 102) , a laptop computer, a desktop computer, a tablet computer, a gaming system, a wearable device (e.g., watches, glasses, gloves, and/or helmet) , a microphone, and/or a combination thereof.
The control unit 104 typically includes one or more processor (s) 502, a communication system 510 (e.g., including one or more network or other communications interfaces) , memory 504, one or more input/output (I/O) interfaces (e.g., an input device 506 and/or a display 508) , and one or more communication buses 512 for interconnecting these components.
In some embodiments, the input device 506 and/or the display 508 comprises a touchscreen display. The touchscreen display optionally uses LCD (liquid crystal display) technology, LPD (light emitting polymer display) technology, or LED (light emitting diode) technology, although other display technologies are used in other embodiments. The touchscreen display and the processor (s) 502 optionally detect contact and any movement or breaking thereof using any of a plurality of touch sensing technologies now known or later developed, including but not limited to capacitive, resistive, infrared, and surface acoustic wave technologies, as well as other proximity sensor arrays or other elements for determining one or more points of contact with the touchscreen display.
In some embodiments, the input device 506 includes one or more of: joysticks, switches, knobs, slide switches, buttons, dials, keypads, keyboards, mice, audio transducers (e.g., microphones for voice control systems) , motion sensors, and/or gesture controls. In some embodiments, an I/O interface of the control unit 104 includes sensors (e.g., GPS sensors, and/or accelerometers) , audio output transducers (e.g., speakers) , and/or one or more tactile output generators for generating tactile outputs.
In some embodiments, the input device 506 receives user input to control aspects of the movable object 102, the carrier 108, the payload 110, or a component thereof. Such aspects include, e.g., attitude (e.g., aviation) , position, orientation, velocity, acceleration, navigation, and/or tracking. For example, the input device 506 is manually set to one or more positions by a user. Each of the positions may correspond to a predetermined input for controlling the movable object 102. In some embodiments, the input device 506 is manipulated by a user to input control instructions for controlling the navigation of the movable object 102. In some embodiments, the input device 506 is used to input a flight mode for the movable object 102, such as auto pilot or navigation according to a predetermined navigation path.
In some embodiments, the input device 506 is used to input a target tracking mode for the movable object 102, such as a manual tracking mode or an automatic tracking mode. In some embodiments, the user controls the movable object 102, e.g., the position, attitude, and/or orientation of the movable object 102, by changing a position of the control unit 104 (e.g., by tilting or otherwise moving the control unit 104) . For example, a change in a position of the control unit 104 may be detected by one or more inertial sensors, and output of the one or more inertial sensors may be used to generate command data. In some embodiments, the input device 506 is used to adjust an operational parameter of the payload, such as a parameter of the payload sensing system 212 (e.g., to adjust a zoom parameter of the imaging device 214) and/or an attitude of the payload 110 relative to the carrier 108 and/or the movable object 102.
In some embodiments, the input device 506 is used to indicate information about the target object 106, e.g., to select a target object 106 to track and/or to indicate the target type information 410. In some embodiments, the input device 506 is used for interaction with augmented image data. For example, an image displayed by the display 508 includes representations of one or more target objects 106. In some embodiments, representations of the one or more target objects 106 are augmented to indicate identified objects for potential tracking and/or a target object 106 that is currently being tracked. Augmentation includes, for example, a graphical tracking indicator (e.g., a box) adjacent to or surrounding a respective target object 106. In some embodiments, the input device 506 is used to select a target object 106 to track or to change the target object being tracked. In some embodiments, a target object 106 is selected when an area corresponding to a representation of the target object 106 is selected by, e.g., a finger, stylus, mouse, joystick, or other component of the input device 506. In some embodiments, the specific target information 412 is generated when a user selects a target object 106 to track.
The control unit 104 may also be configured to allow a user to enter target information using any suitable method. In some embodiments, the input device 506 receives a selection of a target object 106 from one or more images (e.g., video or snapshot) displayed by the display 508. For example, the input device 506 receives input including a selection performed by a gesture around the target object 106 and/or a contact at a location corresponding to the target object 106 in an image. In some embodiments, computer vision or other techniques are used to determine a boundary of the target object 106. In some embodiments, input received at the input device 506 defines a boundary of the target object 106. In some embodiments, multiple targets are simultaneously selected. In some embodiments, a selected  target is displayed with a selection indicator (e.g., a bounding box) to indicate that the target is selected for tracking. In some other embodiments, the input device 506 receives input indicating information such as color, texture, shape, dimension, and/or other characteristics associated with a target object 106. For example, the input device 506 includes a keyboard to receive typed input indicating the target information 408.
In some embodiments, the control unit 104 provides an interface that enables a user to select (e.g., using the input device 506) between a manual tracking mode and an automatic tracking mode. When the manual tracking mode is selected, the interface enables the user to select a target object 106 to track. For example, a user is enabled to manually select a representation of a target object 106 from an image displayed by the display 508 of the control unit 104. Specific target information 412 associated with the selected target object 106 is transmitted to the movable object 102, e.g., as initial expected target information.
In some embodiments, when the automatic tracking mode is selected, the user does not provide input selecting a target object 106 to track. In some embodiments, the input device 506 receives target type information 410 from a user input. In some embodiments, the movable object 102 uses the target type information 410, e.g., to automatically identify the target object 106 to be tracked and/or to track the identified target object 106.
Typically, manual tracking requires more user control of the tracking of the target and less automated processing or computation (e.g., image or target recognition) by the processor (s) 116 of the movable object 102, while automatic tracking requires less user control of the tracking process but more computation performed by the processor (s) 116 of the movable object 102 (e.g., by the video analysis module 406) . In some embodiments, allocation of control over the tracking process between the user and the onboard processing system is adjusted, e.g., depending on factors such as the surroundings of the movable object 102, motion of the movable object 102, altitude of the movable object 102, the system configuration 400 (e.g., user preferences) , and/or available computing resources (e.g., CPU or memory) of the movable object 102, the control unit 104, and/or the computing device 126. For example, relatively more control is allocated to the user when the movable object 102 is navigating in a relatively complex environment (e.g., with numerous buildings or obstacles, or indoors) than when the movable object 102 is navigating in a relatively simple environment (e.g., a wide open space or outdoors) . As another example, more control is allocated to the user when the movable object 102 is at a lower altitude than when the movable object 102 is at a higher altitude. As a further example, more control is allocated to the movable object 102 if the movable object 102 is equipped with a high-speed processor adapted to perform complex computations relatively quickly. In some embodiments, the allocation of control over the tracking process between the user and the movable object 102 is dynamically adjusted based on one or more of the factors described herein.
In some embodiments, the control unit 104 includes an electronic device (e.g., a portable electronic device) and an input device 506 that is a peripheral device that is communicatively coupled (e.g., via a wireless and/or wired connection) and/or mechanically coupled to the electronic device. For example, the control unit 104 includes a portable electronic device (e.g., a cellphone or a smart phone) and a remote control device (e.g., a standard remote control with a joystick) coupled to the portable electronic device. In this example, an application executed by the cellphone generates control instructions based on input received at the remote control device.
In some embodiments, the display device 508 displays information about the movable object 102, the carrier 108, and/or the payload 110, such as position, attitude, orientation, movement characteristics of the movable object 102, and/or distance between the movable object 102 and another object (e.g., the target object 106 and/or an obstacle) . In some embodiments, information displayed by the display device 508 includes images captured by the imaging device 214, tracking data (e.g., a graphical tracking indicator applied to a representation of the target object 106, such as a box or other shape around the target object 106 shown to indicate that target object 106 is currently being tracked) , and/or indications of control data transmitted to the movable object 102. In some embodiments, the images including the representation of the target object 106 and the graphical tracking indicator are displayed in substantially real-time as the image data and tracking information are received from the movable object 102 and/or as the image data is acquired.
The communication system 510 enables communication with the communication system 120 of the movable object 102, the communication system 610 (Figure 6) of the computing device 126, and/or a base station (e.g., computing device 126) via a wired or wireless communication connection. In some embodiments, the communication system 510 transmits control instructions (e.g., navigation control instructions, target information, and/or tracking instructions) . In some embodiments, the communication system 510 receives data (e.g., tracking data from the payload imaging device 214, and/or data from movable object sensing system 122) . In some embodiments, the control unit 104 receives tracking data (e.g., via the wireless communications 124) from the movable object 102. Tracking data is used by the control unit 104 to, e.g., display the target object 106 as the target is being tracked. In some embodiments, data received by the control unit 104 includes raw data (e.g., raw sensing data as acquired by one or more sensors) and/or processed data (e.g., raw data as processed by, e.g., the tracking module 404) .
In some embodiments, the memory 504 stores instructions for generating control instructions automatically and/or based on input received via the input device 506. The control instructions may include control instructions for operating the movement mechanisms 114 of the movable object 102 (e.g., to adjust the position, attitude, orientation, and/or movement characteristics of the movable object 102, such as by providing control instructions to the actuators 132) . In some embodiments, the control instructions adjust movement of the movable object 102 with up to six degrees of freedom. In some embodiments, the control instructions are generated to initialize and/or maintain tracking of the target object 106. In some embodiments, the control instructions include instructions for adjusting the carrier 108 (e.g., instructions for adjusting the damping element 208, the actuator 204, and/or one or more sensors of the carrier sensing system 206) . In some embodiments, the control instructions include instructions for adjusting the payload 110 (e.g., instructions for adjusting one or more sensors of the payload sensing system 212) . In some embodiments, the control instructions include control instructions for adjusting the operations of one or more sensors of the movable object sensing system 122.
In some embodiments, the memory 504 also stores instructions for performing image recognition, target classification, spatial relationship determination, and/or gesture analysis that are similar to the corresponding functionalities discussed with reference to Figure 4. The memory 504 may also store target information, such as tracked target information and/or predetermined recognizable target type information, as discussed in Figure 4.
In some embodiments, the input device 506 receives user input to control one aspect of the movable object 102 (e.g., the zoom of the imaging device 214) while a control application generates the control instructions for adjusting another aspect of the movable object 102 (e.g., to control one or more movement characteristics of the movable object 102) . The control application includes, e.g., the control module 402, the tracking module 404, and/or a control application of the control unit 104 and/or the computing device 126. For example, the input device 506 receives user input to control one or more movement characteristics of the movable object 102 while the control application generates the control instructions for adjusting a parameter of the imaging device 214. In this manner, a user is enabled to focus on controlling the navigation of the movable object 102 without having to provide input for tracking the target (e.g., tracking is performed automatically by the control application) .
In some embodiments, allocation of tracking control between user input received at the input device 506 and the control application varies depending on factors such as, e.g., surroundings of the movable object 102, motion of the movable object 102, altitude of the movable object 102, system configuration (e.g., user preferences) , and/or available computing resources (e.g., CPU or memory) of the movable object 102, the control unit 104, and/or the computing device 126. For example, relatively more control is allocated to the user when the movable object 102 is navigating in a relatively complex environment (e.g., with numerous buildings or obstacles, or indoors) than when the movable object 102 is navigating in a relatively simple environment (e.g., a wide open space or outdoors) . As another example, more control is allocated to the user when the movable object 102 is at a lower altitude than when the movable object 102 is at a higher altitude. As a further example, more control is allocated to the movable object 102 if the movable object 102 is equipped with a high-speed processor adapted to perform complex computations relatively quickly. In some embodiments, the allocation of control over the tracking process between the user and the movable object 102 is dynamically adjusted based on one or more of the factors described herein.
Figure 6 illustrates an exemplary computing device 126 for controlling movable object 102 according to some embodiments. The computing device 126 may be a server computer, a laptop computer, a desktop computer, a tablet, or a phone. The computing device 126 typically includes one or more processor (s) 602 (e.g., processing units) , memory 604, a communication system 610 and one or more communication buses 612 for interconnecting these components. In some embodiments, the computing device 126 includes input/output (I/O) interfaces 606, such as a display 614 and/or an input device 616.
In some embodiments, the computing device 126 is a base station that communicates (e.g., wirelessly) with the movable object 102 and/or the control unit 104.
In some embodiments, the computing device 126 provides data storage, data retrieval, and/or data processing operations, e.g., to reduce the processing power and/or data storage requirements of the movable object 102 and/or the control unit 104. For example, the computing device 126 is communicatively connected to a database (e.g., via the communication system 610) and/or the computing device 126 includes a database (e.g., the database is connected to the communication bus 612) .
The communication system 610 includes one or more network or other communications interfaces. In some embodiments, the computing device 126 receives data from the movable object 102 (e.g., from one or more sensors of the movable object sensing system 122) and/or the control unit 104. In some embodiments, the computing device 126 transmits data to the movable object 102 and/or the control unit 104. For example, the computing device 126 provides control instructions to the movable object 102.
In some embodiments, the memory 604 stores instructions for performing image recognition, target classification, spatial relationship determination, and/or gesture analysis that are similar to the corresponding functionalities discussed with respect to Figure 4. The memory 604 may also store target information, such as the tracked target information 408 and/or the predetermined recognizable target type information 414 as discussed in Figure 4.
In some embodiments, the memory 604 or a non-transitory computer-readable storage medium of the memory 604 stores an application 620, which enables interactions with and control over the movable object 102, and which enables data (e.g., audio, video and/or image data) captured by the movable object 102 to be displayed, downloaded, and/or post-processed. The application 620 may include a user interface 630, which enables interactions between a user of the computing device 126 and the movable object 102. In some embodiments, the application 620 may include a video editing module 640, which enables a user of the computing device 126 to edit videos and/or images that have been captured by the movable object 102 during a flight associated with a target object 106, e.g., captured using the image sensor 216.
In some embodiments, the memory 604 also stores templates 650, which may be used for generating edited videos.
In some embodiments, the memory 604 also stores data 660 that have been captured by the movable object 102 during a flight associated with a target object 106, which include videos that have been captured by the movable object 102 during a flight associated with a target object 106. In some embodiments, the data 660 may be organized according to flights 661 (e.g., for each flight route) by the movable object 102. The data for each of the flights 661 may include video data 662, images 663, and/or audio data 664. In some embodiments, the memory 604 further stores, with the video data 662, the images 663, and the audio data 664, tag information 666 (e.g., metadata information) . For example, the video data 662-1 corresponding to flight 1 661-1 may include tag information (e.g., metadata) that identifies the flight path and trajectory mode corresponding to the flight 661-1.
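As a purely illustrative sketch (the field names and values below are hypothetical and not part of the disclosure), the per-flight organization of the data 660 described above could resemble the following structure:

    # Hypothetical per-flight record; reference numerals are shown only in comments.
    flight_661_1 = {
        "flight_id": "661-1",
        "video": ["clip_001.mp4"],          # video data 662-1
        "images": ["frame_0001.jpg"],       # images 663-1
        "audio": ["track_001.aac"],         # audio data 664-1
        "tags": {                           # tag information 666 (metadata)
            "flight_path": "orbit",
            "trajectory_mode": "follow",
        },
    }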
In some embodiments, the memory 604 also stores a web browser 670 (or other application capable of displaying web pages) , which enables a user to communicate over a network with remote computers or devices.
Each of the above identified executable modules, applications, or sets of procedures may be stored in one or more of the memory devices, and corresponds to a set of instructions for performing a function described above. The above identified modules or programs (i.e., sets of instructions) need not be implemented as separate software programs, procedures, or  modules, and thus various subsets of these modules may be combined or otherwise re-arranged in various embodiments. In some embodiments, the memory 604 stores a subset of the modules and data structures identified above. Furthermore, the memory 604 may store additional modules or data structures not described above.
Figure 7A illustrates an exemplary configuration 700 of a movable object 102, a carrier 108, and a payload 110 according to some embodiments. The configuration 700 is used to illustrate exemplary adjustments to an orientation, position, attitude, and/or one or more movement characteristics of the movable object 102, the carrier 108, and/or the payload 110, e.g., as used to perform initialization of target tracking and/or to track a target object 106.
In some embodiments, the movable object 102 rotates around up to three orthogonal axes, such as X1 (pitch) 710, Y1 (yaw) 708 and Z1 (roll) 712 axes. The rotations around the three axes are referred to herein as a pitch rotation 722, a yaw rotation 720, and a roll rotation 724, respectively. Angular velocities of the movable object 102 around the X1, Y1, and Z1 axes are referred to herein as ωX1, ωY1, and ωZ1, respectively. In some embodiments, the movable object 102 engages in  translational movements  728, 726, and 730 along the X1, Y1, and Z1 axes, respectively. Linear velocities of the movable object 102 along the X1, Y1, and Z1 axes (e.g., velocities of the  translational movements  728, 726, and 730) are referred to herein as VX1, VY1, and VZ1, respectively.
In some embodiments, the payload 110 is coupled to the movable object 102 via the carrier 108. In some embodiments, the payload 110 moves relative to the movable object 102 (e.g., the payload 110 is caused by the actuator 204 of the carrier 108 to move relative to the movable object 102) .
In some embodiments, the payload 110 moves around and/or along up to three orthogonal axes, e.g., an X2 (pitch) axis 716, a Y2 (yaw) axis 714, and a Z2 (roll) axis 718. The X2, Y2, and Z2 axes are parallel to the X1, Y1, and Z1 axes respectively. In some embodiments, where the payload 110 includes the imaging device 214 (e.g., an optical module 702) , the roll axis Z2 718 is substantially parallel to an optical path or optical axis for the optical module 702. In some embodiments, the optical module 702 is optically coupled to the image sensor 216 (and/or one or more sensors of the movable object sensing system 122) .  In some embodiments, the carrier 108 causes the payload 110 to rotate around up to three orthogonal axes, X2 (pitch) 716, Y2 (yaw) 714 and Z2 (roll) 718, e.g., based on control instructions provided to the actuator 204 of the carrier 108. The rotations around the three axes are referred to herein as the pitch rotation 734, yaw rotation 732, and roll rotation 736, respectively. The angular velocities of the payload 110 around the X2, Y2, and Z2 axes are referred to herein as ωX2, ωY2, and ωZ2, respectively. In some embodiments, the carrier 108 causes the payload 110 to engage in  translational movements  740, 738, and 742, along the X2, Y2, and Z2 axes, respectively, relative to the movable object 102. The linear velocity of the payload 110 along the X2, Y2, and Z2 axes is referred to herein as VX2, VY2, and VZ2, respectively.
In some embodiments, the movement of the payload 110 may be restricted (e.g., the carrier 108 restricts movement of the payload 110, e.g., by constricting movement of the actuator 204 and/or by lacking an actuator capable of causing a particular movement) .
In some embodiments, the movement of the payload 110 may be restricted to movement around and/or along a subset of the three axes X2, Y2, and Z2 relative to the movable object 102. For example, the payload 110 is rotatable around the X2, Y2, and Z2 axes (e.g., the movements 732, 734, 736) or any combination thereof, while the payload 110 is not movable along any of the axes (e.g., the carrier 108 does not permit the payload 110 to engage in the movements 738, 740, 742) . In some embodiments, the payload 110 is restricted to rotation around one of the X2, Y2, and Z2 axes. For example, the payload 110 is only rotatable about the Y2 axis (e.g., rotation 732) . In some embodiments, the payload 110 is restricted to rotation around only two of the X2, Y2, and Z2 axes. In some embodiments, the payload 110 is rotatable around all three of the X2, Y2, and Z2 axes.
In some embodiments, the payload 110 is restricted to movement along the X2, Y2, or Z2 axis (e.g., the movements 738, 740, or 742) , or any combination thereof, and the payload 110 is not rotatable around any of the axes (e.g., the carrier 108 does not permit the payload 110 to engage in the movements 732, 734, or 736) . In some embodiments, the payload 110 is restricted to movement along only one of the X2, Y2, and Z2 axes. For example, movement of the payload 110 is restricted to the movement 740 along the X2 axis. In some embodiments, the payload 110 is restricted to movement along only two of the X2, Y2, and Z2 axes. In some embodiments, the payload 110 is movable along all three of the X2, Y2, and Z2 axes.
In some embodiments, the payload 110 is able to perform both rotational and translational movement relative to the movable object 102. For example, the payload 110 is able to move along and/or rotate around one, two, or three of the X2, Y2, and Z2 axes.
In some embodiments, the payload 110 is coupled to the movable object 102 directly without the carrier 108, or the carrier 108 does not permit the payload 110 to move relative to the movable object 102. In some embodiments, the attitude, position and/or orientation of the payload 110 is fixed relative to the movable object 102 in such cases.
In some embodiments, adjustment of attitude, orientation, and/or position of the payload 110 is performed by adjustment of the movable object 102, the carrier 108, and/or the payload 110, such as an adjustment of a combination of two or more of the movable object 102, the carrier 108, and/or the payload 110. For example, a rotation of 60 degrees around a given axis (e.g., the yaw axis) for the payload 110 is achieved by a 60-degree rotation by the movable object 102 alone, a 60-degree rotation by the payload 110 relative to the movable object 102 as effectuated by the carrier 108, or a combination of a 40-degree rotation by the movable object 102 and a 20-degree rotation by the payload 110 relative to the movable object 102.
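The split between the movable object 102 and the carrier 108 can be illustrated with a short sketch; the function name and the share parameter below are hypothetical and are shown only to make the 60-degree example concrete.

    def split_rotation(total_deg, vehicle_share=2.0 / 3.0):
        """Split a desired payload rotation between the movable object and the
        carrier. vehicle_share is a hypothetical tuning parameter; e.g., a
        60-degree target with vehicle_share=2/3 yields 40 degrees for the
        movable object and 20 degrees for the carrier-driven payload rotation."""
        vehicle_deg = total_deg * vehicle_share
        carrier_deg = total_deg - vehicle_deg
        return vehicle_deg, carrier_deg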
In some embodiments, a translational movement for the payload 110 is achieved via adjustment of the movable object 102, the carrier 108, and/or the payload 110 such as an adjustment of a combination of two or more of the movable object 102, carrier 108, and/or the payload 110. In some embodiments, a desired adjustment is achieved by adjustment of an operational parameter of the payload 110, such as an adjustment of a zoom level or a focal length of the imaging device 214.
Figure 7B illustrates movement of a movable object 102 with respect to a pitch axis 760, a roll axis 762, and/or a yaw axis 764 according to some embodiments. In some embodiments, the movable object 102 comprises a front 754 (e.g., a front end) , a top 752, and a side 756. In some embodiments, the front 754 of the movable object 102 is also known as the nose of the movable object 102. For example, in some embodiments, the front 754 (e.g., a front end) corresponds to the portion of the movable object 102 that is facing the target object 106 as the movable object 102 flies along a flight route (e.g., as the movable object 102 travels toward the target object 106) . In some embodiments, the movable object 102 resembles the shape of an aircraft, and the side 756 corresponds to the position of one of the wings of the aircraft (e.g., movable object 102) , with the other wing positioned on the other side of the aircraft that is opposite to the side 756. In this example, the pitch axis 760 corresponds to the axis that runs from one wing to the other wing.
Figure 7B (ii) illustrates movement (e.g., rotation) of the movable object 102 about the pitch axis 760. In this example, the direction of the pitch axis 760 is pointing out of the plane of the paper. Figure 7B (ii) illustrates that when the movable object 102 rotates about the pitch axis 760, the front 754 of the movable object 102 (e.g., a nose of the movable object 102) moves (e.g., rotates) up or down about the pitch axis 760. In some embodiments, the pitch axis 760 is also known as a transverse axis.
Figure 7B (iii) illustrates movement (e.g., rotation) of the movable object 102 about the roll axis 762. The roll axis 762 runs from the back to the front 754 of the movable object 102. In this example, the direction of the roll axis 762 is pointing out of the plane of the paper. Figure 7B (iii) illustrates that when the movable object 102 rotates about the roll axis 762, the body of the movable object 102 rotates side to side, about the roll axis 762. In some embodiments, the roll axis 762 is also known as a longitudinal axis.
Figure 7B (iv) illustrates movement (e.g., rotation) of the movable object 102 about the yaw axis 764. The yaw axis 764 runs from the bottom to the top 752 of the movable object 102. In this example, the direction of the yaw axis 764 is pointing out of the plane of the paper. Figure 7B (iv) illustrates that when the movable object 102 rotates about the yaw axis 764, the front 754 (e.g., the nose) of the movable object 102 moves (e.g., rotates) from side to side (e.g., towards the side 756 or away from the side 756) with respect to the yaw axis 764. In some embodiments, the yaw axis 764 is also known as a vertical axis.
Figure 8 illustrates an exemplary operating environment 800 according to some embodiments. In some embodiments, and as illustrated in Figure 8, the operating environment comprises a movable object 102, an electronic device 810, and a target object 106. The movable object 102 is communicatively connected to the electronic device 810 via communication network (s) 802. In some embodiments, the electronic device 810 is a user-operated device. In some embodiments, the electronic device 810 is a controller device (e.g., the control unit 104, and/or any remote controller device that is used operatively with a drone) for controlling a flight (e.g., a flight path, a flight route etc. ) of the movable object 102, such as a speed, trajectory, elevation, attitude, direction, and/or rotation etc. of the movable object 102.
In some embodiments, the electronic device 810 includes a first component 820 and a second component 830. In the example of Figure 8, the electronic device 810 (e.g., the first component 820) comprises a hand-held device (e.g., hand-held component) that is held (e.g., attached to, coupled to) using a hand, a palm, and/or fingers of the user. In some embodiments, the first component 820 includes an input interface (e.g., input interface 910, Figure 9) that enables input of user instructions. For example, the input interface can include input buttons (e.g., buttons 912, Figure 9) , a display interface (e.g., touchscreen interface 914, Figure 9) , a joystick, control knobs, and/or an audio input interface etc.
In some embodiments, the second component 830 comprises a wearable component. For example, the second component 830 comprises a wristband-like structure that is worn on a wrist or an arm (e.g., a forearm) of the user, for detecting arm/wrist movement as the user is controlling the first component. In some embodiments, the second component 830 is worn on the same arm that is used to hold the first component 820. In some embodiments, the second component 830 is worn on an arm that is different from the arm used to hold the first component 820.
In some embodiments, the first component 820 and the second component 830 are communicatively connected, for example via wireless signals such as Bluetooth, WiFi, and/or other wireless signals. In some embodiments, the first component 820 and the second component 830 are not physically connected to each other (e.g., they are physically decoupled from each other) . In some embodiments, the first component 820 and the second component 830 are communicatively connected via signals that are transmitted using a hard-wired cable. In some embodiments, the first component 820 and the second  component 830 are physically connected to each other (e.g., via a cable) . In some embodiments, the first component 820 and the second component 830 are components of two distinct electronic devices that are communicatively connected to each other (e.g., via Bluetooth, WiFi, other cellular connections, or a wired cable connection etc. ) .
As illustrated in Figure 8, the first component 820 includes a first sensor 840. The second component 830 includes a second sensor 850. In some embodiments, the first sensor 840 detects (e.g., senses and measures) a first user movement from the fingers, palm, hand (e.g., fingers, thumb and/or palm) etc. of the user. The first user movement may correspond to a user instruction to control a flight of the movable object 102 (e.g., a speed, trajectory, elevation, direction, rotation, and/or attitude etc. of the movable object 102) . In some embodiments, the first user movement may correspond to a user instruction to control a flight path or a flight route of the movable object 102. For example, in some embodiments, the first user movement can comprise user activation of one or more input controls (e.g., buttons, knobs, joystick etc. ) on the input interface of the electronic device 810 using one or more fingers of the user, which are detected by the first sensor 840. In some embodiments, the first user movement comprises user movement (e.g., user rotation) of the first component 820 (e.g., waving the first component 820 or gesturing using the first component 820) in a certain direction and/or with a certain speed, which are detected by the first sensor 840.
In some embodiments, the second sensor 850 detects a second user movement. In some embodiments, the second user movement comprises movement from a wrist, forearm, elbow, arm etc. of the user. In some embodiments, the second user movement comprises movements from the user due to (e.g., because of) the first user movement. For example, the second user movement may comprise natural movements (e.g., unexpected or inevitable movements) from the wrist and/or forearm of the user when the user activates the input controls on the first component 820 using the user’s fingers. As another example, the second user movement may also comprise natural movements (e.g., unexpected or inevitable movements) from the wrist, forearm and/or elbow of the user when the user moves (e.g., waves) the first component 820 to control the flight of the movable object 102. In some embodiments, the second user movement causes the user instruction (e.g., from the first user movement) to be over-amplified. In some embodiments, the second user movement causes the user instruction (e.g., from the first user movement) to be understated.
In some embodiments, the electronic device 810 determines one or more parameters associated with the user instruction to control the flight of the movable object 102 based on an interaction between the first user movement and the second user movement. Details of the interaction between the first user movement and the second user movement, and the user instruction are described in more detail in Figures 10, 11, and 13.
In some embodiments, the one or more parameters include a velocity (e.g., speed) of the movable object 102 (e.g., having units of meters per second, miles per hour, kilometers per hour etc. ) , or a speed setting for the movable object 102 (e.g., a low speed, a medium speed, or a high-speed setting) . In some embodiments, each of the speed settings corresponds to an actual speed (or a range of speeds) of the movable object 102, and is predetermined by a manufacturer of the movable object 102. In some embodiments, the one or more parameters include a trajectory, an attitude, an elevation, a flight direction, and/or a rotation (e.g., an angular rotation) of the movable object 102. In some embodiments, the one or more parameters comprise a pitch, yaw, and/or roll value (e.g., pitch, roll and/or yaw angles) . In some embodiments, the one or more parameters comprise sensor values that are measured by the first sensor and/or the second sensor. In some embodiments, the one or more parameters include a pitch /roll and/or yaw of a gimbal (e.g., the carrier 108) that is attached to the movable object 102.
In some embodiments, the electronic device 810 transmits to the movable object 102 a wireless signal 860 that is based on (e.g., that includes) the one or more parameters. The movable object 102 is configured to adjust the flight of the movable object 102 in accordance with the one or more parameters.
Figure 9 is a block diagram illustrating a representative electronic device 810 according to some embodiments.
In some embodiments, the electronic device 810 includes one or more processor (s) 902, one or more communication interface (s) 904 (e.g., network interface (s) ) , memory 906, and  one or more communication buses 908 for interconnecting these components (sometimes called a chipset) .
In some embodiments, the electronic device 810 includes an input interface 910 that facilitates user input and/or audio input. In some embodiments, the input interface 910 includes microphones, button (s) 912, and/or a touchscreen interface 914.
In some embodiments, the electronic device 810 includes output device (s) 916 that facilitate visual output and/or audio output. In some embodiments, the output device (s) 916 include speaker (s) and/or a display 918.
In some embodiments, the electronic device 810 includes one or more sensors 920, such as the first sensor 840 and the second sensor 850 that are shown in Figure 8.
In some embodiments, the sensors 920 include one or more movement sensors (e.g., accelerometers) , light sensors, time-of-flight (ToF) sensors, positioning sensors (e.g., GPS) , inertial sensors (e.g., an inertial measurement unit (IMU) , a magnetometer etc. ) , and/or audio sensors. In some implementations, the positioning sensors include one or more location sensors (e.g., passive infrared (PIR) sensors) and/or one or more orientation sensors (e.g., gyroscopes) .
In some embodiments, the sensors 920 include an inertial measurement unit (IMU) . In some embodiments, the IMU uses a combination of sensors (e.g., an accelerometer, a gyroscope, and/or a magnetometer) to measure orientation of the electronic device 810 or orientation of a component of the electronic device (e.g., the first component 820 and/or the second component 830) . For example, in some embodiments, the IMU uses a combination of an accelerometer, a gyroscope, and a magnetometer. The accelerometer measures the amount of force (e.g., acceleration) it is experiencing in the X, Y, and Z directions. In some embodiments, the IMU determines a roll value and a pitch value based on the measured acceleration. The gyroscope measures an angular velocity along the X, Y, and Z axes. In some embodiments, the IMU determines an angle by integrating the angular velocity over time, which is used to measure the change in roll, pitch and/or yaw of the electronic device 810. The magnetometer measures magnetism. The magnetometer determines an orientation using the earth’s magnetic field. In some embodiments, the X, Y, and Z magnetometer readings are used to calculate yaw.
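The computations described above are standard attitude estimation steps. The sketch below shows one common textbook formulation with hypothetical function names; it is not necessarily the exact computation performed by the IMU in the embodiments, and the sign conventions depend on the axis definitions of the particular device.

    import math

    def roll_pitch_from_accel(ax, ay, az):
        """Estimate roll and pitch (radians) from accelerometer readings,
        assuming the device is quasi-static so that gravity dominates."""
        roll = math.atan2(ay, az)
        pitch = math.atan2(-ax, math.sqrt(ay * ay + az * az))
        return roll, pitch

    def yaw_from_magnetometer(mx, my, mz, roll, pitch):
        """Tilt-compensated heading (yaw, radians) from magnetometer readings."""
        xh = mx * math.cos(pitch) + mz * math.sin(pitch)
        yh = (mx * math.sin(roll) * math.sin(pitch) + my * math.cos(roll)
              - mz * math.sin(roll) * math.cos(pitch))
        return math.atan2(-yh, xh)

    def integrate_gyro(angle_prev, angular_rate, dt):
        """Update an angle by integrating a gyroscope rate over a time step,
        as used to track changes in roll, pitch, and/or yaw."""
        return angle_prev + angular_rate * dt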
In some embodiments, the electronic device 810 includes radios 930. The radios 930 enable communication over one or more communication networks, and allow the electronic device 810 to communicate with other devices, such as the movable object 102. In some implementations, the radios 930 are capable of data communications using any of a variety of custom or standard wireless protocols (e.g., IEEE 802.15.4, Wi-Fi, ZigBee, 6LoWPAN, Thread, Z-Wave, Bluetooth Smart, ISA100.5A, WirelessHART, MiWi, Ultrawide Band (UWB) , software defined radio (SDR) etc. ) , custom or standard wired protocols (e.g., Ethernet, HomePlug, etc. ) , and/or any other suitable communication protocol, including communication protocols not yet developed as of the filing date of this document.
In some embodiments, the electronic device 810 includes a clock 922. In some embodiments, the clock 922 synchronizes (e.g., coordinates) time with the clock 152 of the movable object 102.
The memory 906 includes high-speed random access memory, such as DRAM, SRAM, DDR RAM, or other random access solid state memory devices; and, optionally, includes non-volatile memory, such as one or more magnetic disk storage devices, one or more optical disk storage devices, one or more flash memory devices, or one or more other non-volatile solid state storage devices. The memory 906, optionally, includes one or more storage devices remotely located from one or more processor (s) 902. The memory 906, or alternatively the non-volatile memory within the memory 906, includes a non-transitory computer-readable storage medium. In some implementations, the memory 906, or the non-transitory computer-readable storage medium of the memory 906, stores the following programs, modules, and data structures, or a subset or superset thereof:
● operating logic 932 including procedures for handling various basic system services and for performing hardware dependent tasks;
● a radio communication module 934 for connecting to and communicating with other network devices (e.g., a local network, such as a router that provides Internet connectivity, networked storage devices, network routing devices, server systems, the movable object 102, etc. ) coupled to one or more communication networks 802 via one or more communication interfaces 904 (wired or wireless) ;
● a positioning module 936 for determining a position (e.g., positional coordinates) of the electronic device 810;
● device data 938 for the electronic device 810, including but not limited to:
○ device settings 940 for the electronic device 810, such as default options and preferred user settings;
○ user settings 942, such as a proficiency level of the user (e.g., beginner user /low proficiency, medium proficiency, and/or expert user /high proficiency) ;
○ sensor data 944 that are measured from the sensors 920. In some embodiments, the sensor data 944 includes data from the first sensor 840 and data from the second sensor 850;
○ weights 946 that are assigned to the sensor data 944; and
○ mapping data 948 that comprises mapping relationships between user movements (e.g., movements detected by the electronic device 810) and corresponding pitch, roll, and/or yaw values for the movable object 102.
● a computation module 950 for translating the user movements into corresponding flight instructions (e.g., instructions to the movable object 102) . In some embodiments, the computation module 950 computes one or more parameters for controlling the movable object 102 based on the sensor data 944, such as an adjusted pitch value, an adjusted yaw value, and/or an adjusted roll value that is based on a combination of the measurements from the sensors 920, including the first sensor 840 and the second sensor 850.
Each of the above identified modules is optionally stored in one or more of the memory devices described herein, and corresponds to a set of instructions for performing the functions described above. The above identified modules or programs need not be implemented as separate software programs, procedures, modules or data structures, and thus various subsets of these modules may be combined or otherwise re-arranged in various implementations. In some embodiments, the memory 906 stores a subset of the modules and data structures identified above. Furthermore, the memory 906, optionally, stores additional modules and data structures not described above (e.g., a microphone module for obtaining and/or analyzing audio signals in conjunction with microphone input devices, a module for voice detection and/or speech recognition in a voice-enabled smart speaker) . In some embodiments, a subset of the programs, modules, and/or data stored in the memory 906 are stored on and/or executed by a server system (e.g., computing device 126) .
Figure 10 illustrates an electronic device according to some embodiments. Figure 10A illustrates the electronic device 810 that includes the first component 820 and the second component 830.
Figure 10B illustrates a first component 820 of the electronic device 810 according to some embodiments. The first component 820 includes the first sensor 840 (e.g., sensors 920, Figure 9) . In some embodiments, the first sensor 840 is an orientation sensor. In some other embodiments, the first sensor 840 is an IMU that comprises a combination of one or more of: an accelerometer, a gyroscope and a magnetometer. In yet other embodiments, the first sensor 840 determines a first roll value 1002, a first pitch value 1012, and/or a first yaw value 1022 from the measured sensor data.
In some embodiments, the first sensor 840 detects user interaction (s) (e.g., movements and/or rotations) with the first component 820 and maps the user interaction (s) to a corresponding movement and/or rotation of the movable object 102.
For example, in some embodiments, the first component 820 is a handheld component and includes a front and a back. The front and the back of the first component 820 directly  maps to the front and the back of the movable object 102 (see, e.g., Figure 7B) . In some embodiments, a movement of the first component 820 is directly mapped to a corresponding movement (e.g., rotation) of the movable object 102. For example, in some embodiments, a user moves the first component 820 in a forward or backward direction (e.g., by flexing her wrist toward the front or back of the first component 820) . The forward or backward movement corresponds to a first pitch value 1012 (e.g., a first pitch angle) . In some embodiments, the user moves the first component 820 from side to side (e.g., by rotating her wrist with respect to a long axis that is formed by her arm) . The side-to-side movement corresponds to a first roll value 1002 (e.g., a first roll angle) . In some embodiments, the user may also rotate the first component 820 with respect to a vertical axis of the first component 820. In this instance, the movement corresponds to the first yaw value 1022.
Figure 10C illustrates a second component 830 of the electronic device 810 according to some embodiments. In some embodiments, the second component 830 is a wearable component that is configured to be worn by a user (e.g., on a wrist or a forearm of the user) . The second component 830 includes the second sensor 850 (e.g., sensors 920, Figure 9) . In some embodiments, the second sensor 850 is an orientation sensor. In some other embodiments, the second sensor 850 is an IMU that comprises a combination of one or more of: an accelerometer, a gyroscope, and a magnetometer. In yet other embodiments, the second sensor 850 determines a second roll value 1004, a second pitch value 1014, and/or a second yaw value 1024 from the second sensor data.
In some embodiments, the second sensor 850 detects user movements that may arise due to (e.g., because of) the user interaction (s) with the first component 820. For example, when the user moves the first component 820 in a forward or backward direction (e.g., by flexing her wrist) , the user may also naturally move her arm. In some embodiments, the arm movement is detected by the second sensor 850.
In some embodiments, the electronic device 810 uses a combination of sensor data from the first sensor 840 and the second sensor 850 (e.g., a combination of the first roll value 1002, the first pitch value 1012, the first yaw value 1022, the second roll value 1004, the second pitch value 1014, and/or the second yaw value 1024) to control the movable object 102.
For example, in some embodiments, the electronic device 810 selects either the first roll value 1002 or the second roll value 1004 as the roll value of the movable object 102. The electronic device 810 selects (e.g., uses) either the first pitch value 1012 or the second pitch value 1014 as the pitch value of the movable object 102. The electronic device 810 also selects either the first yaw value 1022 or the second yaw value 1024 as the yaw value of the movable object 102.
In some embodiments, the electronic device 810 determines an adjusted roll value (e.g., a total roll value) that is based on a combination (e.g., addition or subtraction) of the first roll value 1002 and the second roll value 1004 (e.g., Roll_total = Roll 1002 + Roll 1004, or Roll_total = Roll 1002 - Roll 1004) .
In some embodiments, the electronic device 810 determines an adjusted pitch value (e.g., a total pitch value) that is based on a combination (e.g., addition or subtraction) of the first pitch value 1012 and the second pitch value 1014 (e.g., Pitch_total = Pitch 1012 + Pitch 1014, or Pitch_total = Pitch 1012 - Pitch 1014) .
In some embodiments, the electronic device 810 determines an adjusted yaw value (e.g., a total yaw value) that is based on a combination (e.g., addition or subtraction) of the first yaw value 1022 and the second yaw value 1024 (e.g., Yaw_total = Yaw 1022 + Yaw 1024, or Yaw_total = Yaw 1022 - Yaw 1024) .
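A minimal sketch of such a combination is shown below; the function name and dictionary layout are hypothetical and are used only to illustrate adding or subtracting the second-sensor readings from the first-sensor readings.

    def adjusted_angles(first, second, subtract=True):
        """Combine angles measured by the first sensor (handheld component) and
        the second sensor (wearable component). first and second are dicts with
        'roll', 'pitch', and 'yaw' keys; subtracting the wearable reading removes
        unintended arm movement, while adding it accumulates both contributions."""
        sign = -1.0 if subtract else 1.0
        return {axis: first[axis] + sign * second[axis]
                for axis in ("roll", "pitch", "yaw")}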
In some embodiments, the electronic device 810 transmits to the movable object 102 one or more parameters that includes one or more of the adjusted roll value, the adjusted pitch value, and the adjusted yaw value.
In a first example, the one or more parameters include an adjusted pitch value based on subtracting the second pitch value from the first pitch value (e.g., Pitch_total = Pitch 1012 - Pitch 1014) . In the first example, the one or more parameters also include the first roll value and the first yaw value (e.g., Roll_total = Roll 1002 and Yaw_total = Yaw 1022) . In some embodiments, the first example depicts a first scenario whereby a user generally raises or lowers her arm naturally when controlling the up and down movement of the movable object 102. Accordingly, the first pitch value (e.g., Pitch 1012) that is measured by the first sensor 840 may be larger than an intended pitch control value for the movable object 102 because it includes the effects of the user arm movement. In some embodiments, subtracting the second pitch value from the first pitch value (e.g., Pitch_total = Pitch 1012 - Pitch 1014) accounts for measurements (e.g., errors or inaccuracies) due to the user arm movement. Accordingly, the adjusted pitch value (e.g., Pitch_total) more accurately reflects the true pitch value that is intended for the movable object 102. In the scenario described in the first example, by taking into account user movements that are not intended for controlling the movable object 102 and removing these effects through an adjusted pitch value, the flight of the movable object 102 can be more accurately determined. As a result, the user experience is enhanced.
In a second example, the one or more parameters include an adjusted pitch value that is determined by subtracting the second pitch value from the first pitch value (e.g., Pitch_total = Pitch 1012 - Pitch 1014) . The one or more parameters also include an adjusted yaw value that is determined by subtracting the second yaw value from the first yaw value (e.g., Yaw_total = Yaw 1022 - Yaw 1024) . The one or more parameters also include the second roll value (e.g., Roll_total = Roll 1004) . In some embodiments, the second example depicts a second scenario whereby a user raises or lowers her arm naturally (e.g., unintentionally) when controlling an upward or downward movement (e.g., a pitch) of the movable object 102 and therefore the first pitch value 1012 may be over-amplified. Furthermore, the first yaw value (e.g., Yaw 1022) may be larger than the actual yaw value that is intended for the movable object 102 because it includes the effects of user body movement when the user is manipulating the remote controller (e.g., the first component 820) .
In some embodiments, the electronic device 810 uses a weighted combination (e.g., a weighted sum or weighted aggregation) of sensor values from the first sensor 840 and the second sensor 850 (e.g., a weighted combination of the first roll value 1002, the first pitch value 1012, the first yaw value 1022, the second roll value 1004, the second pitch value 1014, and/or the second yaw value 1024) to control the movable object 102.
In some embodiments, the electronic device 810 determines an adjusted roll value (e.g., a total roll value) that is based on a weighted combination of the first roll value 1002 and the second roll value 1004. For example, in some embodiments, Roll_total = W_A × Roll 1002 + W_B × Roll 1004, or Roll_total = W_C × Roll 1002 - W_D × Roll 1004, wherein W_A, W_B, W_C, and W_D are respective weights assigned to the first roll value 1002 and the second roll value 1004, and each of the weights W_A, W_B, W_C, and W_D has a respective value between 0% and 100%, inclusive.
In some embodiments, the electronic device 810 determines an adjusted pitch value (e.g., a total pitch value) that is based on a weighted combination of the first pitch value 1012 and the second pitch value 1014. For example, in some embodiments, Pitch_total = W_E × Pitch 1012 + W_F × Pitch 1014, or Pitch_total = W_G × Pitch 1012 - W_H × Pitch 1014, wherein W_E, W_F, W_G, and W_H are respective weights assigned to the first pitch value 1012 and the second pitch value 1014, and each of the weights W_E, W_F, W_G, and W_H has a respective value between 0% and 100%, inclusive.
In some embodiments, the electronic device 810 determines an adjusted yaw value (e.g., a total yaw value) that is based on a weighted combination of the first yaw value 1022 and the second yaw value 1024. For example, in some embodiments, Yaw_total = W_I × Yaw 1022 + W_J × Yaw 1024, or Yaw_total = W_K × Yaw 1022 - W_L × Yaw 1024, wherein W_I, W_J, W_K, and W_L are respective weights assigned to the first yaw value 1022 and the second yaw value 1024, and each of the weights W_I, W_J, W_K, and W_L has a respective value between 0% and 100%, inclusive.
In some embodiments, the electronic device 810 transmits to the movable object 102 one or more parameters that include one or more of the adjusted roll value, the adjusted pitch value, and the adjusted yaw value. As an example, the one or more parameters include an adjusted pitch value based on a weighted combination of the first pitch value 1012 and the second pitch value 1014. For instance, an adjusted pitch value can be Pitch_total = 100% × Pitch 1012 - 10% × Pitch 1014. In some embodiments, the one or more parameters also include an adjusted roll value that is based on a weighted combination of the first roll value 1002 and the second roll value 1004. For instance, the adjusted roll value can be Roll_total = 90% × Roll 1002 + 10% × Roll 1004. In some embodiments, the one or more parameters also include an adjusted yaw value that is based on a weighted combination of the first yaw value 1022 and the second yaw value 1024. In some instances, one of the weights is zero. For example, Yaw_total = 100% × Yaw 1022 + 0% × Yaw 1024 (e.g., Yaw_total = Yaw 1022 in this example) .
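By way of illustration, the following sketch computes such weighted combinations; the helper name and the sample readings are hypothetical, and the weights simply mirror the percentages given in the examples above.

    def weighted_total(first_value, second_value, w_first, w_second, subtract=False):
        """Weighted combination of a first-sensor value and a second-sensor value.
        Weights are fractions in [0, 1]; with subtract=True the weighted second
        value is removed from the weighted first value."""
        sign = -1.0 if subtract else 1.0
        return w_first * first_value + sign * w_second * second_value

    # Illustrative sensor readings (degrees); the values are arbitrary.
    pitch_1012, pitch_1014 = 12.0, 3.0
    roll_1002, roll_1004 = 5.0, 1.0
    yaw_1022, yaw_1024 = 20.0, 4.0

    pitch_total = weighted_total(pitch_1012, pitch_1014, 1.00, 0.10, subtract=True)  # 11.7
    roll_total = weighted_total(roll_1002, roll_1004, 0.90, 0.10)                    # 4.6
    yaw_total = weighted_total(yaw_1022, yaw_1024, 1.00, 0.00)                       # 20.0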
In some embodiments, the electronic device 810 determines (e.g., assigns) relative weights (e.g., weights 946, Figure 9) in the weighted combination according to a level of proficiency of the user (e.g., familiarity of the user) in using the electronic device 810 to control the movable object 102. For example, in some embodiments, before a user starts to interact with the electronic device 810, the user may be asked (e.g., via the display 918) to input a level of proficiency in controlling the movable object 102. If the user indicates that she is relatively new to UAV technology and has little or no experience in using or controlling a movable object 102, the electronic device 810 can assign lower weights to values from the second sensor 850 and assign higher weights to the values from the first sensor 840. In one instance, the electronic device 810 can determine an adjusted yaw value to be Yaw_total = 50% × Yaw 1022 (e.g., the electronic device assigns a weight of zero to the second yaw value 1024) . In the same instance, the electronic device 810 may determine an adjusted pitch value as Pitch_total = 100% × Pitch 1012 - 10% × Pitch 1014. The adjusted pitch value comprises a subtraction of the weighted second pitch value from the weighted first pitch value because the wrist of the user may move in the same direction as the direction of the Pitch 1012, thus causing the pitch value to be amplified. In the same instance, the electronic device 810 may also determine an adjusted roll value as Roll_total = 80% × (80% × Roll 1002 + 10% × Roll 1004) . In this instance, the hand movement (e.g., movement of the first component 820) in the roll direction can lead to corresponding movement of the wrist/forearm in the same roll direction. Accordingly, a smaller weight of 10% is assigned to the Roll 1004 value. In some embodiments, the electronic device 810 may assign a weight of zero to the Roll 1004 value.
In some circumstances, if a user indicates that she is an expert user of the movable object 102, the electronic device 810 may determine an adjusted yaw value and an adjusted pitch value that are largely based on the readings of the first sensor 840. For example, in some instances, the electronic device 810 can determine an adjusted yaw value to be Yaw_total = 100% × Yaw 1022 and an adjusted pitch value to be Pitch_total = 100% × Pitch 1012 - 10% × Pitch 1014. The electronic device 810 can also determine an adjusted roll value that includes higher weights for the first and second roll values, e.g., Roll_total = 90% × Roll 1002 + 50% × Roll 1004.
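One possible, purely illustrative way to organize such proficiency-dependent weights is a small lookup table, as sketched below. The table name, keys, and the convention that the second pitch weight is later subtracted are assumptions made for this example; the numbers simply echo the beginner and expert examples above.

    # Hypothetical (first-sensor, second-sensor) weight pairs per axis.
    WEIGHTS_BY_PROFICIENCY = {
        "beginner": {"yaw": (0.50, 0.00),
                     "pitch": (1.00, 0.10),   # second value is subtracted
                     "roll": (0.64, 0.08)},   # 80% x (80% Roll 1002 + 10% Roll 1004)
        "expert": {"yaw": (1.00, 0.00),
                   "pitch": (1.00, 0.10),     # second value is subtracted
                   "roll": (0.90, 0.50)},
    }

    def weights_for(proficiency, axis):
        """Look up the (first-sensor, second-sensor) weights for a given axis."""
        return WEIGHTS_BY_PROFICIENCY[proficiency][axis]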
Figure 11 illustrates an electronic device 810 according to some embodiments.
In some embodiments, the electronic device includes a first component 820 and a second component 830. For example, the first component 820 comprises a main body of the electronic device 810. In some embodiments, the first component 820 is a handheld component and includes a front region and a back region, as illustrated in Figure 11 (and explained previously in Figure 10) . The first component 820 can include an input interface (e.g., input interface 910) . The first component 820 may also include input button (s) (e.g., button (s) 912) and/or a touchscreen interface (e.g., touchscreen interface 914) in some embodiments. In some embodiments, and as illustrated in Figure 11, the first component includes the first sensor 840.
In some embodiments, the second component 830 comprises a wearable component. For example, the second component 830 can be attached to (e.g., worn on) a wrist or a forearm of the user.
Figure 11 shows that the first component 820 and the second component 830 are mechanically coupled to each other via a link 1110. In some embodiments, the link 1110 comprises a first end 1112 that is rotatably coupled to the first component 820 via a first rotation node 1142. The link 1110 also includes a second end 1114 that is rotatably coupled to the second component 830 via a second rotation node 1144. In this example, each of the first rotation node 1142 and the second rotation node 1144 has rotational degrees of freedom in a respective pitch axis and a respective yaw axis.
In some embodiments, the first rotational node 1142 and the second rotational node 1144 comprise rotational sensors that detect (e.g., sense and measure) rotation in a respective pitch, yaw, and/or roll direction (e.g., axis of rotation) .
For example, the first rotational node 1142 measures a third pitch value 1122 and a third yaw value 1132. The second rotational node 1144 measures a fourth pitch value 1124 and a fourth yaw value 1134.
In some embodiments, the electronic device 810 determines an adjusted pitch value that is based on a combination of the third pitch value 1122 and the fourth pitch value 1124, e.g., Pitch_total = Pitch 1122 + Pitch 1124.
In some embodiments, the electronic device 810 determines an adjusted yaw value that is based on a combination of the third yaw value 1132 and the fourth yaw value 1134, e.g., Yaw_total = Yaw 1132 + Yaw 1134.
In some embodiments, the electronic device 810 determines an adjusted roll value based on the first sensor 840 measurement, e.g., Roll_total = Roll 1002.
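A short sketch of this combination is given below; the function name is hypothetical, and the computation simply sums the two rotation nodes' pitch and yaw readings while taking roll from the first sensor 840, as described above.

    def linked_controller_angles(pitch_1122, yaw_1132, pitch_1124, yaw_1134, roll_1002):
        """Combine the rotation-node measurements of the linked configuration:
        pitch and yaw are the sums of the two nodes' readings, and roll is taken
        from the first sensor on the handheld component."""
        return {
            "pitch": pitch_1122 + pitch_1124,
            "yaw": yaw_1132 + yaw_1134,
            "roll": roll_1002,
        }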
Figures 12A and 12B illustrate representative views of the electronic device 810 according to some embodiments.
Figures 13A-13C illustrate a flowchart for a method 1300 performed (1302) at an electronic device (e.g., the electronic device 810 as described in Figures 8, 9, 10, 11, and 12) according to some embodiments. In some embodiments, the electronic device comprises a controller device for controlling a movable object (e.g., a UAV, a movable object 102 etc. ) .
The electronic device is communicatively connected (1304) (e.g., wirelessly connected through the Internet, cellular networks such as 4G and 5G networks, Bluetooth, etc. ) with an unmanned aerial vehicle (UAV) (e.g., movable object 102) .
The electronic device includes (1306) an input interface (e.g., input interface 910, Figure 9) , a first sensor (e.g., first sensor 840 in Figures 8 and 10 or sensors 920, Figure 9) , a second sensor (e.g., second sensor 850 in Figures 8 and 10 or sensors 920 in Figure 9) , one or more processors (e.g., processor (s) 902, Figure 9) , and memory (e.g., memory 906, Figure 9) . In some embodiments, the input interface may also include input control button (s) (e.g., button (s) 912, Figure 9) and/or a touchscreen interface (e.g., touchscreen interface 914, Figure 9) .
In some embodiments, the first sensor is (1308) an inertial measurement unit (IMU) that is attached to (e.g., mounted on and/or embedded in) the electronic device or to a component of the electronic device. For example, the first sensor measures and reports a force, angular rate, and/or orientation of the electronic device to which it is attached. In some embodiments, the first sensor comprises a combination of one or more of: an accelerometer, a gyroscope, and a magnetometer.
In some embodiments, the second sensor is an inertial measurement unit (IMU) that is attached to (e.g., mounted on and/or embedded in) the electronic device or to a component of the electronic device. For example, the second sensor measures and reports a force, angular rate, and/or orientation of the electronic device to which it is attached. In some embodiments, the second sensor comprises a combination of one or more of: an accelerometer, a gyroscope, and a magnetometer.
The memory stores one or more programs and/or instructions that are executed by the one or more processors.
In some embodiments, the electronic device includes (1310) a first component (e.g., first component 820) and a second component (e.g., second component 830) . In some embodiments, the first component 820 and the second component 830 are communicatively connected to each other (e.g., via Bluetooth, WiFi, other wireless signals, or via a hard-wired cable etc. ) .
For example, in some embodiments, the first component comprises a handheld component. In some embodiments, the first component includes the input interface. In some embodiments, the second component comprises a wearable component, such as a wristband structure that is worn on (e.g., attached to) a wrist or a forearm of the user. In some embodiments, the second component includes a sensor (e.g., the second sensor) for detecting movement (e.g., wrist and/or forearm movement, or elbow movement) as the user interacts with the first component. In some embodiments, the first component comprises a handheld component and the second component comprises a wearable component that is worn on the same arm that is used to hold the first component. In some embodiments, the first component comprises a handheld component and the second component is a wearable component that is worn on an arm different from the arm used to hold the first component. In some embodiments, the first component is utilized by a first user and the second component is utilized by a second user, distinct from the first user. In some embodiments, the second component is worn on a finger, a head, a leg, or a foot of the first user or the second user. In some embodiments, the first component and the second component are components of two distinct electronic devices that are communicatively connected to each other (e.g., via Bluetooth, WiFi, other cellular connections, or a wired cable connection etc. ) .
In some embodiments, the electronic device detects (1312) a first user movement from the first sensor attached to a first body portion (e.g., fingers, palm, hand (e.g., fingers, thumb, and palm) ) of a user. The electronic device also detects a second user movement from the second sensor attached to a second body portion (e.g., wrist, forearm, arm, elbow etc. ) of the user. The first user movement represents (1314) a user instruction to control a flight of the UAV (e.g., movable object 102) . For example, in some embodiments, the first user movement represents a user instruction to control a speed, trajectory, elevation, attitude, direction, and/or rotation etc. of the UAV. In some embodiments, the UAV flies (e.g., autonomously and/or by executing a flight path) in accordance with the user instruction. In some embodiments, the first user movement represents a user instruction to control a flight path of the UAV, such as a starting point (e.g., position and/or location of the UAV) , and/or an ending point (e.g., position and/or location of the UAV) , and/or a flight route of the UAV. The second body portion is connected (1316) to the first body portion.
The electronic device determines (1318) one or more parameters associated with the user instruction to control the flight of the UAV based on an interaction between the first user movement and the second user movement. For example, in some embodiments, the one or more parameters include a velocity (e.g., speed) of the UAV (e.g., having units of meters per second, miles per hour, kilometers per hour etc.), or a speed setting for the UAV (e.g., a low speed, a medium speed, or a high speed setting). In some embodiments, each of the speed settings corresponds to an actual speed (or a range of speeds) of the UAV that is predetermined by a manufacturer of the UAV. In some embodiments, the one or more parameters include a trajectory, an attitude, an elevation, a flight direction, and/or a rotation (e.g., an angular rotation) of the UAV. In some embodiments, the one or more parameters comprise a pitch, yaw, and/or roll value (e.g., pitch, roll, and/or yaw angles). In some embodiments, the one or more parameters comprise sensor values that are measured by the first sensor and/or the second sensor. In some embodiments, the one or more parameters include a pitch, roll, and/or yaw of a gimbal (e.g., a carrier 108) that is attached to the UAV (e.g., movable object 102).
In some embodiments, the electronic device transmits (1320) to the UAV a wireless signal that is based on (e.g., includes) the one or more parameters.
In some embodiments, the UAV is configured to adjust (1322) the flight of the UAV in accordance with the one or more parameters.
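The detect-determine-transmit flow of steps (1312) through (1322) can be summarized with a brief sketch. The Python fragment below is illustrative only; the function names, the numeric sensor readings, and the dictionary layout of the parameters are hypothetical placeholders and not part of the disclosed embodiments.

    # Hypothetical sketch of the detect -> determine -> transmit flow.
    def read_first_sensor():
        # Placeholder: would return (pitch, roll, yaw) measured by the first sensor
        # attached to the first body portion (e.g., the hand).
        return (6.0, 2.0, 3.0)

    def read_second_sensor():
        # Placeholder: would return (pitch, roll, yaw) measured by the second sensor
        # attached to the second body portion (e.g., the wrist or forearm).
        return (1.5, 0.5, 1.0)

    def determine_parameters(first, second):
        # Placeholder combination of the two readings; specific combinations
        # (sums, differences, weighted combinations) are discussed further below.
        return {"pitch": first[0] - second[0], "roll": first[1], "yaw": first[2]}

    def transmit_to_uav(parameters):
        # Placeholder: would encode the parameters into a wireless signal to the UAV.
        print("transmit", parameters)

    transmit_to_uav(determine_parameters(read_first_sensor(), read_second_sensor()))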
In some embodiments, the electronic device detects the first user movement and the second user movement simultaneously. In some embodiments, the electronic device detects the first user movement and the second user movement over a predefined time window (e.g., within 5 seconds, 10 seconds, 20 seconds, 30 seconds etc. of each other) . In some embodiments, the second user movement comprises movements from the user due to (e.g., because of) the first user movement. For example, the second user movements comprise natural movements (e.g., unexpected, inevitable, unintentional movements) of the second body portion during the first user movement. In some embodiments, the second user movement over-amplifies (e.g., exaggerates) the user instruction to control the flight of the UAV. In some embodiments, the second user movement counteracts (e.g., reduces) the user instruction to control the flight of the UAV.
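A time-window check of the kind described above could be sketched as follows; the window length, the timestamp units (seconds), and the function name are assumptions made only for illustration.

    TIME_WINDOW_SECONDS = 5.0  # e.g., one of the predefined windows mentioned above

    def detected_within_window(first_movement_time, second_movement_time,
                               window=TIME_WINDOW_SECONDS):
        # Treat the two detected movements as part of the same user instruction
        # only if they occur within the predefined time window of each other.
        return abs(first_movement_time - second_movement_time) <= window

    print(detected_within_window(12.0, 13.2))  # True: within 5 seconds
    print(detected_within_window(12.0, 40.0))  # False: outside the window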
In some embodiments, the one or more parameters comprise sensor parameters that are detected by the first and second sensors. The electronic device transmits the sensor parameters to the UAV. The movement comprises user movement of the hand (e.g., the first body portion) .
In some embodiments, after the electronic device determines one or more parameters associated with the user instruction to control the flight of the UAV, the electronic device generates a command in accordance with the one or more parameters. The electronic device then transmits to the UAV a wireless signal that includes the command. The UAV is configured to adjust the flight of the UAV in accordance with the one or more parameters.
In some embodiments, the electronic device is used in conjunction with another electronic device (e.g., a second electronic device, such as another controller device of the UAV, a head-mounted display, and/or a combination thereof etc. ) .
In some embodiments, after determining the one or more parameters, the electronic device generates a command in accordance with the one or more parameters. The electronic device transmits the command to a second electronic device, which in turn transmits the command to the UAV.
In some embodiments, the one or more parameters comprise sensor parameters (e.g., sensor values) that are detected (e.g., measured) by the first and second sensors. In some embodiments, the electronic device transmits the sensor parameters to another electronic device, which in turn transmits the sensor parameters to the UAV. The UAV generates a command based on the sensor parameters, and adjusts the flight of the UAV in accordance with the command.
In some embodiments, the one or more parameters comprise sensor parameters that are detected by the first and second sensors. The electronic device transmits the sensor parameters to a second electronic device (e.g., another controller device of the UAV, a head-mounted display, and/or a combination thereof etc. ) , which in turn transmits the sensor parameters to the UAV. The UAV generates a command based on the sensor parameters, and adjusts the flight of the UAV in accordance with the command. In some embodiments, the second electronic device transmits the sensor parameters of the electronic device as well as its own detected measurements. The UAV generates a command based on the sensor parameters and the detected measurements from the second electronic device.
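The relay path described in the preceding paragraphs (electronic device to second electronic device to UAV) might be organized as in the following sketch; the function names, the dictionary-based message format, and the example subtraction performed at the UAV are hypothetical assumptions.

    def forward_via_second_device(controller_params, second_device_measurements):
        # The second electronic device forwards the controller's sensor parameters
        # together with its own detected measurements.
        return {"controller": controller_params,
                "second_device": second_device_measurements}

    def uav_generate_command(message):
        # Placeholder: the UAV combines the forwarded values into a flight command.
        pitch = message["controller"]["pitch"] - message["second_device"]["pitch"]
        return {"adjust_pitch": pitch}

    message = forward_via_second_device({"pitch": 6.0}, {"pitch": 1.0})
    print(uav_generate_command(message))  # {'adjust_pitch': 5.0}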
Referring again to Figure 13, in some embodiments, the UAV adjusts (1324) the flight of the UAV by adjusting a pitch, roll, and/or yaw of the UAV.
In some embodiments, the user instruction to control a flight of the UAV comprises (1326) a first pitch value (e.g., pitch 1012), a first roll value (e.g., roll 1002), and/or a first yaw value (e.g., yaw 1022). The electronic device further determines (1328) a second pitch value (e.g., pitch 1014), a second roll value (e.g., roll 1004), and/or a second yaw value (e.g., yaw 1024) based on the second user movement from the second sensor.
In some embodiments, the electronic device transmits to the UAV (e.g., either directly or indirectly via an intermediate device) one or more parameters that comprise the first pitch value, the first roll value, the first yaw value, the second pitch value, the second roll value, and/or the second yaw value for processing at the UAV. In some embodiments, the electronic device processes the first pitch value, the first roll value, the first yaw value, the second pitch value, the second roll value, and/or the second yaw value on the electronic device (e.g., via the computation module 950). For example, the electronic device may compute a combination (e.g., a summation or a subtraction) or a weighted combination of one or more of: the first pitch value, the first roll value, the first yaw value, the second pitch value, the second roll value, and/or the second yaw value. The electronic device transmits the processed values to the UAV.
In some embodiments, the UAV adjusts a flight according to a combination of the first pitch value, the first roll value, and/or the first yaw value and the second pitch value, second roll value, and/or second yaw value. In some embodiments, the second pitch value, the second roll value, and/or the second yaw value can be used to adjust a corresponding pitch, roll, and/or yaw of a payload of the UAV (e.g., payload 110, Figure 2). In some embodiments, the payload 110 can include an imaging device (e.g., imaging device 214, Figure 2C). Therefore, by adjusting a corresponding pitch, roll, and/or yaw of the payload 110, the field of view of the imaging device (e.g., image sensor 216) is also modified (e.g., adjusted).
In some embodiments, the electronic device determines the one or more parameters by determining (1330) one or more of: an adjusted pitch value based on a combination (e.g., addition or subtraction) of the first pitch value and the second pitch value (e.g., Pitch_total = Pitch 1012 + Pitch 1014, or Pitch_total = Pitch 1012 – Pitch 1014); an adjusted roll value based on a combination (e.g., addition or subtraction) of the first roll value and the second roll value (e.g., Roll_total = Roll 1002 + Roll 1004, or Roll_total = Roll 1002 – Roll 1004); and an adjusted yaw value based on a combination (e.g., addition or subtraction) of the first yaw value and the second yaw value (e.g., Yaw_total = Yaw 1022 – Yaw 1024).
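As a concrete illustration of (1330), the adjusted values can be computed as simple sums or differences of the corresponding first-sensor and second-sensor values. In the sketch below, the numeric values are arbitrary examples and the function name is hypothetical.

    def adjusted_value(first_value, second_value, subtract=True):
        # Combine a first-sensor value with the corresponding second-sensor value
        # by addition or subtraction, as described in (1330).
        return first_value - second_value if subtract else first_value + second_value

    pitch_total = adjusted_value(6.0, 1.5)         # cf. Pitch_total = Pitch 1012 - Pitch 1014
    roll_total = adjusted_value(2.0, 0.5, False)   # cf. Roll_total = Roll 1002 + Roll 1004
    yaw_total = adjusted_value(3.0, 1.0)           # cf. Yaw_total = Yaw 1022 - Yaw 1024
    print(pitch_total, roll_total, yaw_total)      # 4.5 2.5 2.0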
In a first example, determining the one or more parameters includes determining an adjusted pitch value based on subtracting the second pitch value from the first pitch value (e.g., Pitch_total = Pitch 1012 – Pitch 1014). In the first example, determining the one or more parameters also includes determining the first roll value and the first yaw value (e.g., Roll_total = Roll 1002 and Yaw_total = Yaw 1022). In some embodiments, the first example depicts a first scenario whereby a user generally raises or lowers her arm naturally when controlling the up and down movement of the movable object 102. Accordingly, the first pitch value (e.g., pitch 1012) that is measured by the first sensor 840 may be larger than an intended pitch control value for the movable object 102 because it includes the effects of the user arm movement. In some embodiments, subtracting the second pitch value from the first pitch value (e.g., Pitch_total = Pitch 1012 – Pitch 1014) accounts for measurements (e.g., errors or inaccuracies) due to the user arm movement. Accordingly, the adjusted pitch value (e.g., Pitch_total) more accurately reflects the true pitch value that is intended for the movable object 102. In the scenario described in the first example, by taking into account user movements that are not intended for controlling the movable object 102 and removing these effects through an adjusted pitch value, the flight of the movable object 102 can be more accurately determined. As a result, the user experience is enhanced.
In a second example, determining the one or more parameters includes determining an adjusted pitch value by subtracting the second pitch value from the first pitch value (e.g., Pitch_total = Pitch 1012 – Pitch 1014). Determining the one or more parameters also includes determining an adjusted yaw value by subtracting the second yaw value from the first yaw value (e.g., Yaw_total = Yaw 1022 – Yaw 1024). Determining the one or more parameters also includes determining the second roll value (e.g., Roll_total = Roll 1004). In some embodiments, the second example depicts a second scenario whereby a user raises or lowers her arm naturally (e.g., unintentionally) when controlling an upward or downward movement (e.g., a pitch) of the movable object, and therefore the first pitch value 1012 may be over-amplified. Furthermore, the first yaw value (e.g., Yaw 1022) may be larger than the actual yaw value that is intended for the movable object 102 because it includes the effects of user body movement when the user is manipulating the remote controller (e.g., the first component 820).
In some embodiments, the electronic device determines the one or more parameters by determining (1332) a weighted combination (e.g., weighted sum, weighted aggregation) that  includes a plurality of the first pitch value, the first roll value, the first yaw value, the second pitch value, the second roll value, and the second yaw value.
In some embodiments, the weighted combination comprises (1334) a weighted combination of the first pitch value (e.g., pitch 1112) and the second pitch value (e.g., pitch 1114) .
In some embodiments, the weighted combination comprises (1336) a weighted combination of the first roll value (e.g., roll 1102) and the second roll value (e.g., roll 1104).
In some embodiments, the weighted combination comprises (1338) a weighted combination of the first yaw value (e.g., yaw 1022) and the second yaw value (e.g., yaw 1024) .
In some embodiments, prior to (1340) determining the weighted combination, the electronic device assigns (1342) respective first weights to the first pitch value, the first roll value, and/or the first yaw value. The electronic device also assigns (1346) respective second weights to the second pitch value, the second roll value, and/or the second yaw value. The weighted combination is further determined (1350) based on the respective assigned first weights and the respective assigned second weights.
In some embodiments, the electronic device 810 determines (e.g., assigns) relative weights (e.g., weights 946, Figure 9) in the weighted combination according to a level of proficiency of the user (e.g., familiarity of the user) in using the electronic device 810 to control the movable object 102. For example, in some embodiments, before a user starts to interact with the electronic device 810, the user may be asked (e.g., via the display 918) to input a level of proficiency of controlling the movable object 102. If the user indicates that she is relatively new to UAV technology and has little or no experience in using or controlling a movable object 102, the electronic device 810 can assign lower weights to values from the second sensor 850 and assign higher weights to the values from the first sensor 840. In one instance, the electronic device 810 can determine an adjusted yaw value to be Yaw_Total = 50% × Yaw 1022 (e.g., the electronic device assigns a weight of zero to the second yaw value 1024). In the same instance, the electronic device 810 may determine an adjusted pitch value as Pitch_total = 100% × Pitch 1012 – 10% × Pitch 1014. The adjusted pitch value comprises a subtraction of the weighted second pitch value from the weighted first pitch value because the wrist of the user may move in the same direction as the direction of the Pitch 1012, thus causing the pitch value to be amplified. In the same instance, the electronic device 810 may also determine an adjusted roll value as Roll_total = 80% × (80% × Roll 1002 + 10% × Roll 1004). In this instance, the hand movement (e.g., movement of the first component 820) in the roll direction can lead to corresponding movement of the wrist/forearm in the same roll direction. Accordingly, a smaller weight of 10% is assigned to the Roll 1004 value. In some embodiments, the electronic device 810 may assign a weight of zero to the second roll (e.g., Roll 1004) value.
In some circumstances, if a user indicates that she is an expert user of the movable object 102, the electronic device 810 may determine an adjusted yaw value and an adjusted pitch value that are largely based on the readings of the first sensor 840. For example, in some instances, the electronic device 810 can determine an adjusted yaw value to be Yaw_Total = 100% × Yaw 1022 and an adjusted pitch value to be Pitch_total = 100% × Pitch 1012 – 10% × Pitch 1014. The electronic device 810 can also determine an adjusted roll value that includes higher weights for the first and second roll values, e.g., Roll_total = 90% × Roll 1002 + 50% × Roll 1004.
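The proficiency-dependent weighting described in the two preceding paragraphs could be sketched as follows. The weight values simply mirror the numeric examples given above and are not prescribed; the table structure, function names, and example sensor values are assumptions.

    # Illustrative weight tables: (weight for first-sensor value, weight for second-sensor value).
    WEIGHTS = {
        "novice": {"yaw": (0.5, 0.0), "pitch": (1.0, -0.1), "roll": (0.8 * 0.8, 0.8 * 0.1)},
        "expert": {"yaw": (1.0, 0.0), "pitch": (1.0, -0.1), "roll": (0.9, 0.5)},
    }

    def weighted_combination(first_values, second_values, proficiency):
        # first_values/second_values are dicts of pitch, roll, and yaw from the two sensors.
        table = WEIGHTS[proficiency]
        return {axis: table[axis][0] * first_values[axis] + table[axis][1] * second_values[axis]
                for axis in ("pitch", "roll", "yaw")}

    first_values = {"pitch": 6.0, "roll": 2.0, "yaw": 3.0}
    second_values = {"pitch": 1.5, "roll": 0.5, "yaw": 1.0}
    print(weighted_combination(first_values, second_values, "novice"))
    print(weighted_combination(first_values, second_values, "expert"))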
In some embodiments, at least one of the first weights has (1344) a value of zero.
In some embodiments, at least one of the second weights has (1348) a value of zero.
In some embodiments, the electronic device includes (1352) a first component (e.g., first component 820, Figure 11) and a second component (e.g., second component 830, Figure 11) that is mechanically coupled to the first component via a link. For example, in Figure 11, the first component 820 comprises a main body of the electronic device 810. In some embodiments, the first component includes the input interface (e.g., input interface 910) . The first component 820 may also include input button (s) (e.g., button (s) 912) and/or a touchscreen interface (e.g., touchscreen interface 914) in some embodiments. In some embodiments, the second component 830 is attached to (e.g., worn by) a user. For example, the second component can be worn on the wrist or a forearm of the user.
In some embodiments, the link (e.g., link 1110, Figure 11) comprises (1354) a first end (e.g., first end 1112, Figure 11) that is rotatably coupled to the first component and a second end (e.g., second end 1114, Figure 11) that is rotatably coupled to the second component.
For example, as illustrated in Figure 11, in some embodiments, the first end 1112 is rotatably coupled to the first component 820 via a first rotation node 1142. The second end 1114 is rotatably coupled to the second component 830 via a second rotation node 1144. In some embodiments, the first rotational node 1142 and the second rotational node 1144 comprise rotational sensors that detect (e.g., sense and measure) rotation in a respective pitch, yaw, and/or roll direction (e.g., axis of rotation) .
In some embodiments, the first component includes the input interface.
In some embodiments, the electronic device further includes a third sensor. The first sensor 840 is positioned on the first component. The second sensor 850 (e.g., a rotational sensor, such as a pitch/yaw/roll sensor, or an IMU) is positioned on the second end of the link (e.g., at the second rotational node 1144) . The third sensor (e.g., a rotational sensor, such as a pitch/yaw/roll sensor) is positioned on the first end of the link (e.g., at the first rotational node 1142) .
In some embodiments, the second sensor (e.g., a rotational sensor, such as a pitch/yaw/roll sensor) is configured to measure respective rotation angles at the second end. For example, in some embodiments, the second sensor is a pitch/yaw/roll angle sensor and is configured to measure the respective rotation angles with respect to each axis of rotation at the second end. In some embodiments, the axis of rotation at the second end includes a pitch axis of rotation, a yaw axis of rotation, and/or a roll axis of rotation.
In some embodiments, the third sensor is configured to measure respective rotation angles at the first end. For example, in some embodiments, the third sensor is a pitch/yaw/roll angle sensor and is configured to measure the respective rotation angles with respect to each axis of rotation at the first end. In some embodiments, the axis of rotation at the first end includes a pitch axis of rotation, a yaw axis of rotation, and/or a roll axis of rotation.
In some embodiments, the respective rotation angles at the first end include two or more of: a first pitch angle, a first roll angle, and/or a first yaw angle. The respective rotation angles at the second end include two or more of: a second pitch angle, a second roll angle, and/or a second yaw angle.
In some embodiments, the electronic device determines the one or more parameters of a command by determining (1356) a combined rotation angle based on a combination (e.g., an addition, a subtraction, a weighted sum, a weighted subtraction etc. ) of one or more of: the first pitch angle and the second pitch angle; the first roll angle and the second roll angle; and the first yaw angle and the second yaw angle.
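A per-axis combination of the rotation angles measured at the two ends of the link, as in (1356), might look like the following sketch. The use of a plain sum, the angle units (degrees), and the example values are assumptions; a subtraction or weighted combination could be substituted.

    def combined_rotation_angles(first_end_angles, second_end_angles):
        # Combine the angles measured at the first end (e.g., at rotation node 1142)
        # with those measured at the second end (e.g., at rotation node 1144).
        return {axis: first_end_angles[axis] + second_end_angles[axis]
                for axis in ("pitch", "roll", "yaw")}

    first_end = {"pitch": 10.0, "roll": -2.0, "yaw": 4.0}
    second_end = {"pitch": 3.0, "roll": 1.0, "yaw": -1.0}
    print(combined_rotation_angles(first_end, second_end))  # {'pitch': 13.0, 'roll': -1.0, 'yaw': 3.0}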
Many features of the present invention can be performed in, using, or with the assistance of hardware, software, firmware, or combinations thereof. Consequently, features of the present invention may be implemented using a processing system. Exemplary processing systems (e.g., processor (s) 116, controller 210, controller 218, processor (s) 502, processor (s) 602, and/or processor (s) 902) include, without limitation, one or more general purpose microprocessors (for example, single or multi-core processors) , application-specific integrated circuits, application-specific instruction-set processors, field-programmable gate arrays, graphics processing units, physics processing units, digital signal processing units, coprocessors, network processing units, audio processing units, encryption processing units, and the like.
Features of the present invention can be implemented in, using, or with the assistance of a computer program product, such as a storage medium (media) or computer readable medium (media) having instructions stored thereon/in which can be used to program a processing system to perform any of the features presented herein. The storage medium (e.g., memory 118, 504, 604) can include, but is not limited to, any type of disk including floppy disks, optical discs, DVDs, CD-ROMs, microdrives, and magneto-optical disks, ROMs, RAMs, EPROMs, EEPROMs, DRAMs, VRAMs, DDR RAMs, flash memory devices, magnetic or optical cards, nanosystems (including molecular memory ICs), or any type of media or device suitable for storing instructions and/or data.
Stored on any one of the machine readable medium (media), features of the present invention can be incorporated in software and/or firmware for controlling the hardware of a processing system, and for enabling a processing system to interact with other mechanisms utilizing the results of the present invention. Such software or firmware may include, but is not limited to, application code, device drivers, operating systems and execution environments/containers.
Communication systems as referred to herein (e.g., communication systems 120, 510, 610) optionally communicate via wired and/or wireless communication connections. For example, communication systems optionally receive and send RF signals, also called electromagnetic signals. RF circuitry of the communication systems converts electrical signals to/from electromagnetic signals and communicates with communications networks and other communications devices via the electromagnetic signals. RF circuitry optionally includes well-known circuitry for performing these functions, including but not limited to an antenna system, an RF transceiver, one or more amplifiers, a tuner, one or more oscillators, a digital signal processor, a CODEC chipset, a subscriber identity module (SIM) card, memory, and so forth. Communication systems optionally communicate with networks, such as the Internet, also referred to as the World Wide Web (WWW), an intranet and/or a wireless network, such as a cellular telephone network, a wireless local area network (LAN) and/or a metropolitan area network (MAN), and other devices by wireless communication. Wireless communication connections optionally use any of a plurality of communications standards, protocols and technologies, including but not limited to Global System for Mobile Communications (GSM), Enhanced Data GSM Environment (EDGE), high-speed downlink packet access (HSDPA), high-speed uplink packet access (HSUPA), Evolution-Data Only (EV-DO), HSPA, HSPA+, Dual-Cell HSPA (DC-HSDPA), long term evolution (LTE), near field communication (NFC), wideband code division multiple access (W-CDMA), code division multiple access (CDMA), time division multiple access (TDMA), Bluetooth, Wireless Fidelity (Wi-Fi) (e.g., IEEE 802.11a, IEEE 802.11ac, IEEE 802.11ax, IEEE 802.11b, IEEE 802.11g and/or IEEE 802.11n), voice over Internet Protocol (VoIP), Wi-MAX, a protocol for e-mail (e.g., Internet message access protocol (IMAP) and/or post office protocol (POP)), instant messaging (e.g., extensible messaging and presence protocol (XMPP), Session Initiation Protocol for Instant Messaging and Presence Leveraging Extensions (SIMPLE), Instant Messaging and Presence Service (IMPS)), and/or Short Message Service (SMS), or any other suitable communication protocol, including communication protocols not yet developed as of the filing date of this document.
While various embodiments of the present invention have been described above, it should be understood that they have been presented by way of example, and not limitation. It will be apparent to persons skilled in the relevant art that various changes in form and detail can be made therein without departing from the spirit and scope of the invention.
The present invention has been described above with the aid of functional building blocks illustrating the performance of specified functions and relationships thereof. The boundaries of these functional building blocks have often been arbitrarily defined herein for the convenience of the description. Alternate boundaries can be defined so long as the specified functions and relationships thereof are appropriately performed. Any such alternate boundaries are thus within the scope and spirit of the invention.
The terminology used in the description of the various described embodiments herein is for the purpose of describing particular embodiments only and is not intended to be limiting. As used in the description of the various described embodiments and the appended claims, the singular forms “a, ” “an, ” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will also be understood that the term “and/or” as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items. It will be further understood that the terms “includes, ” “including, ” “comprises, ” and/or “comprising, ” when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
As used herein, the term “if” may be construed to mean “when” or “upon” or “in response to determining” or “in accordance with a determination” or “in response to detecting” that a stated condition precedent is true, depending on the context. Similarly, the phrase “if it is determined [that a stated condition precedent is true]” or “if [a stated condition precedent is true]” or “when [a stated condition precedent is true]” may be construed to mean “upon determining” or “in response to determining” or “in accordance with a determination” or “upon detecting” or “in response to detecting” that the stated condition precedent is true, depending on the context.
The foregoing description of the present invention has been provided for the purposes of illustration and description. It is not intended to be exhaustive or to limit the invention to the precise forms disclosed. The breadth and scope of the present invention should not be limited by any of the above-described exemplary embodiments. Many modifications and variations will be apparent to the practitioner skilled in the art. The modifications and variations include any relevant combination of the disclosed features. The embodiments were chosen and described in order to best explain the principles of the invention and its practical application, thereby enabling others skilled in the art to understand the invention for various embodiments and with various modifications that are suited to the particular use contemplated. It is intended that the scope of the invention be defined by the following claims and their equivalents.

Claims (22)

  1. A method performed at an electronic device that is communicatively connected with an unmanned aerial vehicle (UAV) , the electronic device including an input interface, a first sensor, a second sensor, one or more processors, and memory, the method comprising:
    detecting a first user movement from the first sensor attached to a first body portion of a user and a second user movement from the second sensor attached to a second body portion of the user, wherein the first user movement represents a user instruction to control a flight of the UAV and the second body portion is connected to the first body portion;
    determining one or more parameters associated with the user instruction to control the flight of the UAV based on an interaction between the first user movement and the second user movement; and
    transmitting to the UAV a wireless signal that is based on the one or more parameters, wherein the UAV is configured to adjust the flight of the UAV in accordance with the one or more parameters.
  2. The method of claim 1, wherein adjusting the flight of the UAV comprises adjusting a pitch, roll, and/or yaw of the UAV.
  3. The method of claim 1, wherein the electronic device includes a first component and a second component that is communicatively connected with the first component.
  4. The method of claim 3, wherein:
    the first component comprises a handheld component; and
    the second component comprises a wearable component that is worn on a wrist or a forearm or an arm of the user.
  5. The method of claim 1, wherein the user instruction to control a flight of the UAV comprises a first pitch value, a first roll value, and/or a first yaw value;
    the method further comprising:
    determining a second pitch value, a second roll value, and/or a second yaw value based on the second user movement from the second sensor.
  6. The method of claim 5, wherein determining the one or more parameters comprises determining one or more of:
    an adjusted pitch value based on a combination of the first pitch value and the second pitch value;
    an adjusted roll value based on a combination of the first roll value and the second roll value; and
    an adjusted yaw value based on a combination of the first yaw value and the second yaw value.
  7. The method of claim 5, wherein determining the one or more parameters comprises determining a weighted combination that includes a plurality of: the first pitch value, the first roll value, the first yaw value, the second pitch value, the second roll value, and the second yaw value.
  8. The method of claim 7, wherein the weighted combination comprises a weighted combination of the first pitch value and the second pitch value.
  9. The method of claim 7, wherein the weighted combination comprises a weighted combination of the first roll value and the second roll value.
  10. The method of claim 7, wherein the weighted combination comprises a weighted combination of the first yaw value and the second yaw value.
  11. The method of claim 7, further comprising:
    prior to determining the weighted combination:
    assigning respective first weights to the first pitch value, the first roll value, and/or the first yaw value; and
    assigning respective second weights to the second pitch value, the second roll value, and/or the second yaw value,
    wherein the weighted combination is further determined based on the respective assigned first weights and the respective assigned second weights.
  12. The method of claim 11, wherein at least one of the first weights has a value of zero.
  13. The method of claim 11, wherein at least one of the second weights has a value of zero.
  14. The method of claim 1, wherein:
    the electronic device includes a first component and a second component that is mechanically coupled to the first component via a link; and
    the link comprises a first end that is rotatably coupled to the first component and a second end that is rotatably coupled to the second component.
  15. The method of claim 14, wherein the first component includes the input interface.
  16. The method of claim 14, wherein:
    the electronic device further includes a third sensor;
    the first sensor is positioned on the first component;
    the second sensor is positioned on the second end of the link; and
    the third sensor is positioned on the first end of the link.
  17. The method of claim 16, wherein:
    the second sensor is configured to measure respective rotation angles at the second end; and
    the third sensor is configured to measure respective rotation angles at the first end.
  18. The method of claim 17, wherein:
    the respective rotation angles at the first end include two or more of: a first pitch angle, a first roll angle, and/or a first yaw angle; and
    the respective rotation angles at the second end include two or more of: a second pitch angle, a second roll angle, and/or a second yaw angle.
  19. The method of claim 18, wherein determining one or more parameters of a command comprises determining a combined rotation angle based on a combination of one or more of:
    the first pitch angle and the second pitch angle;
    the first roll angle and the second roll angle; and
    the first yaw angle and the second yaw angle.
  20. The method of claim 1, wherein the first sensor is an inertial measurement unit sensor.
  21. An electronic device, comprising:
    one or more processors; and
    memory coupled to the one or more processors, the memory storing one or more programs configured to be executed by the one or more processors, the one or more programs including instructions for performing the method of any of claims 1 to 20.
  22. A non-transitory computer-readable storage medium storing one or more programs, the one or more programs comprising instructions, which when executed by an electronic device, cause the electronic device to perform operations comprising the method of any of claims 1 to 20.