WO2023097263A1 - Enhanced unmanned aerial vehicle flight along computed splines
- Publication number: WO2023097263A1 (PCT/US2022/080412)
- Authority: WO (WIPO, PCT)
- Prior art keywords
- spline
- computed
- drone
- unmanned aerial
- aerial vehicle
Classifications
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/0011—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement
- G05D1/0038—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement by providing the operator with simple or augmented images from one or more cameras located onboard the vehicle, e.g. tele-operation
Definitions
- Various implementations of the present technology relate to unmanned aerial vehicles (UAVs) and, in particular, to enhanced UAV flight along computed splines.
- Unmanned aerial vehicles are commonly used to capture video, images, or other data from a vantage point or location that might otherwise be difficult or cumbersome to reach. Drones are used for various purposes, such as for recreation, scientific exploration, military operations, intelligence gathering, and commercial uses. UAVs for commercial and recreational use typically have multiple rotors so that they are agile and rapidly responsive to flight commands. For example, a popular configuration known as a “quadcopter” comprises four rotors for flight.
- a UAV includes a flight control subsystem and an electromechanical subsystem.
- the flight control subsystem records keyframes during flight and computes a spline based on the keyframes.
- the flight control subsystem then saves the computed spline for playback, when the UAV automatically flies in accordance with the computed spline.
- the flight control subsystem is capable of receiving user input and responsively modifying the computed spline based at least on the user input, resulting in a modified version of the computed spline.
- the flight control subsystem may save the modified version of the computed spline for later playback.
- the UAV may be capable of uploading (or downloading) the modified version of the computed spline to a remote storage location.
- Examples of the user input include one or more of changes to one or more components of the computed spline, such as position, direction, speed, and orientation of the unmanned aerial vehicle along the computed spline.
- the components may also include a camera focal length and a camera orientation with respect to the unmanned aerial vehicle.
- Other examples of the user input include snapping the unmanned aerial vehicle directly to a new position on the computed spline out-of-turn with respect to a next position on the computed spline, reversing direction along the computed spline relative to present direction along the computed spline, and hovering at a point on the computed spline.
- Figure 1 illustrates an operating architecture of an unmanned aerial vehicle in an implementation.
- Figure 2 illustrates a method of operation of a UAV to create a spline in an implementation.
- Figure 3 illustrates a detailed diagram of UAV systems in an implementation.
- Figure 4A illustrates an operational environment and exemplary scenario.
- Figure 4B illustrates an operational environment and exemplary scenario.
- Figure 5A illustrates an exemplary UAV flight path.
- Figure 5B illustrates an exemplary UAV computed flight path circling a building.
- Figure 6A illustrates an exemplary overhead view of flying a UAV around a house.
- Figure 6B illustrates an exemplary overhead view of a computed spline flight path.
- Figure 6C illustrates an exemplary overhead view of a computed spline flight path with a modification to flight path and drone orientation.
- Figure 6D illustrates an exemplary close-up overhead view of a modification to flight path and drone orientation during flight along computed flight path.
- Figure 6E illustrates an exemplary display on a remote control with first-person view as the pilot issues flight commands during flight along computed flight path.
- Figure 7 illustrates an exemplary workflow when a pilot issues flight commands as a drone flies by autopilot control.
- Figure 8 illustrates an exemplary overhead view of effect of pilot commands on drone flight by autopilot along computed flight path.
- Figure 9 illustrates an exemplary overhead view of effect of pilot commands on drone flight by autopilot along computed flight path.
- Figure 10 illustrates an exemplary overhead view of the computation of new position points for a modification to a computed flight path during flight.
- Figure 11 illustrates a workflow for controlling an aircraft in an implementation.
- Figures 12A-12J illustrate multiple views of the graphical user interface of an autonomous flight control application on a drone remote control in an implementation.
- Figures 13A-13D illustrate the user interface of an autonomous flight control application on a drone remote control in Key Frame Mode in an implementation.
- Figures 14A-14D illustrate the graphical user interface of an autonomous flight control application in an implementation.
- Figure 15 is a sequence of images illustrating the user interface of an autonomous flight control application in an implementation during playback.
- Various implementations disclosed herein include unmanned aerial vehicles that include a flight control subsystem and an electromechanical subsystem.
- the flight control subsystem is capable of recording keyframes during flight and computing a spline based on the keyframes.
- the flight control subsystem saves the computed spline for playback, at which time the flight control subsystem directs the electromechanical subsystem to fly the UAV in accordance with the computed spline.
- the flight control subsystem is also capable of receiving user input and modifying the computed spline based on the user input.
- the modified version of the computed spline may itself be saved for later playback.
- the user input may cause one or more changes to the computed spline, such as a change in position, direction, speed, and/or orientation of the unmanned aerial vehicle along the computed spline.
- Other changes include modifications to a focal length of a camera on the UAV, an orientation of the camera, or changes to any other peripheral instrument on the UAV.
- Still other examples of the user input include snapping the unmanned aerial vehicle directly to a new position on the computed spline out-of-turn with respect to a next position on the computed spline, reversing direction along the computed spline relative to present direction along the computed spline, and hovering at a point on the computed spline.
- a drone pilot operating a controller device identifies a set of discrete spatial locations called keyframes. Resources onboard the drone and/or the controller (or distinct from either system) compute a spline based on the keyframes. The computed spline may then be “played back” by the drone, meaning that the flight control subsystem onboard the drone commands its flight based on the computed spline.
- the pilot can modify the computed spline in flight, while the drone is flying the spline, allowing the pilot to focus on and control one or more aspects of drone operation without having to actively pilot the drone.
- the pilot may, for example, alter the position, direction, speed, and/or orientation of the drone as it flies the computed spline. These changes may cause the drone to depart from the computed spline, or change direction as it travels along the spline.
- the pilot may cause the drone to speed up or slow down along segments of the spline or at points along the spline.
- the pilot may also modify the operation of a camera, such as zooming in or out or making adjustments to the exposure as the drone flies the spline.
- the pilot may cause the drone to reverse course along the computed spline or to “snap-to” a new position on the spline without traveling along the spline.
- the pilot may also cause the drone to stop and hover at a point on the computed spline, for example, to add a keyframe at that location. Any or all of these modifications may be saved for subsequent use with the spline or as a new version of the spline.
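- As a rough illustration only, the playback controls described above (reversing course, snapping to a point, and hovering) might be tracked in software along the lines of the following sketch; the class and field names are assumptions, not the patent's implementation.

```python
from dataclasses import dataclass

@dataclass
class SplinePlayback:
    """Hypothetical playback state for a drone flying a computed spline.

    Progress along the spline is tracked by the parameter u in [0, 1].
    """
    u: float = 0.0          # current position along the spline
    direction: int = 1      # +1 forward, -1 reverse
    hovering: bool = False  # True while holding position at a point

    def step(self, du: float) -> None:
        # Advance along the spline unless the drone is hovering.
        if not self.hovering:
            self.u = min(max(self.u + self.direction * du, 0.0), 1.0)

    def reverse(self) -> None:
        # Reverse course relative to the present direction of travel.
        self.direction *= -1

    def snap_to(self, u_target: float) -> None:
        # Jump directly to a new position on the spline, out-of-turn.
        self.u = min(max(u_target, 0.0), 1.0)

    def hover(self) -> None:
        # Stop and hold position, e.g., to add a keyframe at this point.
        self.hovering = True
```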
- the pilot may control the orientation of the drone along the spline and/or the gimbal position as the drone travels along the computed spline, i.e., “Free Look Mode.” In this manner, the pilot can focus on camera angles and positions without having to actively pilot the drone.
- pilot input integrated into programmed operation allows greater creative buy-in to the video product by allowing the pilot to focus on artistic aspects and nuances of the recording without having to actively navigate the drone’s flight.
- the ability to integrate pilot or user input into the operation of a drone as it is flying under the authority of its autopilot can be exercised in a number of ways.
- a UAV flies a programmed or predetermined flight plan.
- the programmed or predetermined flight plan may be a computed spline flight path where a trajectory is defined by a set of discrete locations through which the drone will pass, or it may be a flight path that was recorded during an earlier flight. It may also be a flight path that was programmed manually and uploaded to the drone.
- the programmed flight operation of the drone includes such parameters as the position, the orientation and/or velocity of the drone as it flies along the predetermined flight path.
- Drone operation may include subject tracking during programmed operation.
- a drone may be deployed to record a cycling race by flying a programmed route along the racecourse from an overhead perspective while tracking the progress of a particular cyclist.
- Drone operation may also include object detection and avoidance so that the drone does not collide with a tree while flying over and recording the cycling race.
- the pilot can focus on a particular aspect of drone operation, such as the view captured by an onboard video camera without having to actively fly the drone at the same time.
- once the pilot ceases to control that aspect of drone operation, the drone will smoothly return to and resume its programmed operation.
- the drone or UAV is docked prior to flight.
- the dock provides the drone with localization information by which the drone can ascertain its position and orientation relative to the dock for navigating during flight.
- Positional information for navigation may be specified in three-dimensional coordinates, such as a set of coordinates relative to the dock or coordinates detected by an onboard GPS sensor. Positional information for navigation may also incorporate or rely on visual tracking using one or more onboard navigation cameras.
- Drone orientation information may reference the drone’s pitch, roll, and yaw angles.
- the pilot communicates wirelessly with the drone’s flight control system using a remote control.
- the remote control may be a dedicated device or an app on a computing device such as laptop computer, tablet, or smartphone.
- the user interface or UI on a UAV remote control may include a display screen and one or more input mechanisms for controlling the drone, such as buttons, rocker switches, sliders, toggles, and the like.
- One implementation of the user interface on drone remote control includes a touch-enabled display screen, i.e., a touchscreen, which displays functional graphical object representations of control input mechanisms.
- Wireless transmissions between the UAV and the remote control may be carried over a WiFi connection, a Bluetooth® connection, or any suitable wireless connection.
- the UAV may also communicate wirelessly with other devices such as a computer which can receive, view, record, and store information transmitted from the drone.
- a predetermined flight includes a set of keyframes, each of which may be defined according to the visual input of one or more navigation cameras in a process of dead reckoning or visual tracking. Keyframes defined according to visual tracking are particularly useful in indoor or outdoor environments where global positioning system (GPS) data or other inertial navigation data is partially or totally unavailable, or with drones which lack inertial or satellite navigation capability. Keyframe locations may also be defined in three-dimensional coordinates relative to the drone dock location or from satellite or inertial navigation data.
- a keyframe may contain additional information about the drone orientation, drone speed, or onboard sensor activity.
- a drone may be programmed to pause at a keyframe and pivot horizontally to capture a panorama shot as the camera zoom lens moves from a telephoto to a wide-angle focal length.
- the drone’s path from one keyframe location to the next is a computed function such as a cubic spline interpolant or “spline.”
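- The patent does not prescribe a spline library or parameterization, but purely as a sketch, a flight path through keyframe positions could be computed with a cubic spline interpolant as follows, using SciPy; the keyframe coordinates are placeholders.

```python
import numpy as np
from scipy.interpolate import CubicSpline

# Placeholder keyframe positions in three-dimensional coordinates
# (e.g., meters relative to the dock).
keyframes = np.array([
    [0.0, 0.0, 2.0],
    [5.0, 3.0, 4.0],
    [8.0, 9.0, 3.5],
    [2.0, 12.0, 5.0],
])

# Parameterize by cumulative distance between keyframes so that parameter
# spacing roughly reflects actual flight distance.
dists = np.linalg.norm(np.diff(keyframes, axis=0), axis=1)
t = np.concatenate([[0.0], np.cumsum(dists)])

# Fit one cubic polynomial piece per axis between each pair of keyframes;
# "natural" end conditions keep curvature low at the endpoints.
path = CubicSpline(t, keyframes, bc_type="natural")

# Sample the smooth flight path at regular intervals for the autopilot.
samples = path(np.linspace(t[0], t[-1], 200))
```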
- a predetermined flight plan may be a previously recorded drone flight that was saved to the drone’s persistent storage (e.g., nonvolatile memory), or to a data storage device in wireless communication with the drone.
- the prerecorded flight plan may be uploaded to other drones for reuse.
- a flight plan for monitoring the perimeter of a secured facility can be recorded and saved for periodic reuse or when the drone is replaced by a back-up drone.
- a prerecorded flight path may comprise an entire flight from launch from the dock to return to the dock, or it may comprise a subset of the drone’s flight, such as the spline between two keyframes, or it may even comprise drone operation (such as a sweeping video shot) at a single location.
- the predetermined flight path may be a computer program that is uploaded to the drone.
- a flight plan may be programmed based on map or topographical data.
- the pilot can record aspects of drone operation such as recording the trajectory of a drone, recording the camera view of the drone, pausing the recording, deleting the recording, and saving the recording.
- the recorded operations or recorded components of the flight may be saved to onboard nonvolatile memory, or they may be transmitted to a device in communication with the drone, such as to the remote control, to a laptop computer receiving transmissions from the drone, or to cloud data storage.
- the display screen or touchscreen of the user interface displays a view transmitted from a forward-facing camera on the drone, known as the first-person view, in real time.
- An augmented reality (AR) graphic representing a predetermined flight path is superimposed on the forward-facing camera view in the form of a translucent curve overlaying the camera view and, optionally, in a color that is highly distinctive from the background.
- the AR representation may indicate distance from the drone along the flight path by varying the width of the curve; for example, the curve narrows with distance from the drone.
- the AR representation continually updates as the drone flies the predetermined flight path.
- An additional AR graphic may show keyframes or waypoints identified by a distinctive shape and color such as a diamond on the curve representation of the predetermined flight path.
- the pilot may add, edit, and delete keyframes on the predetermined flight path via the user interface on the remote control.
- the user interface displays an AR graphic in the form of a translucent frame over the camera view which frames the camera shot.
- a UI touchscreen may display virtual buttons for adding, editing, and deleting keyframes.
- Keyframes may record and store such information as drone position or location, drone orientation, and information concerning the operation of onboard sensors such as cameras.
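- As an illustration only, a keyframe record of the kind described above might be represented as follows; the field names and types are assumptions rather than the patent's data format.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Keyframe:
    """Hypothetical keyframe record combining drone position, drone
    orientation, and onboard sensor settings."""
    position: tuple[float, float, float]     # coordinates relative to the dock, or GPS
    orientation: tuple[float, float, float]  # pitch, roll, yaw angles
    speed: Optional[float] = None            # drone speed at this point, if recorded
    gimbal_angle: Optional[float] = None     # camera angle relative to level flight
    focal_length: Optional[float] = None     # camera zoom setting, if recorded
```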
- the AR representation of a predetermined flight plan defined by keyframes may be updated according to the most recent set of keyframes.
- the user interface on the remote control displays a linear playback track or timeline representation of a predetermined flight plan.
- the linear timeline representation may include keyframes identified by a distinctive shape, such as a diamond, on the timeline. Distances along the timeline may be proportional to actual flight distances.
- the timeline may indicate the drone’s progress along the flight path using one color to show the completed portion and a second color to show the portion remaining; optionally, an arrow or other symbol may travel along the timeline as the drone flies to show in real time the drone’s travel along the flight path.
- the pilot may issue commands to the drone using the flight path timeline.
- the pilot may command the drone to reverse direction on the flight path or to pause at a point on the flight path by touching a particular point or location on the displayed timeline.
- the pilot may also command the drone to snap to (that is, immediately fly to) a point on the spline either by traveling along the spline or by flying directly to the indicated point.
- the pilot may command the drone to jump to a point between two keyframes and add a keyframe at that location.
- the UI screen also shows drone speed at points along the timeline.
- the UI may include virtual or physical control mechanisms such as rocker switches or sliders to adjust the drone speed along a segment of the flight path or at a point on the flight path.
- the pilot may command the drone to pause (i.e., hover) at various keyframes to take a prolonged still view or sweeping view from those vantage points.
- a virtual slider controlling drone speed is displayed in multiple colors to show multiple zones of dynamic feasibility.
- green may indicate the range of speeds which are dynamically feasible for the drone to fly; yellow may indicate the range of speeds pushing the operating envelope of the drone; and red may indicate the range of speeds which are not dynamically feasible for the drone to fly.
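- A minimal sketch of mapping a requested speed to one of these slider zones follows; the zone limits are placeholders, since the patent gives no numeric boundaries.

```python
def speed_zone(speed: float, feasible_max: float, envelope_max: float) -> str:
    """Classify a requested speed for the slider display. feasible_max and
    envelope_max are placeholder limits that would come from the drone's
    dynamic model."""
    if speed <= feasible_max:
        return "green"   # dynamically feasible
    if speed <= envelope_max:
        return "yellow"  # pushing the operating envelope
    return "red"         # not dynamically feasible
```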
- a pilot may retain aspects of operational control of a drone as the drone flies a predetermined flight plan.
- the pilot may command the drone to re-fly the recorded flight path while manually controlling the camera orientation.
- Camera orientation can be controlled by changing the drone’s pitch, roll, and/or yaw angles.
- the pilot may issue flight or operational commands via the remote control which cause the drone to deviate slightly from the flight path or to change the drone’s orientation as it flies. For example, the pilot may nudge the drone’s orientation to turn westward for several seconds as the drone flies a predetermined path heading north.
- the drone receives and integrates real-time inputs into its flight operations corresponding to the predetermined flight path.
- when the pilot’s real-time inputs cease, the drone effects an automatic return to the predetermined flight path.
- the real-time inputs are smoothed or dampened resulting in an attenuated adjustment to the drone’s flight.
- the return to the predetermined operation is similarly smoothed or dampened.
- the pilot may activate a subject tracking capability by which the drone maintains its orientation toward a subject so that the subject is always in view of the drone camera.
- an object avoidance function may cause the drone to deviate from its programmed flight path if the flight path intersects with an obstacle.
- the ability to manually control one or more aspects of drone operation (e.g., drone flight dynamics, drone orientation during flight, and onboard camera operation) while the drone navigates a predetermined flight path may give the pilot or videographer piloting the drone a greater sense of creative ownership of the video recording because it will not be a strictly programmed or mechanical operation.
- deviations from or adjustments to a predetermined flight plan made as the drone is flying the flight path may be saved for later reuse.
- the adjustments may be saved by themselves (to be added to a predetermined flight path), or the flight path and the adjustment may be saved together as an entirely new flight path.
- multiple adjustments to a particular predetermined flight path may be layered onto the flight path enabling the ability to create flight plans of increasing complexity or variation. For example, a flight path may be re-flown multiple times with a different camera orientation operation each time to compare and contrast a variety of perspectives.
- the drone or UAV may be docked at a location remote from the pilot. Pilots typically fly drones by maintaining line of sight to the drone in accordance with FAA rules governing drone flight. However, in certain circumstances the pilot may navigate the drone relying on the drone’s first-person view, that is, by seeing what the drone camera sees, without having line of sight to the drone. This manner of flying a drone is generally only permissible in certain limited situations, such as indoor flight in a large warehouse or stadium.
- the autopilot of the drone receives and integrates a number of internal and external operational inputs governing the flight of the drone in order to issue a command to the drone microprocessor. These commands are received by the microprocessor as if they had been issued by the (human) pilot and as such are issued by the autopilot as ostensible joystick commands.
- the autopilot integrates the computed or programmed flight path of the drone with sensor data relevant to drone operation, such as wind speed and direction data.
- the autopilot may also receive joystick input when the pilot issues a command via the joystick on the remote control.
- Joystick input is interpreted according to the particular functionality assigned to the joystick during autopilot operation.
- the joystick may be used to change the pitch or yaw of the drone during programmed operation to change the view of the onboard camera.
- a governing function may be applied to the joystick input which can dampen or limit the input so that the drone does not exceed an operational envelope for its flight operation.
- the drone autopilot may also receive inputs from a collision avoidance system or from a subject tracking system. Based on input from these various sources, the autopilot computes and issues ostensible joystick commands to the drone microprocessor. In response to receiving an ostensible joystick command from the autopilot, the microprocessor transmits a flight command to the drone electromechanical propulsion system causing the drone to fly according to the autopilot’s command.
- the drone pilot may command the drone to change its orientation during flight to obtain a view of the crowd of spectators along the race route or of a particularly notable vista in the distance.
- the pilot may adjust the operation of an onboard camera, such as by zooming out for a wide-angle shot of a distant mountain range or zooming in for a close-up of a cyclist.
- when the pilot releases the joystick, the autopilot will receive input from the joystick corresponding to a return to its neutral position, which will in turn effect a smooth return to the programmed or computed flight plan.
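- One way to picture the composition of an ostensible joystick command is sketched below: the spline-following command plus a damped contribution from the pilot's stick, which reduces to the pure spline command once the stick returns to neutral. This is an assumption-laden sketch, not the patent's algorithm.

```python
def ostensible_joystick_command(spline_cmd, pilot_input, gain=0.3):
    """Compose a synthetic stick command from the autopilot's spline-following
    command and the pilot's input. Inputs are (roll, pitch, yaw, throttle)
    tuples in [-1, 1], mimicking stick deflections; gain is a placeholder
    governing factor limiting the pilot's influence."""
    return tuple(max(-1.0, min(1.0, s + gain * p))
                 for s, p in zip(spline_cmd, pilot_input))

# With the stick at neutral, pilot_input is (0, 0, 0, 0) and the result is
# exactly the spline-following command, effecting a smooth return.
```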
- FIG. 1 illustrates an unmanned aerial vehicle (UAV 101) and its components, represented by operational architecture 128.
- Operational architecture 128 broadly includes a flight controller subsystem 124, an electromechanical subsystem 126, external operational inputs 120, and internal operational inputs 122.
- Flight controller subsystem 124 may include a circuit board housing one or more microprocessors, also known as a flight controller, that controls various aspects of drone operation.
- Electromechanical subsystem 126 may include an electronic speed controller unit and various rotors, power supplies, and the like.
- the external operational inputs can include inputs received from a remote control, typically operated by a human pilot, and sensor data measuring environmental conditions affecting UAV operation.
- Internal operational inputs can include programmed or computed flight or operation plans which direct drone position, drone orientation, or sensor operation in flight, and other information relating to the particular use or capabilities of the UAV, such as map or topographical data.
- FIG. 2 illustrates process 200 implemented by one or more components of flight control subsystem 124 of UAV 101.
- Process 200 is implemented in program instructions that, when executed by the one or more hardware and/or firmware elements of flight control subsystem 124, direct it to operate as follows.
- flight control subsystem 124 of UAV 101 records one or more keyframes (step 210).
- the pilot of UAV 101 may direct UAV 101 to record the keyframes based on the first-person view the pilot sees on remote control 130.
- Recorded keyframe data may include parameters such as the physical location of the drone based on visual tracking or on three-dimensional coordinates, and the orientation of the drone at that location.
- Keyframe data may also include data relating to onboard camera or sensor operation at the keyframe location.
- Flight control subsystem 124 of UAV 101 computes a flight path or computed spline which connects the keyframes (step 220).
- the flight path may be computed by the onboard microprocessor using a set of discrete location points or waypoints identified by the pilot during the present or a prior flight.
- the flight control subsystem of UAV 101 saves the computed spline for subsequent use (step 230).
- the computed spline may be re-flown by UAV 101 or, for example, by a back-up drone while UAV 101 is being recharged.
- the computed spline may be saved locally, such as in onboard persistent memory or data storage. Also, it may be saved remotely, as in the data storage of a device in communication with the drone such as the remote control or a laptop computer receiving transmissions from the drone. This may also include, for example, remote cloud data storage.
- a pilot may make modifications to the flight, for example, to make incremental improvements to the drone’s operation, to make temporary adjustments based on unforeseen conditions, or to explore different ways of operating the drone. These modifications may be similarly saved.
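- The patent describes saving splines and modifications to onboard storage, a paired device, or cloud storage but does not specify a format; purely as a sketch, the defining data could be serialized locally like this.

```python
import json

def save_spline(filename, keyframes, modifications=None):
    """Illustrative local save of a computed spline's defining data.
    keyframes is a list of keyframe dicts; modifications is an optional list
    of layered adjustments recorded during flight. JSON is an assumption made
    for this sketch only."""
    record = {
        "keyframes": keyframes,
        "modifications": modifications or [],
    }
    with open(filename, "w") as f:
        json.dump(record, f, indent=2)
```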
- FIG. 3 shows an exemplary systems architecture 300 of quadcopter 401 of Figure 4.
- Systems architecture 300 includes a flight control subsystem 391 and an electromechanical subsystem 392.
- Flight control subsystem 391 includes an autopilot function (represented by autopilot 328), a flight controller 326, an inertial measurement unit 302, sensors 304, a transmitter 306, a receiver 308, and a memory card port 310.
- Electromechanical subsystem 392 includes an electronic speed controller 312, and rotors 314. It may be appreciated that both flight control subsystem 391 and electromechanical subsystem 392 may include other elements in addition to (or in place of) those disclosed herein, which are illustrated for exemplary purposes.
- Systems architecture 300 also includes operational inputs 393.
- Operational inputs 393 include joystick data 322 supplied by a remote-control device 318, as well as governing factors 330 and a computed flight path 332.
- Inertial measurement unit 302 includes one or more sensors such as a gyroscope and an accelerometer which provide movement and orientation data to the flight controller subsystem.
- the flight controller subsystem may also connect to or contain other sensors 304 such as video cameras, Global Positioning System (GPS) sensors, magnetometers, or barometers.
- UAVs also carry equipment for wireless communication, such as antennas, video transmitter 306, and radio receiver 308, which enable communication with remote control 340 by which a human pilot sends commands such as flight commands or commands relating to onboard sensor operation.
- the remote control may be a dedicated device, or it may be an application on a mobile computing device such as a smart phone, tablet or laptop computer capable of wireless communication with UAV 401.
- Wireless communication between the remote control and the UAV may be carried over a WiFi network or Bluetooth® link.
- the flight controller subsystem may also connect to onboard persistent or nonvolatile memory or memory card port 310 for recording flight and sensor operations and data.
- electronic speed controller 312 is connected to the flight controller subsystem and controls the operation of rotors 314 according to flight commands 316 received from the microprocessor on the flight controller subsystem.
- Remote control 340 for drone 401 contains the wireless communication hardware for communicating with drone 401 as well as a throttle device 320 (for example, a physical or virtual joystick) for manually controlling the flight (i.e., speed and direction) of drone 401.
- remote control 340 will transmit joystick data 322 to UAV 401.
- remote-control devices for drones typically have display screen 324 to display the perspective of an onboard camera, referred to as the first-person view.
- First-person view capability enables the pilot to find and capture views from remote or difficult-to-access vantage points.
- Exemplary operational environment 400 of Figure 4A illustrates the process of creating a computed spline flight path.
- Pilot 404 has initiated flight of drone 401 with the drone launching from drone docking device 402.
- Drone dock 402 gives drone 401 localization information at the start of a flight so that drone 401 can ascertain its position relative to dock 402.
- Drone flight may be manually controlled by pilot 404 using remote control 403, which is in wireless communication with drone 401.
- the pilot may use joystick 320 to turn or accelerate drone 401.
- Remote control 403 transmits pilot 404’s input received from joystick 320 to drone 401 via onboard receiver 308 coupled to flight controller 326.
- Flight controller 326 translates the pilot’s input into flight commands 316 issued to electronic speed controller 312, which in turn throttles rotors 314 accordingly.
- drone 401 is piloted by pilot 404 along an arbitrary route 405 (event 1).
- in event 2, pilot 404 identifies a location at point A to be saved for the spline.
- Pilot 404, using remote control 403, adds a keyframe at the location as the drone flies.
- Keyframe A along with flight and/or operational data associated with location A are saved in event 3.
- Keyframe data that is stored at event 3 includes location coordinates (for example, GPS coordinates or coordinates relative to dock 402), and may also include drone 401’s speed at location A, orientation at location A, as well as data concerning the operation of the onboard camera.
- Keyframe data may be saved in data storage on remote control 403, or it may be saved in data storage onboard drone 401.
- Pilot 404 adds another keyframe (event 4).
- Keyframe B is saved in event 5 in a manner similar to event 3.
- in event 6, spline 410 is computed which connects keyframes A and B.
- the spline computation may be carried out in an onboard processor of drone 401 or in a processor of remote control 403 which is then transmitted to drone 401.
- the spline computation is carried out by a processor of the onboard flight control subsystem of drone 401.
- FIG. 4B illustrates drone 401 of operational environment 400 in flight subsequent to the recording of keyframes “A” and “B” and the computation of spline 410.
- Keyframes “A” and “B” and spline 410 may have been saved to the onboard data storage of drone 401, or the keyframes and spline may be uploaded to drone 401 before or during the current flight from a remote device storing the information, such as remote control 403.
- in event 1, pilot 404 commands drone 401 to play back spline 410.
- in event 2, drone 401 flies spline 410.
- As drone 401 flies spline 410 it is operating under the control of its onboard autopilot which is part of drone 401’s flight control subsystem.
- pilot 404 may issue flight or operational commands to alter its operations under the command of the autopilot. For example, pilot 404 may command drone 401 to make a departure and return to spline 410; to speed up, slow down, or stop and hover at a location on spline 410; or to reverse direction along spline 410. Pilot 404 may command drone 401 to face north as it flies, rather than facing forward along the spline. Pilot 404 may add additional keyframes to spline 410. Pilot 404 may operate the drone camera as the autopilot navigates drone 401 along spline 410.
- UAVs can be commanded to fly predetermined flight paths.
- a flight path may be defined by discrete sets of position data (and optionally velocity data) called waypoints.
- Waypoints may be specified in three-dimensional Cartesian coordinates. Waypoints may be chosen for different purposes: some waypoints may be locations where the drone is intended to stop and view a point of interest, while others may specify the precise position a drone must attain in order to pass through, say, a door or window.
- View 500 of Figure 5A shows eight exemplary keyframes 504 chosen to view the exterior of building 506 from aerial positions.
- Each keyframe 504 is a static record of a waypoint along with the orientation of drone 101 (that is, the orientation of drone 101’s onboard camera) at that waypoint so that the desired vantage point can be recaptured at different times.
- Drone orientation refers to the angular position of drone 101 relative to forward-facing level flight, namely pitch 112, roll 110, and yaw 114 angles as shown in Figure 1.
- a videographer piloting a drone may desire to capture a construction site from an elevated vantage point looking downward during different phases of construction to document the progress of the work. This technique is also useful for before-and-after comparisons of the development of an expansive area or of the reconstruction of an area after a natural disaster.
- Drone 101 can record a particular location in three-dimensional space while in flight by recording sensor or telemetry data about the location such as the drone’s distance and orientation from drone dock 402, GPS data gathered from an onboard GPS sensor 304, visual data gathered from an onboard camera 304, or combinations thereof. Similarly, the orientation of drone 101 at a particular location can be recorded and stored using data from inertial measurement unit 302.
- One technique for programming the flight of a UAV is to record a sequential set of keyframes 504 which will define the flight path and video operation of drone 101. For example, when checking a security perimeter, a pilot may define a set of keyframes capturing every point of entry of a building. The operating system of drone 101 will then compute a flight path for drone 101 and its orientation during flight from one keyframe to the next. In subsequent flights, the pilot can deploy drone 101 to fly the same route and capture the same views each time, making it easier to identify when something has changed.
- An alternative to recording a sequential set of static keyframes 504 to define a flight path and video operation is to record a continuous keyframe: the flight path and video operation of a first flight are continuously recorded over a period of time. A flight that has been recorded as a continuous keyframe can be subsequently re-flown as needed. Because a continuous keyframe will record all motion, including any jerky or irregular movement or other idiosyncrasies associated with manual flight control, this mode of operation may be more appropriate for experienced drone pilots or those with more competent flying skills.
- Spline interpolation is a method of fitting a curve to a set of points by computing a piecewise set of lower-order polynomials or splines over sequential pairs of points such that the resulting function is continuous and smooth over its domain. It is a simpler and more stable method than polynomial interpolation which involves fitting a single higher-order polynomial to all of the points in a set.
- Figure 5A demonstrates an exemplary flight path 508 comprising straight line segments defined by a set of eight keyframes 504 at various locations around building 506. The sharp turns at each keyframe are not only aesthetically undesirable for cinematography, they may also be dynamically unfeasible; that is, they may exceed the flying capability of drone 101.
- cubic splines will satisfy an additional constraint for drone flight and cinematography which is that the third- and fourth-order differentials of the interpolant with respect to time, known as jerk and snap, be zero.
- View 520 of Figure 5B illustrates flight path 510 through the same keyframes 504 of Figure 5A but connected with a cubic spline interpolant.
- Spline interpolation calculations can be done on the fly, so to speak, as the pilot adds or deletes keyframes in the sequence.
- varying the constraints on the waypoints or endpoints can alter the character of the calculated curve which in turn will affect the dynamic character of the flight path.
- the orientation of drone 101 can be programmed to provide a smooth and steady video recording, eliminating any uneven camera motion or direction changes that can occur during manual camera operation and allowing camera operation to be subsequently recaptured once a preferred orientation program is found.
- a drone orientation function can be interpolated to provide smooth drone operation along the flight path using the drone orientation data (i.e., pitch 112, roll 110, and yaw 114 angles) specified at the waypoints.
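- The patent does not name the interpolation method for orientation; spherical linear interpolation (slerp) is one common choice and is sketched below with SciPy. The waypoint angles and times are illustrative.

```python
import numpy as np
from scipy.spatial.transform import Rotation, Slerp

# Pitch, roll, and yaw angles (degrees) specified at each waypoint, with the
# times (seconds) at which each waypoint is reached.
euler_angles = [
    (0.0, 0.0, 0.0),
    (5.0, 0.0, 45.0),
    (0.0, -3.0, 90.0),
]
times = [0.0, 4.0, 9.0]

# Slerp interpolates smoothly between orientations, avoiding the artifacts
# that can arise from interpolating Euler angles component-wise.
rotations = Rotation.from_euler("xyz", euler_angles, degrees=True)
slerp = Slerp(times, rotations)

# Query the commanded orientation at any time along the flight path.
orientation_schedule = slerp(np.linspace(0.0, 9.0, 50))
```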
- the orientation of drone 101 can also be recorded during the first flight for later reuse.
- a pilot may desire to modify the video recording or other sensor data gathering while drone 101 is in flight by making minor, transitory changes to the UAV’s position or orientation without terminating the programmed operation of the UAV.
- the pilot may issue a flight or operational command via joystick 320 on remote control 130 which is transmitted to the drone’s flight controller subsystem.
- upon receiving the new joystick data, the flight controller subsystem will modify the flight commands that it issues to electronic speed controller 312 based on a computed modification to computed spline 510, continuous keyframe path, or programmed flight path.
- the modification will factor in one or more factors governing drone flight such as environmental conditions (e.g., wind speed and direction) or obstacle detection and avoidance.
- a realtor desires to provide a video recording of residential property 602 from an external aerial perspective for marketing purposes.
- a drone system with videography capability comprising drone 101 and videographer 606 who flies drone 101 using remote control 130 is dispatched to make the recording.
- videographer 606 flies drone 101 around house 602, selectively chooses a set of particularly desirable vantage points, and saves that information as a set of five keyframes 608 which include the drone position and drone orientation information at each keyframe.
- Exemplary display screens 610 and 612 for identifying and adding keyframes to a set are shown in Figure 6A, where augmented reality imagery 614 can be superimposed on the first-person view to precisely identify a camera shot associated with a keyframe.
- flight controller subsystem 124 computes flight path 622 via spline interpolation, as shown in overhead view 620 of Figure 6B. Drone 101 flies flight path 622 determined by the interpolant and records a video of the entire perimeter of house 602, including the particularly desirable vantage points of keyframes 608.
- videographer 604 modifies the drone operation during the flight on computed spline 622. As shown in overhead view 640 of Figure 6C, videographer 604 issues a command using joystick 320 which causes drone 101 to move slightly leftward 624 from computed spline 622 and to turn the camera away from house 602 and toward pond 616.
- videographer 604 allows joystick 320 to return to its neutral position and drone 101 continues its flight having computed a return path 626 to the computed spline 622 and a reorientation to the orientation spline.
- a close-up view of the computed modification to flight path 622, including departure 624 and return 626 and drone orientation indicated by arrows, is shown in Figure 6D.
- Figure 6E demonstrates an exemplary first-person view of drone 101 as seen on display screen 324 of remote control 130 at a series of points VI through V10 before and during the modification of its flight along flight path 622 according to the flight commands issued by videographer 604.
- FIG. 7 illustrates process 700 implemented by one or more components of flight control subsystem 124 of UAV 401.
- Process 700 is implemented in program instructions that, when executed by the one or more hardware elements of a flight control subsystem onboard drone 401, direct it to operate as follows.
- a drone pilot uses joystick 320 to modify the flight of drone 401 along a computed spline (step 710), where the modification may affect the position or the orientation of the drone, or some combination thereof.
- the joystick data is wirelessly transmitted from remote-control device 403 to autopilot 328 of drone 401 (step 712).
- Autopilot 328 also receives external operational inputs (e.g., wind speed and direction data) affecting drone flight (step 714).
- Autopilot 328 computes a modification to the computed spline (step 716). This modification is transmitted to flight controller 326 (step 718), which directs the electromechanical subsystem of drone 401 to fly according to the modification to the computed spline (step 720).
- in doing so, the autopilot issues a flight command that simulates an actual joystick command; in other words, it issues a synthetic or modified joystick command 330.
- Modified joystick command 330 includes the pilot’s input together with factors that govern drone operation.
- a pilot can command multiple simultaneous modifications to the flight during programmed operation using a remote control with multiple joysticks, with each joystick assigned a particular aspect of drone operation or by assigning multiple functionalities to a single joystick.
- input from one joystick may control drone yaw angle 114 while the other controls drone pitch angle 112, giving the pilot the ability to focus on fine-tuning camera operation while the drone flies autonomously along the predetermined flight path.
- a governing factor may transform an actual joystick command into a modified joystick command by applying a dampening function to the actual joystick input or data.
- the dampening function may be a mathematical model that simulates the response of a spring-rigged object subject to a force corresponding to the actual joystick input. More specifically, the spring-rigged model translates a real-time input by the pilot using the joystick into the response of a simulated object rigged with three linear and three torsional overdamped springs subjected to displacement in one or more directions.
- the joystick input will be dampened to avoid an abrupt dynamic response in reorientation, resulting in a more desirable response for cinematographic purposes.
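- A minimal sketch of one axis of such an overdamped spring response, with placeholder stiffness and damping coefficients, might look like the following.

```python
def damped_step(x, v, target, k=4.0, c=5.0, dt=0.05):
    """One integration step of an overdamped spring model translating raw
    joystick input into an attenuated offset from the flight path.

    x, v   -- current offset and its rate of change
    target -- offset commanded by the raw joystick deflection
    k, c   -- placeholder stiffness and damping chosen so the system is
              overdamped (c**2 > 4*k) and therefore never overshoots
    dt     -- control period in seconds (e.g., 50 ms)
    """
    a = k * (target - x) - c * v  # spring pull toward target minus damping
    v += a * dt
    x += v * dt
    return x, v
```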
- the application of a dampening function can effect a limit on modifications to the computed spline or other predetermined flight operation of the drone.
- This limitation on the dynamic response of the drone to any input by the pilot creates an operating envelope around the drone’s computed or predetermined flight path.
- the pilot can cause drone 804 to make small deflections 806 from flight path 802 as it flies flight path 802.
- the dampening function applied to the pilot’s flight commands creates operating envelope 810 about flight path 802.
- the autopilot of drone 804 continues to provide operational instructions to the electromechanical subsystem, while taking into account the joystick data it receives from the remote control.
- the pilot can focus his or her attention on a particular aspect of the drone operation without having to assume full control of the drone’s operation.
- FIG. 9 shows an overhead view 900 of a predetermined drone flight path of a drone flying by autopilot control.
- a drone flies predetermined flight path 904.
- the drone pilot pushes the joystick to the left (event 920)
- such an input would typically disengage the autopilot operation and result in the drone turning leftward which, if the joystick is held in that position and then released shortly after, would result in the drone flying backward (926) and eventually stopping and hovering.
- here, however, the same joystick input is filtered through a dampening function which produces a modified joystick command.
- the modified joystick command causes the drone to make a slight leftward departure 902 from flight path 904 while still generally following flight path 904.
- the drone executes a return 906 to and resumption of its computed flight path 904 or programmed operation.
- the net effect of such an implementation of the technology is to create an operational envelope 910 around flight path 904 where the drone preferentially adheres to flight path 904 but can make deviations away from that path in terms of the drone’s position or orientation based on joystick data provided by the drone pilot.
- the autopilot communicates modified joystick commands to the electromechanical subsystem which then throttles the rotors accordingly.
- modified joystick commands are typically issued several times per second, such as every 50 milliseconds.
- Factored into the commands are the current position of the drone, the desired next position of the drone according to the flight path, and external or environmental factors such as wind speed and direction.
- the autopilot factors into the modified joystick commands factors governing drone flight operations. These factors can affect drone operation such as by attenuating the drone’s dynamic response to joystick data or by incorporating a collision avoidance response to object detection.
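- Putting these pieces together, a hypothetical sketch of the autopilot's periodic command cycle follows; every helper used here (spline, state, wind_estimator, send_command) is an illustrative stand-in, not an API named by the patent.

```python
import time

def autopilot_loop(spline, state, wind_estimator, send_command, period=0.05):
    """Issue a modified joystick command every 50 ms, factoring in the current
    position, the next position on the computed spline, and environmental
    conditions such as wind. Positions are assumed to be numpy arrays so the
    vector arithmetic works."""
    while state.flying:
        current = state.position()                # where the drone is now
        desired = spline.next_position(current)   # next point on the flight path
        correction = wind_estimator.correction()  # environmental compensation
        send_command(desired - current + correction)
        time.sleep(period)
```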
- Overhead view 1000 of a drone flight path shown in Figure 10 further exemplifies the effect of joystick data received from the remote-control device which causes a transient flight path deflection.
- points 1002 represent the position data computed incrementally by the autopilot of the flight control subsystem based on the computed spline. Points 1002 may also include flight parameters governing drone orientation; however, for the sake of clarity, this example is limited to a discussion of positional modifications.
- the autopilot issues modified joystick commands at position 1020, which in turn creates a new path comprising a new set of flight parameters including incremental positional data 1006 and possibly orientation data. Points 1006 represent a dampened response to the joystick input received from the remote control.
- the drone autopilot computes incremental positional data 1008 which returns the drone to flight path 1002.
- the dampening of the pilot’s joystick commands effects an operating envelope 1012 about flight path 1002 which limits the ability of the drone to depart from the flight path even when the pilot pushes fully and continually on the joystick.
- Figure 11 illustrates process 1100 implemented by one or more components of flight control subsystem 124 of UAV 101 and remote control 130.
- Process 1100 is implemented in program instructions that, when executed by the one or more hardware and/or firmware elements of flight control subsystem 124 and remote control 130, direct flight control subsystem 124 and remote control 130 to operate as follows.
- a computer system directs the graphical user interface (GUI) on remote control 130 to display a perspective view from a sensor operatively coupled to flight control subsystem 124 (step 1110).
- the computer system detects inputs from the pilot interfacing with the GUI which include instructions to add keyframes, wherein the keyframes comprise the spatial location of UAV 101 and the direction of the sensor (step 1120).
- the computer system continually generates and updates a spline comprising a projected flight path or trajectory between each of the multiple keyframes and including the direction of the sensor (step 1130).
- the computer system continually displays a graphical representation of the spline overlaid on the perspective view from the sensor onboard UAV 101 (step 1140).
- the sensor is a forward-facing camera providing a first-person perspective view of UAV 101.
- the direction of the camera corresponds to the gimbal angle of the camera relative to level flight.
- Figures 12A-12J illustrate an implementation of a user interface presented to a pilot on the display screen of a remote control.
- the remote control displays the user interface of an augmented reality-based autonomous flight control application which receives inputs from a pilot and commands the drone to fly accordingly.
- the remote control displays the user interface on a touchscreen along with various input devices such as buttons, sliders, and so on in virtual form.
- the input devices may be physical buttons, sliders, toggles, joysticks, etc. on the remote control.
- the remote control may be a computing device such as a dedicated drone control device, a smartphone, tablet or other mobile device, or a laptop or other computer in wireless communication with the drone.
- an autonomous flight control application receives inputs from a user or pilot through the virtual input devices of its user interface. Where the virtual input devices are described below as being “selected,” this indicates that the autonomous flight control application has received an indication from the pilot (such as by touching, tapping, or “clicking” the virtual input device) causing the virtual input device to change its state. The application responds to that change of state according to its program instructions.
- a drone operating in KeyFrame Mode generates a computed spline flight path or “spline” circling a small copse of trees and then flies the spline during playback.
- the UI software displays an AR representation of the computed spline and the associated keyframes overlaying a live video feed from an onboard camera on the touchscreen.
- the UI software continually updates the AR representation of the spline and the keyframes as the keyframes are added, edited, or deleted and the spline is generated or recomputed, and during playback as the drone flies the spline.
- the computed spline can be recorded and saved for later use by the same drone or by other drones with similar capabilities.
- the computed spline may also be edited in later uses; any changes may be saved as new splines or as revisions that may be selectively added to the spline in later use.
- Figure 12A illustrates an implementation of the UI of an AR-based autonomous flight control application on a touchscreen of a drone remote control when a spline is to be defined.
- the touchscreen displays camera view 1210 captured by an onboard camera.
- the UI displays Launch button 1201 which causes the application to launch the drone from its dock.
- To the left of the screen are virtual indicators by which the UI presents various statuses relating to drone operation and virtual buttons by which the pilot can access aspects of drone operation: battery charge indicator 1211; WiFi signal strength indicator 1212; a button to access a graphical map; home button 1222; Auto Record indicator 1223, which indicates whether video is being recorded; settings button 1224; and operating mode graphic 1225, which indicates the operating mode of the drone (in this view, the drone is being operated manually).
- Figure 12B illustrates the UI of an autonomous flight control application when home button 1222 is selected in an implementation.
- when the UI receives an input indicating that home button 1222 has been selected, the UI displays tabbed window 1230 including Cinematic tab 1240 for selecting from among several modes of automated drone flight.
- Motion Track button 1241 causes the drone to track an object such as an individual or vehicle in motion during flight while autonomously avoiding obstacles along the way.
- Fixed Track button 1242 initiates Fixed Track mode which is used to track a subject traveling on a fixed track. Fixed Track mode causes the drone to follow the subject while keeping a set distance from the subject and while maintaining its original camera orientation.
- Orbit Subject button 1243 initiates a subject tracking flight operation of the drone whereby the application commands the drone to fly an orbit around a subject.
- Cable button 1244 engages a method of operating the drone whereby the application defines two keyframes marking the endpoints of a flight path and then flies the drone between the two points as if tethered to a cable strung between them.
- Hover button 1245 causes the drone to hover at a single spatial location or keyframe.
- KeyFrame button 1246 activates the KeyFrame Mode of drone operation in which the application records and stores multiple keyframes, automatically and dynamically generates a spline flight path between each of the keyframes, and commands the drone to fly the spline.
- the application may also incorporate inputs received by the UI from the pilot’s interactions with input devices of the interface.
- an existing spline can also be edited and saved for reuse by the drone or by other drones with KeyFrame Mode capability.
- a saved spline may be stored in and retrieved from nonvolatile storage onboard the drone, within the controller, on a computer in communication with the controller or the drone, or in connected cloud storage.
- Figure 12C illustrates the UI of an autonomous flight control application when KeyFrame button 1246 is selected causing the application to initiate the KeyFrame Mode of operating the drone as indicated by operating mode graphic 1225.
- the touchscreen displays camera view 1210 from an onboard camera.
- the UI displays text display 1203 to indicate the current mode: “KeyFrame Mode.”
- Flight parameter set 1202 is a graphic in the upper left corner of the touchscreen displaying drone flight speed, drone distance from the dock, drone elevation, and gimbal angle of the camera relative to level flight.
- the UI displays virtual buttons by which the pilot can define keyframes to be used in generating the spline:
- Add button 1251 causes the application to add a keyframe at the drone’s current location
- Undo button 1250 reverses the action triggered by Add button 1251 (i.e., undoes adding the most recently added keyframe)
- Done button 1252 terminates the addition of keyframes.
- graphic 1204 for pausing KeyFrame Mode so the pilot can stop autonomous flight and take manual control. Note that camera view 1210 is darkened to enhance the visibility of text display 1203.
- Figure 12D illustrates the UI at the initiation of KeyFrame Mode with the UI of the autonomous flight control application prompting the pilot to add the first keyframe in text display 1203.
- Figure 12E illustrates the UI after the first keyframe is added.
- the autonomous flight control application records the spatial location of the drone. The application may also record the drone orientation, the gimbal angle of the onboard camera, focal length and/or exposure settings of the onboard camera, the velocity of the drone, and/or other flight or operational parameters at the newly added keyframe location.
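- The flight and operational parameters recorded at a keyframe could be grouped in a record such as the following illustrative Python structure; the field names and units are assumptions.

```python
# Hypothetical keyframe record; fields mirror the parameters named above.
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class KeyframeRecord:
    position: Tuple[float, float, float]            # spatial location of the drone
    heading_deg: Optional[float] = None             # drone orientation
    gimbal_angle_deg: Optional[float] = None        # camera angle vs. level flight
    focal_length_mm: Optional[float] = None         # onboard camera focal length
    exposure: Optional[dict] = None                 # e.g., shutter and ISO settings
    velocity_mps: Optional[Tuple[float, float, float]] = None  # drone velocity

kf = KeyframeRecord(position=(12.0, -3.5, 18.0), heading_deg=270.0,
                    gimbal_angle_deg=-12.0, velocity_mps=(1.2, 0.0, 0.1))
```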
- the drone has been piloted to a position closer to the copse for the second keyframe, and a second keyframe is added.
- Figure 12F illustrates text display 1203 of the UI confirming that a keyframe has been added.
- Figures 12G and 12H illustrate the touchscreen display as the drone is piloted around the copse and keyframes are added.
- the autonomous flight control application dynamically recomputes the spline as keyframes are added, and the AR representation of the spline is continually updated by the UI on the display.
- computed spline 1260 is displayed on the touchscreen augmented over camera view 1210.
- Keyframe markers 1262 are displayed as diamonds on computed spline 1260.
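- The description does not prescribe a particular spline family. As one non-limiting possibility, a Catmull-Rom spline passes through every keyframe and can be resampled whenever the keyframe list changes, as sketched below.

```python
# One illustrative spline choice (Catmull-Rom); the patented method may differ.
from typing import List, Tuple

Vec3 = Tuple[float, float, float]

def catmull_rom(p0: Vec3, p1: Vec3, p2: Vec3, p3: Vec3, t: float) -> Vec3:
    """Evaluate one Catmull-Rom segment between p1 and p2 at t in [0, 1]."""
    t2, t3 = t * t, t * t * t
    return tuple(0.5 * (2 * b + (-a + c) * t + (2*a - 5*b + 4*c - d) * t2
                        + (-a + 3*b - 3*c + d) * t3)
                 for a, b, c, d in zip(p0, p1, p2, p3))

def recompute_spline(keyframes: List[Vec3], samples_per_seg: int = 20) -> List[Vec3]:
    """Resample the whole path; called each time a keyframe is added or edited."""
    if len(keyframes) < 2:
        return list(keyframes)
    pts = [keyframes[0]] + list(keyframes) + [keyframes[-1]]  # pad the endpoints
    path = []
    for i in range(len(keyframes) - 1):
        for s in range(samples_per_seg):
            path.append(catmull_rom(pts[i], pts[i + 1], pts[i + 2], pts[i + 3],
                                    s / samples_per_seg))
    path.append(keyframes[-1])
    return path

path = recompute_spline([(0, 0, 10), (5, 2, 12), (9, 8, 12), (4, 12, 11)])
print(len(path))  # 61 sampled points: 3 segments at 20 samples each, plus the end
```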
- Figure 12I illustrates camera view 1210 looking down on the copse as the seventeenth keyframe is to be added.
- Computed spline 1260 and keyframe markers 1262 are more clearly seen in this view.
- Computed spline 1260 is a two-dimensional projection of the three-dimensional spline comprising a flight path or trajectory between each of the multiple keyframes.
- the size of diamond-shaped keyframe markers 1262 marking the locations of the keyframes varies with the order in which they are added, with the most recently added keyframe indicated by the largest diamond.
- Computed spline 1260 and keyframe markers 1262 may also be scaled in proportion to their distance from the drone, with the keyframe markers dynamically growing in size on the display as the drone approaches the keyframe location.
- different geometric shapes may be used to indicate keyframes according to a particular purpose, such as the starting point or end point of a computed spline.
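- As a non-limiting illustration of the two-dimensional projection and distance-scaled markers described above, sampled 3D spline points can be mapped to screen coordinates with a simplified pinhole camera model; the camera placement and constants below are assumptions.

```python
# Simplified pinhole projection; camera at the origin looking down +z.
from typing import Optional, Tuple

def project(point: Tuple[float, float, float], focal_px: float = 800.0,
            cx: float = 640.0, cy: float = 360.0) -> Optional[Tuple[float, float]]:
    x, y, z = point
    if z <= 0:
        return None  # behind the camera: not drawn
    return (cx + focal_px * x / z, cy + focal_px * y / z)

def marker_size_px(distance_m: float, base_px: float = 24.0,
                   ref_dist_m: float = 10.0) -> float:
    """Diamond size in pixels; base_px at ref_dist_m, growing as the drone nears."""
    return base_px * ref_dist_m / max(distance_m, 0.5)

print(project((1.0, 0.5, 10.0)))   # (720.0, 400.0)
print(marker_size_px(2.0))         # 120.0: five times larger at a fifth the distance
```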
- Figure 12J illustrates the UI during keyframe addition and spline generation at the point when the pilot has completed adding keyframes.
- the autonomous flight control application receives an indication that the spline is complete and switches to a keyframe playback mode as shown in Figure 13A.
- Figures 13A-13D illustrate an implementation of a UI once a spline definition is complete and the spline is to be played back.
- the UI of the autonomous flight control application displays playback track 1320, which is a linear graphical representation of the computed spline.
- On playback track 1320, keyframes are represented as diamonds 1322, and the current position of the drone along the spline is shown as arrowhead 1324. The relative distance between the keyframes is indicated by the proportional spacing of diamonds 1322 on playback track 1320.
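- One way to achieve the proportional spacing of diamonds 1322 is to normalize each keyframe's cumulative distance along the path into a [0, 1] track coordinate; the chord-length approximation below is an assumption.

```python
# Place each keyframe on the linear playback track by normalized path distance.
import math
from typing import List, Tuple

def track_positions(keyframe_points: List[Tuple[float, float, float]]) -> List[float]:
    """Return each keyframe's position along the track as a fraction of the total."""
    dists = [0.0]
    for a, b in zip(keyframe_points, keyframe_points[1:]):
        dists.append(dists[-1] + math.dist(a, b))
    total = dists[-1] or 1.0
    return [d / total for d in dists]

print(track_positions([(0, 0, 10), (5, 0, 10), (20, 0, 10)]))  # [0.0, 0.25, 1.0]
```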
- Figure 13B illustrates the addition of virtual speed control slider 1330 to the touchscreen display by which the UI receives manual input(s) causing the application to speed up, slow down, or hover the drone as it traverses computed spline 1260.
- Using speed control slider 1330, the pilot can control drone speed between keyframes or across the entire spline.
- arrowhead 1324 changes color to indicate when the drone is in motion.
- the relative time to travel between keyframes is indicated by the proportional spacing of diamonds 1322 on playback track 1320.
- An additional functionality of playback track 1320 of the UI is to cause the drone to “snap to” any location on computed spline 1260. Tapping anywhere on playback track 1320 directs the application to fly the drone directly to that location without traversing the spline.
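- A non-limiting sketch of this "snap to" behavior: a tap at a given fraction of playback track 1320 is mapped to the nearest sampled spline point, to which the drone is then flown directly; fly_direct in the final comment is hypothetical.

```python
# Map a tap on the playback track to a target point on the sampled spline.
from typing import List, Tuple

def snap_target(tap_fraction: float,
                spline_samples: List[Tuple[float, float, float]]) -> Tuple[float, float, float]:
    """tap_fraction is in [0, 1] along the track; returns the matching sample."""
    idx = round(tap_fraction * (len(spline_samples) - 1))
    return spline_samples[min(max(idx, 0), len(spline_samples) - 1)]

samples = [(float(i), 0.0, 10.0) for i in range(101)]
print(snap_target(0.6, samples))  # (60.0, 0.0, 10.0)
# e.g., fly_direct(snap_target(0.6, samples))  # fly there without traversing
```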
- Figure 13C illustrates an implementation of the UI in which the autonomous flight control application displays the progress of the drone as it flies spline 1260 during playback in KeyFrame Mode.
- the application may receive commands from the pilot using virtual play/pause button 1314 and forward/reverse button 1316 to fly the drone forward or backward along the spline or to pause and hover.
- the UI continually updates playback track 1320 to show both the direction of playback and current position of the drone along the spline.
- Diamonds 1322 change color to indicate the progress of the drone through the keyframes.
- Figure 13D illustrates the touchscreen display of a controller during playback in an implementation as the drone begins to travel computed spline 1260 starting at the first keyframe recorded in Figure 12D.
- the UI of the autonomous flight control application displays a number of flight and operational commands which cause the application to: adjust the speed of the drone’s travel along the spline; stop or reverse the drone along the spline; jump or “snap” to a keyframe out of order on the spline; add new keyframes; delete keyframes; or edit speed or orientation settings at any keyframe. Note that on the AR representation of computed spline 1260, the size of each of keyframe markers 1262 grows larger as the drone approaches the keyframe location.
- the UI displays computed spline 1260 in a color which is highly visible against the background (first-person) view on the touchscreen.
- This color can be programmatically chosen using an algorithm which detects the range of colors in camera view 1210, or the color may be set manually by the pilot.
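- A minimal sketch of one such algorithm, assuming a simple complementary-color heuristic rather than any particular detection method used in practice: average the frame's RGB values and take the complement as the overlay color.

```python
# Hypothetical contrast heuristic: complement of the frame's mean color.
from typing import Iterable, Tuple

def overlay_color(frame_pixels: Iterable[Tuple[int, int, int]]) -> Tuple[int, int, int]:
    pixels = list(frame_pixels)
    mean = [sum(channel) / len(pixels) for channel in zip(*pixels)]
    return tuple(255 - int(v) for v in mean)

print(overlay_color([(10, 120, 30), (20, 140, 50)]))  # greenish view -> (240, 125, 215)
```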
- Figures 14A-14D illustrate an implementation of the user interface of an autonomous flight control system displayed on a smartphone touchscreen.
- the autonomous flight control application operating in KeyFrame Mode has generated a spline between each of the multiple keyframes.
- the application can edit a spline during playback by adding additional keyframes or by editing existing keyframes.
- Figure 14A illustrates the UI prior to the start of the drone’s travel on the spline.
- the UI displays playback track 1410 with diamonds 1412 marking the locations of keyframes in proportion to their distances along the spline or in proportion to the time to travel between the keyframes.
- spline 1404 is an AR graphic displayed over camera view 1402 with keyframe markers 1406 indicating the locations of upcoming keyframes as the drone traverses spline 1404.
- Figure 14B illustrates the UI as the drone travels computed spline 1404 passing between the second and third keyframes. Arrow 1414 traverses playback track 1410 indicating in real time the location of the drone on spline 1404.
- Next to playback track 1410 are virtual input devices Edit button 1416 and Add button 1418.
- As the drone traverses spline 1404 and reaches a keyframe, Edit button 1416 becomes active, allowing the pilot to select the keyframe for editing.
- Add button 1418 is active; when selected, it prompts the application to define and add a new keyframe at the drone’s location, recording and storing the drone’s location as determined by visual tracking or by navigational coordinates.
- the application may also record and store other flight or operational parameters for the new keyframe such as the gimbal angle, exposure settings, or focal length of the onboard camera.
- Figure 14C illustrates the UI when Add button 1418 is selected.
- the UI prompts the pilot to set the location of the new keyframe by tapping Set button 1420.
- When Set button 1420 is tapped, the application recomputes spline 1404, directs the UI to display an updated graphical representation of spline 1404, and marks the location of the new keyframe by adding a diamond to playback track 1410.
- Figure 14D illustrates the UI when the pilot taps Add button 1418 while the drone is at an existing keyframe: the application prompts the pilot to indicate whether the new keyframe should be positioned before or after the existing keyframe (or cancel the addition).
- the application recomputes spline 1404 and the UI updates the display accordingly.
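- The before/after choice of Figure 14D amounts to splicing the new keyframe into the ordered keyframe list and recomputing the spline; the sketch below assumes hypothetical names and the recompute step sketched earlier.

```python
# Splice a new keyframe relative to the keyframe the drone currently occupies.
from typing import List

def insert_keyframe(keyframes: List, current_index: int, new_kf, place: str) -> List:
    """place is 'before' or 'after'; any other value cancels the addition."""
    if place not in ("before", "after"):
        return keyframes
    at = current_index if place == "before" else current_index + 1
    return keyframes[:at] + [new_kf] + keyframes[at:]

kfs = ["k1", "k2", "k3"]
print(insert_keyframe(kfs, 1, "new", "after"))   # ['k1', 'k2', 'new', 'k3']
print(insert_keyframe(kfs, 1, "new", "before"))  # ['k1', 'new', 'k2', 'k3']
```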
- Figure 15 comprises a sequence of images illustrating yet another implementation of the UI of an autonomous flight control application of a drone operating in KeyFrame Mode during playback.
- Images 1510-1530 illustrate the display on a drone remote control.
- the display shows a first-person camera view comprising a live feed from an onboard camera.
- the drone arrow indicator is traversing the playback track traveling from right to left and shows the drone just as it approaches keyframe 4.
- a translucent diamond marking keyframe 4’s location dynamically grows in size as the drone approaches it, then disappears (in image 1520) to simulate the drone passing through the AR keyframe diamond.
- image 1530 shows the first-person view of the drone as it continues on the computed spline but with the drone pivoting starboard to track the paddleboarder. Because the drone has pivoted away from a forward-facing orientation, the AR representation of the computed spline is no longer visible, ostensibly because it is out of the camera view to the left of the screen.
- aspects of the present invention may be embodied as a system, method or computer program product. Accordingly, aspects of the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, resident software, micro-code, etc.) or an embodiment combining software and hardware aspects that may all generally be referred to herein as a “circuit,” “module,” or “system.” Furthermore, aspects of the present invention may take the form of a computer program product embodied in one or more computer readable medium(s) having computer readable program code embodied thereon.
- the systems, methods, processes, and operational scenarios may be implemented in computer software executed by a processing system in the context of an unmanned aerial vehicle, a remote-control device, or any other type of device capable of executing software such as computers and mobile phones.
- the processing system may load and execute the software from a storage system or may be pre-configured with the software.
- the software includes and implements a process for creating a computed spline, which is representative of the spline-creation processes discussed with respect to the preceding Figures, such as process 200 and process 700.
- the software also includes and implements processes associated with the user interface of an autonomous flight control program, which is representative of the user interfaces of autonomous flight control programs discussed with respect to the preceding Figures, such as process 1100.
- the software directs the processing system to operate as described herein for at least the various processes, operational scenarios, and sequences discussed in the foregoing implementations.
- An exemplary processing system may comprise a microprocessor and other circuitry that retrieves and executes software from storage.
- the processing system may be implemented within a single processing device but may also be distributed across multiple processing devices or sub-systems that cooperate in executing program instructions. Examples of processing systems include general purpose central processing units, graphical processing units, application specific processors, and logic devices, as well as any other type of processing device, combinations, or variations thereof.
- An exemplary storage system may comprise any computer readable storage media readable by a processing system and capable of storing software.
- the storage system may include volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information, such as computer readable instructions, data structures, program modules, or other data.
- Examples of storage media include random access memory, read only memory, magnetic disks, optical disks, flash memory, virtual memory and non-virtual memory, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other suitable storage media.
- in no case are the computer readable storage media a propagated signal.
- the software may be implemented in program instructions and among other functions may, when executed by a processing system, direct the processing system to operate as described with respect to the various operational scenarios, sequences, and processes illustrated herein.
- the program instructions may include various components or modules that cooperate or otherwise interact to carry out the various processes and operational scenarios described herein.
- the various components or modules may be embodied in compiled or interpreted instructions, or in some other variation or combination of instructions.
- the various components or modules may be executed in a synchronous or asynchronous manner, serially or in parallel, in a single threaded environment or multi-threaded, or in accordance with any other suitable execution paradigm, variation, or combination thereof.
- the software may include additional processes, programs, or components, such as operating system software, virtualization software, or other application software.
- the software may also comprise firmware or some other form of machine-readable processing instructions executable by a suitable processing system.
- the software may, when loaded into a processing system and executed, transform a suitable apparatus, system, or device overall from a general-purpose computing system into a special-purpose computing system as described herein.
- Encoding the software on a storage system may transform the physical structure of the storage system.
- the specific transformation of the physical structure may depend on various factors in different implementations of this description. Examples of such factors may include, but are not limited to, the technology used to implement the storage media of the storage system and whether the computer-storage media are characterized as primary or secondary storage, as well as other factors.
- the software may transform the physical state of the semiconductor memory when the program instructions are encoded therein, such as by transforming the state of transistors, capacitors, or other discrete circuit elements constituting the semiconductor memory.
- a similar transformation may occur with respect to magnetic or optical media.
- Other transformations of physical media are possible without departing from the scope of the present description, with the foregoing examples provided only to facilitate the present discussion.
- the unmanned aerial vehicles, remote-control devices, or other devices in which aspects of the present invention may be embodied may include a communication interface system.
- the communication interface system may include communication connections and devices that allow for communication with other computing systems and devices (not shown) over communication networks (not shown). Examples of connections and devices that together allow for inter-system communication may include network interface cards, antennas, power amplifiers, RF circuitry, transceivers, and other communication circuitry.
- the connections and devices may communicate over communication media to exchange communications with other computing systems or networks of systems, such as metal, glass, air, or any other suitable communication media.
- the aforementioned media, connections, and devices are well known and need not be discussed at length here.
- Communication between such systems and devices may occur over a communication network or networks and in accordance with various communication protocols, combinations of protocols, or variations thereof. Examples include intranets, internets, the Internet, local area networks, wide area networks, wireless networks, wired networks, virtual networks, software defined networks, data center buses and backplanes, or any other type of network, combination of network, or variation thereof.
- the aforementioned communication networks and protocols are well known and need not be discussed at length here.
- the unmanned aerial vehicles, remote-control devices, or other devices in which aspects of the present technology may be embodied, may include a user interface system.
- a user interface system may include any one or more of a joystick, a keyboard, a mouse, a voice input device, a touch input device for receiving a touch gesture from a user, a motion input device for detecting non-touch gestures and other motions by a user, and other comparable input devices and associated processing elements capable of receiving user input from a user (e.g., joystick toggles).
- Output devices such as a display, speakers, haptic devices, and other types of output devices may also be included in the user interface system.
- the input and output devices may be combined in a single device, such as a display capable of displaying images and receiving touch gestures.
- the aforementioned user input and output devices are well known in the art and need not be discussed at length here.
- the user interface system may also include associated user interface software executable by a suitable processing system in support of the various user input and output devices discussed above. Separately or in conjunction with each other and other hardware and software elements, the user interface software and user interface devices may support a graphical user interface, a natural user interface, or any other type of user interface.
- aspects of the present technology describe various operating scenarios for drone operation along a computed spline flight path or along a programmed flight path obtained from a continuous keyframe recording or other source.
- the drone pilot may issue operational commands via the joystick on the remote-control device.
- the autopilot receives the joystick data and incorporates the data into the ostensible joystick commands issued to the UAV microprocessor.
- the autopilot retains control over the operation of the drone along the flight path. The capability to incorporate joystick data into the operation of the drone effects an operational envelope along the flight path, allowing the drone pilot to control one or more particular aspects of the flight to achieve optimal drone operation.
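- A non-limiting sketch of such an operational envelope: the autopilot's nominal setpoint along the flight path is nudged by a clamped joystick offset, so the pilot influences particular aspects of the flight without overriding the autopilot; the axis-wise clamping and constants are assumptions.

```python
# Blend pilot joystick input into the autopilot's setpoint within a bounded envelope.
def blended_setpoint(nominal: tuple, joystick: tuple,
                     envelope_m: float = 2.0, gain: float = 1.0) -> tuple:
    """Clamp the pilot's offset to +/- envelope_m on each axis of the setpoint."""
    offset = [max(-envelope_m, min(envelope_m, gain * j)) for j in joystick]
    return tuple(n + o for n, o in zip(nominal, offset))

# The joystick can move the drone at most 2 m off the nominal spline setpoint:
print(blended_setpoint((10.0, 4.0, 15.0), (0.5, -3.0, 0.0)))  # (10.5, 2.0, 15.0)
```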
- the words “comprise,” “comprising,” “such as,” and “the like” are to be construed in an inclusive sense, as opposed to an exclusive or exhaustive sense, that is to say, in the sense of “including, but not limited to.”
- the terms “connected,” “coupled,” or any variant thereof means any connection or coupling, either direct or indirect, between two or more elements; the coupling or connection between the elements can be physical, logical, or a combination thereof.
- the words “herein,” “above,” “below,” and words of similar import when used in this application, refer to this application as a whole and not to any particular portions of this application.