WO2017173502A1 - Aerial devices, rotor assemblies for aerial devices, and device frameworks and methodologies configured to enable control of aerial devices - Google Patents


Info

Publication number
WO2017173502A1
WO2017173502A1 (application PCT/AU2017/050307)
Authority
WO
WIPO (PCT)
Prior art keywords
blade
rotor
drone
hinge connection
mounting member
Prior art date
Application number
PCT/AU2017/050307
Other languages
English (en)
Inventor
Anthony Zammit
Sivaan NAMANN
Eduard LIPKIN
Alexander YELLACHICH
Sam Anthony CASSISI
Matthew LIPSKI
Original Assignee
Iot Group Technology Pty Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from AU2016901302A0
Application filed by Iot Group Technology Pty Ltd
Publication of WO2017173502A1

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B64: AIRCRAFT; AVIATION; COSMONAUTICS
    • B64C: AEROPLANES; HELICOPTERS
    • B64C11/00: Propellers, e.g. of ducted type; Features common to propellers and rotors for rotorcraft
    • B64C11/16: Blades
    • B64C11/20: Constructional features
    • B64C11/28: Collapsible or foldable blades
    • B64C11/46: Arrangements of, or constructional features peculiar to, multiple propellers
    • B64C11/48: Units of two or more coaxial propellers
    • B64C27/00: Rotorcraft; Rotors peculiar thereto
    • B64C27/32: Rotors
    • B64C27/37: Rotors having articulated joints
    • B64C27/39: Rotors having articulated joints with individually articulated blades, i.e. with flapping or drag hinges
    • B64C27/46: Blades
    • B64C27/473: Constructional features
    • B64C27/48: Root attachment to rotor head
    • B64U: UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U10/00: Type of UAV
    • B64U10/10: Rotorcrafts
    • B64U30/00: Means for producing lift; Empennages; Arrangements thereof
    • B64U30/20: Rotors; Rotor supports

Definitions

  • AERIAL DEVICES, ROTOR ASSEMBLIES FOR AERIAL DEVICES, AND DEVICE FRAMEWORKS AND METHODOLOGIES CONFIGURED TO ENABLE CONTROL OF AERIAL DEVICES
  • the present invention relates to aerial devices, rotor assemblies for aerial devices, and device frameworks and methodologies configured to enable control of aerial devices. While some embodiments will be described herein with particular reference to that application, it will be appreciated that the invention is not limited to such a field of use, and is applicable in broader contexts.
  • Aerial drones have become increasingly popular in recent times.
  • Drone control algorithms continue to advance, for example allowing smart autonomous control, thereby enabling photography-related flight control with minimal user input.
  • Some drones are configured to adopt predefined flight paths, and/or to move in a defined manner relative to a defined object (for example a trackable tether device). This, combined with a desire to miniaturize drones for personal use, leads to a range of technical challenges.
  • One embodiment provides a rotor assembly for an aerial device, the rotor assembly including:
  • a first blade assembly wherein the first blade assembly includes a central blade mounting member securely mounted to a first axial shaft, wherein:
  • the first blade assembly includes a plurality of blade supporting members mounted at spaced apart peripheral locations to the central blade mounting member, each blade supporting member being mounted to the central blade mounting member via a respective primary hinge connection, thereby to allow limited rotation of the blade supporting member relative to the central blade mounting member about a horizontal axis defined by the primary hinge connection, thereby to enable upward and outward rotation from: (i) a storage position in which the blades have respective blade axes within less than about 20 degrees of parallel to the axis of the first axial shaft; to (ii) an operational position wherein the blades are positioned to provide uplift upon threshold rotation of the first axial shaft; and
  • each blade supporting member is mounted to a respective rotor blade via a respective secondary hinge connection, thereby to allow limited rotation of the rotor blade relative to the blade supporting member about a vertical axis defined by the secondary hinge connection;
  • a second blade assembly wherein the second blade assembly includes a central blade mounting member securely mounted to a second axial shaft, wherein:
  • the second blade assembly includes a plurality of blade supporting members mounted at spaced apart peripheral locations to the central blade mounting member, each blade supporting member being mounted to the central blade mounting member via a respective primary hinge connection, thereby to allow limited rotation of the blade supporting member relative to the central blade mounting member about a horizontal axis defined by the primary hinge connection, thereby to enable upward and outward rotation from: (i) a storage position in which the blades have respective blade axes within less than about 20 degrees of parallel to the axis of the second axial shaft; to (ii) an operational position wherein the blades are positioned to provide uplift upon threshold rotation of the second axial shaft; and
  • each blade supporting member is mounted to a respective rotor blade via a respective secondary hinge connection, thereby to allow limited rotation of the rotor blade relative to the blade supporting member about a vertical axis defined by the secondary hinge connection;
  • wherein the first and second axial shafts are coaxially mounted with respect to one another, thereby to allow individual rotational driving of the shafts and their respective blade assemblies via respective driving mechanisms.
  • One embodiment provides a rotor assembly for an aerial device wherein, in the storage position, the blades of the first and second blade assemblies are enabled to contact an elongate housing body for the aerial device.
  • One embodiment provides a rotor assembly for an aerial device wherein the limited rotation of the rotor blades relative to the blade supporting members about the vertical axis defined by the secondary hinge connections is configured to reduce the effect of an impact of a rotor blade against an external object during flight.
  • One embodiment provides a rotor assembly for an aerial device wherein a control component is configured to enable control over the pair of blade assemblies, the control component including:
  • a printed circuit board comprising a single electronic speed control (ESC) unit;
  • wherein each MOSFET component is configured to provide direct control signalling to a respective set of two motors; and
  • each motor is coupled to a respective drive shaft, the drive shafts each being mounted to respective blade assemblies.
  • One embodiment provides an aerial device having a rotor assembly including:
  • a first blade assembly wherein the first blade assembly includes a central blade mounting member securely mounted to a first axial shaft
  • the first blade assembly includes a plurality of blade supporting members mounted at spaced apart peripheral locations to the central blade mounting member, each blade supporting member being mounted to the central blade mounting member via a respective primary hinge connection, thereby to allow limited rotation of the blade supporting member relative to the central blade mounting member about a horizontal axis defined by the primary hinge connection, thereby to enable upward and outward rotation from: (i) a storage position in which the blades have respective blade axes within less than about 20 degrees of parallel to the axis of the first axial shaft; to (ii) an operational position wherein the blades are positioned to provide uplift upon threshold rotation of the first axial shaft; and
  • each blade supporting member is mounted to a respective rotor blade via a respective secondary hinge connection, thereby to allow limited rotation of the rotor blade relative to the blade supporting member about a vertical axis defined by the secondary hinge connection;
  • a second blade assembly wherein the second blade assembly includes a central blade mounting member securely mounted to a second axial shaft, wherein:
  • the second blade assembly includes a plurality of blade supporting members mounted at spaced apart peripheral locations to the central blade mounting member, each blade supporting member being mounted to the central blade mounting member via a respective primary hinge connection, thereby to allow limited rotation of the blade supporting member relative to the central blade mounting member about a horizontal axis defined by the primary hinge connection, thereby to enable upward and outward rotation from: (i) a storage position in which the blades have respective blade axes within less than about 20 degrees of parallel to the axis of the second axial shaft; to (ii) an operational position wherein the blades are positioned to provide uplift upon threshold rotation of the second axial shaft; and
  • each blade supporting member is mounted to a respective rotor blade via a respective secondary hinge connection, thereby to allow limited rotation of the rotor blade relative to the blade supporting member about a vertical axis defined by the secondary hinge connection;
  • wherein the first and second axial shafts are coaxially mounted with respect to one another, thereby to allow individual rotational driving of the shafts and their respective blade assemblies via respective driving mechanisms.
  • One embodiment provides an aerial device wherein, in the storage position, the blades of the first and second blade assemblies are enabled to contact an elongate housing body for the aerial device.
  • One embodiment provides an aerial device wherein the limited rotation of the rotor blades relative to the blade supporting members about the vertical axis defined by the secondary hinge connections is configured to reduce the effect of an impact of a rotor blade against an external object during flight.
  • One embodiment provides an aerial device wherein a control component is configured to enable control over the pair of blade assemblies, the control component including:
  • a printed circuit board comprising a single electronic speed control (ESC) unit;
  • wherein each MOSFET component is configured to provide direct control signalling to a respective set of two motors; and
  • each motor is coupled to a respective drive shaft, the drive shafts each being mounted to respective blade assemblies.
  • One embodiment provides a control component configured to enable control over a pair of rotor assemblies for an aerial device, the control component including:
  • a printed circuit board comprising a single electronic speed control (ESC) unit;
  • wherein each MOSFET component is configured to provide direct control signalling to a respective set of two motors;
  • wherein each motor is coupled to a respective drive shaft, the drive shafts each being mounted to respective rotor assemblies.
  • One embodiment provides a computer implemented method for controlling an aerial drone system thereby to autonomously follow a human face, the method including:
  • processing the image data thereby to identify a target region predicted to contain a human face wherein the processing includes:
  • One embodiment provides a method according to claim 10 including configuring the tracking and control algorithm to: (i) cause drone control instructions to be implemented in respect of a first set of facial movement conditions; and
  • One embodiment provides a method wherein the first and second sets of facial movement conditions are defined to enable a tracked face to adopt a facial pose without affecting aerial drone position.
  • One embodiment provides a method wherein the first and second sets of facial movement conditions are temporally variable, such that at least one facial movement condition is transitioned from the first set to the second set during a defined time period.
  • One embodiment provides a method wherein the first and second sets of facial movement conditions include one or more of the following movement conditions: horizontal movement conditions; vertical axis facial rotation conditions; face-normal horizontal axis facial rotation conditions; and face parallel horizontal axis facial rotation conditions.
  • One embodiment provides a method wherein the tracking and control algorithm is configured with data that correlates each pixel in image data to a distance, thereby to enable tuning of the algorithm to operate with different camera modules.
  • One embodiment provides a method wherein the aerial drone is configured with a modular design, whereby a first camera module is removable and replaceable with a second camera module, wherein the second camera module has different image capture properties.
  • One embodiment provides a method wherein the tracking and control algorithm is configured to implement a preliminary positional determination process whereby the aerial drone is assumed to be a defined distance from the human face.
  • One embodiment provides a method wherein the defined distance is an arm's length approximation distance.
  • One embodiment provides a method including performing a tuning function based on a combination of: the assumed distance; and statistical averages of facial feature layouts.
  • One embodiment provides a method wherein the error data includes pose offset and/or distance offset data based on a predefined start position.
  • One embodiment provides a method wherein the tracking and control algorithm additionally includes: (ii) based on the error data, defining a camera control instruction thereby to apply a predefined degree of movement to a camera positioning module, thereby to reduce an error defined by the error data.
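The pixel-to-distance correlation and the arm's-length tuning function described in the preceding embodiments can be sketched with a simple pinhole-camera model. This is an editorial illustration, not part of the patent disclosure: the constants (average interpupillary distance, assumed launch distance) and the function names are assumptions.

```python
# Sketch: calibrate a focal-length constant from an assumed arm's-length
# start distance, then use a statistical average of facial feature layout
# (interpupillary distance) to turn a measured pixel span into a distance
# estimate. All constants are illustrative assumptions.

AVG_IPD_M = 0.063          # assumed average interpupillary distance (metres)
ARMS_LENGTH_M = 0.6        # assumed start distance at hand launch (metres)

def calibrate_focal_px(ipd_px_at_launch: float) -> float:
    """Derive an effective focal length in pixels from the arm's-length assumption."""
    # pinhole model: ipd_px = f_px * IPD / Z  =>  f_px = ipd_px * Z / IPD
    return ipd_px_at_launch * ARMS_LENGTH_M / AVG_IPD_M

def estimate_distance_m(ipd_px: float, focal_px: float) -> float:
    """Estimate camera-to-face distance from the measured eye spacing in pixels."""
    return focal_px * AVG_IPD_M / ipd_px

f_px = calibrate_focal_px(ipd_px_at_launch=120.0)
print(round(estimate_distance_m(60.0, f_px), 2))  # eyes half as wide → 1.2 m
```

Retuning for a different camera module then amounts to recomputing `focal_px`, which is one way to read the claim that the algorithm is "tuned to operate with different camera modules".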
  • One embodiment provides a computer implemented method for enabling gesture-driven control over a drone device, the method including:
  • operating a facial detection and identification algorithm, wherein the facial detection and identification algorithm is configured to: (i) identify the presence of a human face; and (ii) determine whether the human face corresponds to a prescribed known human face; and
  • One embodiment provides a method for enabling gesture-driven control over a drone device, the method including:
  • receiving data representative of images captured by a camera device of the aerial drone device;
  • One embodiment provides a method for enabling user control over a drone device, the method including:
  • One embodiment provides a method wherein the motors are each connected to a respective one of a pair of coaxially mounted drive shafts.
  • One embodiment provides a method including displaying, via a display screen of the mobile device, a live video feed derived from an image capture device mounted to the drone device.
  • One embodiment provides a method wherein left and right based movements of the handheld device are mirrored in terms of their resulting control instructions.
  • One embodiment provides a method including implementing two modes of operation:
  • One embodiment provides a method wherein, in the second mode of operation, the method includes displaying, via a display screen of the mobile device, a live video feed derived from an image capture device mounted to the drone device.
  • One embodiment provides a computer program product for performing a method as described herein.
  • One embodiment provides a non-transitory carrier medium for carrying computer executable code that, when executed on a processor, causes the processor to perform a method as described herein.
  • One embodiment provides a system configured for performing a method as described herein.
  • Any one of the terms "comprising", "comprised of" or "which comprises" is an open term that means including at least the elements/features that follow, but not excluding others.
  • The term "comprising", when used in the claims, should not be interpreted as being limitative to the means or elements or steps listed thereafter.
  • The scope of the expression "a device comprising A and B" should not be limited to devices consisting only of elements A and B.
  • Any one of the terms including or which includes or that includes as used herein is also an open term that also means including at least the elements/features that follow the term, but not excluding others. Thus, including is synonymous with and means comprising.
  • The term "exemplary" is used in the sense of providing examples, as opposed to indicating quality. That is, an "exemplary embodiment" is an embodiment provided as an example, as opposed to necessarily being an embodiment of exemplary quality.
  • FIG. 1A to FIG. 1N illustrate aerial drones and rotor configurations according to embodiments.
  • FIG. 2A, FIG. 2B and FIG. 2C schematically illustrate modular hardware configurations according to embodiments.
  • FIG. 3 illustrates a communications arrangement according to an embodiment.
  • FIG. 4 illustrates component connections according to one embodiment.
  • FIG. 5A illustrates a method according to one embodiment.
  • FIG. 5B illustrates a method according to one embodiment.
  • FIG. 6 illustrates an exemplary motor controller board.

Detailed Description
  • the present disclosure relates to drone technology, and in particular to aerial drone technology.
  • Embodiments are described by reference to a "selfie drone", being an aerial drone device that is configured to take photos (and/or video) of a user (for example by positioning itself in a defined location relative to the user).
  • FIG. 1A to FIG. 1N illustrate aerial drone devices according to embodiments, referred to collectively as "drone 100".
  • Various aspects of drone technology are described by reference to the example of drone 100, and/or variations thereof. It will be appreciated that various technological aspects described herein, including the likes of tracking/control functionalities, physical configurations, and the like, are applicable across a wide range of aerial and other drone devices, and as such are not in any way limited to the specific example of drone 100.
  • Drone 100 includes a body 101, which contains a plurality of modular components.
  • the components include:
  • Assembly 110 includes a pair of motors, configured to drive a respective pair of rotor drive shafts, which are connected to rotor blade assemblies 160 and 170. These rotor blade assemblies are mounted in a vertically-spaced configuration on coaxially mounted drive shafts. The rotor blade assemblies are oppositely configured, thereby to enable control over drone aerial positioning by way of coordinated driving of the two motors (preferably in combination with control over blade rotation axis, for example via one or more servomotors).
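The coordinated driving of the two oppositely configured rotors can be illustrated with a simple mixing rule: total thrust follows the sum of the motor commands, while yaw follows their difference, because the counter-rotating rotors' reaction torques oppose one another. This sketch is an editorial illustration, not the patent's control law; the function name and scaling are assumptions.

```python
# Illustrative coaxial mixer: equal commands hover (torques cancel);
# a yaw demand raises one motor and lowers the other, leaving a net
# reaction torque that turns the body.

def mix_coaxial(throttle: float, yaw: float) -> tuple[float, float]:
    """Map a throttle [0, 1] and yaw demand [-1, 1] onto upper/lower
    motor commands, clamped to [0, 1]."""
    upper = min(max(throttle + 0.5 * yaw, 0.0), 1.0)
    lower = min(max(throttle - 0.5 * yaw, 0.0), 1.0)
    return upper, lower

print(mix_coaxial(0.5, 0.0))   # hover: equal commands
print(mix_coaxial(0.6, 0.4))   # yaw: commands diverge, total thrust preserved
```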
  • a camera module 120 includes a camera (i.e. a digital image capture device) mounted to a positioning assembly that is optionally variable via one or more powered servomotors.
  • the servomotors enable control over rotation about a horizontal axis, thereby to enable up-down positioning of camera point of view via such rotation.
  • additional degrees of freedom may be incorporated into the camera module, thereby to enhance PTZ capabilities.
  • this additionally provides stabilization to a video feed.
  • the camera module provides a dedicated stabilization system.
  • the camera module is mounted to body 101 via dampers 121, which are configured to reduce the impact of drone vibrations (and other movements) on the camera module, to enhance the quality of captured images and/or video.
  • One or more circuit boards 130 provide processing capabilities to drone 100, in particular enabling algorithms which process input from the camera module (and optionally other input sources) thereby to define output used in the context of defining control instructions to drive the motors (although in some embodiments processing is shared with a tethered device, such as a smartphone). For example, as described in more detail further below, in some embodiments this enables drone control based on facial recognition and facial tracking.
  • the circuit board(s) also provide additional functionalities, including one or more IMUs (thereby to derive input regarding drone acceleration and orientation) and a WiFi module (thereby to enable interaction with one or more control devices, such as smartphones executing control software applications).
  • the circuit boards additionally include one or more of: GPS for positioning outdoors; a barometer for altitude calculations; and an optical flow sensor with an ultrasonic sensor for positioning indoors.
  • a battery module 140 is configured to enable external access to a rechargeable battery component, such that the battery component is able to be removed and replaced in a quick and convenient manner.
  • Body 101 provides a housing, which is able to be opened thereby to provide access to internal components. This, combined with the modular configuration (which is discussed in more detail further below), enables convenient removal and replacement of various components. For example, this may include replacement for maintenance issues and/or modular upgrades.
  • Body 101 also provides one or more user input devices, in the present embodiment including a primary button 180.
  • Button 180 is in some embodiments configured to enable multiple forms of user input, for example “on”, “off” and “fly”.
  • the "on” and “fly” commands are synonymous, in that upon activating drone 100 it adopts a default autonomous fly mode.
  • the default autonomous mode may be a stable hover mode, in which the drone is configured to maintain a position substantially the same as a position in which it is released (preferably by a manual hand release).
  • the default mode includes implementation of an image-processing based tracking mode, for example a "follow-me” mode.
  • a "follow-me” mode based on facial recognition is described in more detail further below.
  • drone 100 is configured to initially identify a human face (preferably a face belonging to a human user who releases drone 100 from an extended arm with the camera pointing in their direction), to implement a tracking and control algorithm which tracks the position and/or pose of the human face (in some cases with defined restraints), and to control the drone thereby to maintain a defined orientation and/or position with respect to that human face.
  • the default mode is controlled by a user, for example by settings defined in a mobile application.
  • a mobile device connection is not necessary for various functionalities of drone 100, including facial tracking and image capture.
  • drone 100 need not be tethered to any form of electronic device for the purpose of tracking; the tracking is based on image processing as opposed to identification of relative position of an electronic tethered device.
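The image-processing-based tracking described above can be sketched as a proportional control loop: pixel offsets of the detected face from the frame centre become error terms, which are mapped to velocity setpoints for the flight controller. This is an editorial illustration; the gains, frame size and target box width are assumptions, not values from the disclosure.

```python
# Illustrative face-tracking step: a detector yields a bounding box; the
# offsets from frame centre and the box-size error become proportional
# velocity setpoints (yaw to centre horizontally, climb to centre
# vertically, forward/back to hold apparent size, i.e. distance).

FRAME_W, FRAME_H = 640, 480   # assumed camera resolution
TARGET_BOX_W = 120            # assumed desired face width in pixels
K_YAW, K_CLIMB, K_FWD = 0.004, 0.004, 0.01   # assumed gains

def track_step(box: tuple[int, int, int, int]) -> dict[str, float]:
    """box = (x, y, w, h) of the detected face; returns velocity setpoints."""
    x, y, w, h = box
    cx, cy = x + w / 2, y + h / 2
    return {
        "yaw_rate": K_YAW * (cx - FRAME_W / 2),    # + when face is right of centre
        "climb":    K_CLIMB * (FRAME_H / 2 - cy),  # + when face is above centre
        "forward":  K_FWD * (TARGET_BOX_W - w),    # + when face appears too small
    }

print(track_step((300, 200, 100, 100)))
```

Because the loop consumes only camera frames, it needs no tethered device, which matches the point made above about tracking being based on image processing rather than relative position of a tether.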
  • drone 100 is able to fly due to a rotor assembly including a pair of rotor blade assemblies mounted to co-axial drive shafts.
  • the configuration of this rotor assembly has been primarily developed to provide a compact drone, which is well-suited to consumer-level applications (for instance in terms of safety).
  • the illustrated rotor assembly includes a first blade assembly, being an upper blade assembly 160, and a second blade assembly, being a lower blade assembly 170.
  • Components defining blade assembly 160 and 170 are generally similar, particularly in terms of a dual-axis hinge arrangement (described below). However, it will be appreciated that rotor blades are mirrored between the assemblies, such that the assemblies are configured to rotate in opposite directions for the purposes of flight.
  • the rotor assemblies are provided on drive shafts that are coaxially mounted with respect to one another, thereby to allow individual rotational driving of the shafts and their respective blade assemblies via respective driving mechanisms.
  • Blade assembly 160 includes a central blade mounting member 161 securely mounted to a first axial shaft 162.
  • Shaft 162 is mounted to a drive motor provided by motor and drive control assembly 110.
  • a plurality of blade supporting members 163 are mounted at spaced apart peripheral locations to the central blade mounting member (in the illustrated example there are two diametrically opposed blade supporting members separated by 180 degrees; in other embodiments there may be three or more).
  • Each blade supporting member is mounted to the central blade mounting member via a respective primary hinge connection, thereby to allow limited rotation of the blade supporting member to the central blade mounting member about a horizontal axis defined by the primary hinge connection.
  • the blade supporting members are configured to lock into place in the operational position (for example via an over-centring arrangement, snap lock formations, or the like). It will be appreciated that the upward and outward rotation occurs automatically in response to forces caused by rotation of the drive shaft.
  • the drone is configured (or intended) to be used in a manner whereby the upward and outward rotation into a locked operation position is performed by a user prior to motor activation.
  • Each blade supporting member is mounted to a respective rotor blade 165 via a respective secondary hinge connection.
  • the secondary hinge connection allows limited rotation of the rotor blade relative to the blade supporting member about a vertical axis defined by the secondary hinge connection. It will be appreciated that, in response to forces caused by rotation of the drive shaft during flight, the blades remain diametrically opposed at the centre of the hinge rotation. However, upon a collision, the rotation serves to reduce the moment of impact. This is particularly relevant in the context of providing a degree of safety to a consumer-level proximity drone as described herein. Moreover, the use of a dual horizontal and vertical hinge arrangement as described and illustrated here allows for a drone that is both compact and low-hazard.
  • blade assembly 170 is generally similar to blade assembly 160 in terms of a dual horizontal and vertical hinge arrangement.
  • blade assembly 170 includes additional rotational joints, thereby to enable adjustment of the alignment of the axis of blade rotation (for example about one or two axes). This is in some embodiments controlled by a pair of servomotors, mounted within the central blade mounting member.
  • the blade mounting member 171 is mounted to the drive shaft by a pair of axles (which mount to a connection block 171a), upon which the servomotors act (as best shown in FIG. 1H to FIG. 1N). This allows limited controllable rotation of the central blade mounting member about two orthogonal horizontal axes, allowing advanced directional control and/or stabilization of the aerial drone. It will be observed that some diagrams, including FIG. 1F, show an alternate arrangement for members 163.
  • FIG. 2A schematically illustrates a modular configuration according to one embodiment.
  • this modular configuration is implemented in respect of drone 100.
  • the drone includes three discrete modules (modules 200, 210 and 230), which are able to be removed and replaced at a consumer level.
  • the drone body provides a communications infrastructure that allows each module to be in essence un-plugged and plugged-in without a need to perform electrical wiring or the like. This preferably enables replacement of modules by end-users, or by technical personnel with minimal effort. Such replacement of modules allows for convenient repair of devices, and in some cases upgrading of devices (for example to improve control/tracking capabilities, camera quality, and so on).
  • Module 200 includes a GPS module 201, drive modules 202, and a motor controller 203.
  • GPS module 201 is configured to calculate GPS position data, which is transmitted to a flight controller module thereby to enable waypoint navigation and stability.
  • Drive modules 202 include components that are configured to convert electrical energy into mechanical energy, thereby to cause the drone to fly. This includes all of the shafts, gears, motors, etc.
  • a motor controller 203 includes a circuit board which provides current and voltage monitoring and drives the main rotor motors. In a preferred embodiment, this board includes two separate MOSFET devices, which are each configured to control a respective motor, thereby to effect control over a dual rotor assembly drone such as drone 100.
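The current and voltage monitoring role of the motor controller board can be illustrated with a simple per-motor duty-cycle clamp: each MOSFET is driven with a PWM duty, and measured overcurrent scales the duty back. This sketch is an editorial illustration; the current limit and the proportional back-off rule are assumptions, not part of the disclosure.

```python
# Illustrative ESC duty update with current limiting. One instance of this
# logic per MOSFET/motor channel.

I_MAX_A = 6.0   # assumed per-motor current limit (amps)

def next_duty(requested: float, measured_current: float) -> float:
    """Clamp a requested PWM duty [0, 1]; scale it back on overcurrent."""
    duty = min(max(requested, 0.0), 1.0)
    if measured_current > I_MAX_A:
        # back off in proportion to the overcurrent
        duty *= I_MAX_A / measured_current
    return duty

print(next_duty(0.9, 3.0))   # within limits: duty passes through
print(next_duty(0.9, 12.0))  # 2x overcurrent: duty scaled back by half
```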
  • Module 210 includes a battery module 211, a main controller 212, a flight controller 213, and landing gear 214.
  • Battery module 211 provides power to fly the drone, and preferably is configured to retain a removable battery unit (that is, a battery unit is replaceable without replacing the entire battery module, which includes battery connection contacts and the like).
  • Main controller 212 acts as a high level control system managing interactions between a user and the flight controller. In this regard, it includes a WiFi module which provides a WiFi hot spot for connectivity to a user (i.e. a user connects to that hotspot thereby to control and otherwise interact with the drone). In further embodiments alternate communications protocols are used.
  • Main controller 212 additionally provides vision system processing configured for detecting and tracking targets (as discussed in additional detail further below).
  • FIG. 2B illustrates a variation of module 210, wherein battery module 211 and main controller 212 are vertically elongate and positioned horizontally adjacent one another, rather than being stacked vertically as shown in FIG. 2A.
  • Flight controller 213 includes a circuit board containing an IMU. In this regard, it is configured to manage stable flight of the drone using on board sensor data from the IMU, optionally in combination with data derived from GPS module 201 , and/or distance and motions sensors based on vision system processing provided by main controller 212. In the present embodiment, flight controller 213 is additionally configured to navigate waypoints based on GPS data.
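One common way for an IMU-based flight controller to maintain a stable attitude estimate is a complementary filter that blends high-rate gyro integration with the accelerometer's gravity reference. The patent does not specify the fusion method; this sketch is an illustrative assumption, with an arbitrary blend constant.

```python
# Illustrative complementary filter for one axis (pitch, radians):
# trust the gyro at high frequency, correct drift with the
# accelerometer's long-term gravity direction.

import math

ALPHA = 0.98   # assumed blend constant

def update_pitch(pitch: float, gyro_rate: float,
                 ax: float, az: float, dt: float) -> float:
    """Fuse one gyro/accelerometer sample into the pitch estimate."""
    accel_pitch = math.atan2(-ax, az)   # gravity-derived pitch reference
    return ALPHA * (pitch + gyro_rate * dt) + (1 - ALPHA) * accel_pitch

pitch = 0.0
for _ in range(100):   # level, stationary IMU: estimate stays at zero
    pitch = update_pitch(pitch, gyro_rate=0.0, ax=0.0, az=9.81, dt=0.01)
print(round(pitch, 3))
```

GPS, barometer and optical-flow inputs mentioned above would typically feed a similar (if higher-dimensional) fusion stage for position rather than attitude.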
  • Landing gear 214 provides a mechanical system that allows the drone to land on a flat surface, and yet still pack compactly. It is preferably electrically released. In some embodiments the landing gear is omitted, in favour of a limited hand-launched and hand-landed configuration.
  • a camera module 231 includes a digital camera mounted on a tilt-angling servomotor. Various other forms of controllable camera mounts may also be used. Camera module 231 preferably allows replacement of camera components, including the likes of lenses, sensors, and the like.
  • FIG. 2C illustrates inter-module connections according to one embodiment:
  • Connector 251 is a low current connector, which provides GPS serial and power cables via 5 pin connectors at each of GPS module 201 and flight controller 213.
  • Connector 252 is a high current connector between drive module 202 and motor controller 203 (which in this embodiment includes two servo motors and two drive motors). At the drive module end it provides 10 pins, which terminate at two-pin connections at each drive motor, and three-pin connections at each servo motor.
  • Connector 253 is a high current connector between the motor controller and the battery module, thereby to provide power to the motor controller (via a 2-pin connector arrangement).
  • Connector 254 is a low current connector between main controller 212 and flight controller 213. This provides serial connection between main controller and flight controller for data transfer, in this embodiment via a two-pin connection.
  • Connector 255 is a low current connector, providing high speed hardware interfacing (e.g. a CSI port) using a ribbon cable.
  • a 16-pin connection is used, for example a 15 mm by 1 mm ribbon cable.
  • Connector 256 is a low current connector (for example 5V, 2A), from the motor controller to the main controller, configured for providing power to the main controller (via a 2-pin connector arrangement).
  • Connector 257 is a low current connector between the main controller and the flight controller, enabling motor driver serial communications thereby to facilitate drone control. A 6-pin connector arrangement is used in this embodiment.
  • Connector 258 is a low current connector configured to provide WiFi capabilities.
  • Connector 259 is a low current connector between the flight controller and the camera module, configured to provide power to camera module and transmit drive signals to the tilt servo drive provided by the camera module.
  • a 3-pin arrangement is used in this embodiment.
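The connector scheme above lends itself to a simple sanity check of pin counts. The sketch below is purely illustrative (the dictionary names and structure are not from the specification, and the battery connector, which appears under a duplicated label in the text, is numbered 253 here for uniqueness):

```python
# Illustrative summary of the inter-module connectors described above.
# Pin counts are taken from the text; the structure is a convenience only.
CONNECTORS = {
    251: {"type": "low current",  "role": "GPS serial + power",            "pins": 5},
    252: {"type": "high current", "role": "drive module <-> motor controller",
          # two 2-pin drive-motor terminations plus two 3-pin servo terminations
          "pins": 2 * 2 + 2 * 3},
    253: {"type": "high current", "role": "motor controller <-> battery",  "pins": 2},
    254: {"type": "low current",  "role": "main <-> flight controller serial", "pins": 2},
    255: {"type": "low current",  "role": "CSI ribbon cable",              "pins": 16},
    256: {"type": "low current",  "role": "power to main controller",      "pins": 2},
    257: {"type": "low current",  "role": "motor driver serial",           "pins": 6},
    259: {"type": "low current",  "role": "camera power + tilt servo",     "pins": 3},
}

def total_pins(kind):
    """Sum pin counts across all connectors of the given current rating."""
    return sum(c["pins"] for c in CONNECTORS.values() if c["type"] == kind)
```

For example, the drive-module connector's 10 pins are accounted for exactly by its two 2-pin and two 3-pin terminations.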
  • a key design principle of the above modular design is to incorporate tracking and control algorithms in an on-board manner, thereby to simplify the overall hardware make-up and enable reductions in device costs.
  • FIG. 6 schematically illustrates a motor controller board according to one embodiment. This shows a microcontroller coupled to a single MOSFET gate driver, which is responsible for controlling two motors.
  • components are selected as follows:
  • MOSFET: a low Rds(on) enables low conduction losses, negating the need for a heatsink.
  • the chosen Rds(on) is a balance between cost and heat generation under load.
  • the MOSFET should be sized so that cost is minimised while keeping board temperature under load below 85 °C.
  • Freewheeling diode: a diode with a low forward voltage drop for high efficiency and an average current capacity of approximately 5 A.
  • 5 V regulator: a switching regulator with up to 4 A of current output.
  • high side current sensing is applied to minimise EMI issues compared to low side sensing.
  • Shunt resistor with a power rating of 3W to handle the current.
  • a high side current shunt amplifier is provided for compatibility with the ADC of the microcontroller.
  • a resistor divider is provided, with its maximum output set with an extra margin below the maximum ADC input voltage to allow for voltage spikes.
  • a rail-to-rail voltage divider with a TVS diode is applied to handle voltage spikes.
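The MOSFET sizing rule above (minimise cost while keeping board temperature under load below 85 °C) can be sketched as a conduction-loss calculation. The 25 °C ambient and the 40 °C/W thermal resistance below are illustrative assumptions, not values from the specification; in practice they come from the chosen part's datasheet:

```python
def conduction_loss_w(load_current_a: float, rds_on_ohm: float) -> float:
    """Conduction loss of a fully-enhanced MOSFET: P = I^2 * Rds(on)."""
    return load_current_a ** 2 * rds_on_ohm

def board_temp_c(load_current_a: float, rds_on_ohm: float,
                 ambient_c: float = 25.0, theta_ja_c_per_w: float = 40.0) -> float:
    """Estimated steady-state temperature: T = T_ambient + P * theta_JA.
    theta_JA is an assumed illustrative thermal resistance."""
    return ambient_c + conduction_loss_w(load_current_a, rds_on_ohm) * theta_ja_c_per_w

def meets_85c_limit(load_current_a: float, rds_on_ohm: float, **kw) -> bool:
    """Check against the below-85-degC board-temperature target stated above."""
    return board_temp_c(load_current_a, rds_on_ohm, **kw) < 85.0
```

Under these assumptions, a 10 mΩ device carrying 5 A dissipates 0.25 W and stays well inside the limit, while a cheaper 100 mΩ device at the same current would not.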
  • FIG. 3 schematically illustrates an example communications arrangement according to an embodiment. This is in some embodiments implemented using the modular configuration of FIG. 2A/2B/2C (although components are labelled via a different schema here).
  • a drone device 300 (which may be drone device 100 of FIG. 1 A) includes a WiFi module 301 (for example on a main controller board).
  • WiFi module 301 enables drone 300 to provide a WiFi hotspot, to which a third party WiFi enabled device can connect.
  • this takes the form of an example client device 310, which may take the form of (for instance) a smartphone, tablet, smartwatch, and so on.
  • the client device executes a software application that is configured to interact with drone 300, to provide functionalities including (but not necessarily limited to) providing control instructions to drone 300 and viewing images captured by a camera module of drone 300.
  • a browser-based interface is used as an alternative to a proprietary application (such as an android or iOS app).
  • a drone control user interface is rendered on a display screen of device 310 (for example a touchscreen display).
  • Connector 321 represents wireless WiFi communications between drone 300 and device 310.
  • a high-level vision/control system 302 provides a central processing hub to interface user instructions (from device 310), input/output from/to a camera module 304 (such as image data which is used for image-based object identification and tracking, and instructions to a camera control servomotor), and output to a flight controller module 303.
  • System 302 preferably also receives additional inputs, for example GPS and/or IMU data (in some cases an IMU is provided on a circuit board that provides system 302).
  • communications are achieved as follows:
  • This provides a streamlined and efficient mechanism for defining flight controls (which are implemented via motors coupled to the rotor blade assemblies) based on inputs including user inputs from device 310 and sensed inputs derived from the camera module and other modules. For example, hardware-interfaced connections (Wifi and Camera) are used for high speed data transfer to minimise latency of the system.
  • the flight controller and a GPS unit also communicate over I2C.
  • System 302 also communicates with an external micro SD storage reader over UART for on-board storage of videos and photos.
  • FIG. 4 illustrates an example component connection arrangement, providing an alternate (and more detailed) view as to the configuration of hardware components discussed in preceding sections.
  • selfie generation is a core functionality.
  • One embodiment provides a computer implemented method for controlling an aerial drone system thereby to autonomously follow a human face.
  • the method includes operating the aerial drone system to capture image data via an RGB camera.
  • the use of a single RGB camera (as opposed to multiple cameras, depth-field cameras, and the like) is relevant in the context of increasing device simplicity and reducing costs.
  • the image data is fed into a main controller module, which is a circuit board with on-board image processing capabilities.
  • the image data preferably includes sequential image frames captured at a frame rate, which may be autonomously or manually defined.
  • the method then includes processing the image data thereby to identify a target region predicted to contain a human face. This processing includes:
  • the image processing that takes place is in some embodiments based on an open source library (for example derived from OpenCV).
  • the present vision system utilises an adaptive approach to face detection.
  • the adaptive method allows for reference images of the target to be taken and stored during initialisation, thereby to create a more confident vision system tracking algorithm.
  • this system integrates with reference images developed for each target, enabling more reliable tracking of the actual face being considered.
  • the process starts up in a primary "follow-me" mode, in which reference images of the target are gathered to optimise the vision system.
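The adaptive reference-image approach can be sketched as follows. Feature extraction (e.g. via an OpenCV-derived library) is out of scope here; the cosine-similarity matching and the acceptance threshold are illustrative assumptions:

```python
import math

class AdaptiveFaceMatcher:
    """Minimal sketch of the adaptive approach described above: reference
    feature vectors captured during "follow-me" initialisation are stored,
    and later detections are scored against all of them, increasing
    confidence that the actual target face is being tracked."""

    def __init__(self, accept_threshold=0.9):
        self.references = []              # stored target reference vectors
        self.accept_threshold = accept_threshold

    def add_reference(self, features):
        """Store a reference feature vector captured during initialisation."""
        self.references.append(list(features))

    @staticmethod
    def _cosine(a, b):
        dot = sum(x * y for x, y in zip(a, b))
        na = math.sqrt(sum(x * x for x in a))
        nb = math.sqrt(sum(x * x for x in b))
        return dot / (na * nb) if na and nb else 0.0

    def is_target(self, features):
        """Accept a detection if it is close to any stored reference."""
        return any(self._cosine(features, r) >= self.accept_threshold
                   for r in self.references)
```

A detection whose feature vector closely matches a stored reference is accepted as the target; unrelated faces fall below the threshold and are ignored.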
  • a defined start pose and distance is used. For example, the start pose may be defined as a frontal view by default, and/or as a user-selected pose (selected by facial positioning), such as a partial profile view or the like.
  • the distance is in some cases a known approximate distance, for example an "arm's length” approximation (such as 0.5m) based upon a distance at which the drone is released from a user's hand and begins to fly.
  • a first tuning function is configured to correlate each pixel in a captured image with the real-world distance it covers. This enables each camera module used with the drone to be tuned for calculating the error data used to define control signals (discussed below).
  • a second tuning function is incorporated during the primary "follow me” mode, whereby the drone is held at arm's length, and the detected face is assumed to be 0.5m away. This is then compared against statistical averages of facial feature layouts (i.e. how far apart eyes are, and how far eyes are from the mouth) for improved facial recognition tuning.
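The two tuning functions above amount to a pinhole-style calibration: with the face assumed to be at arm's length (0.5 m), the product of a facial feature's pixel span and its distance is roughly constant, so later pixel spans can be converted back into distances. Function names here are illustrative:

```python
ARM_LENGTH_M = 0.5  # assumed drone-to-face distance at hand release (see above)

def calibrate_scale(eye_span_px: float, distance_m: float = ARM_LENGTH_M) -> float:
    """During "follow me" start-up the detected face is assumed to be at
    arm's length; (pixel span x distance) is then treated as a constant
    tying pixel measurements to real-world distances."""
    return eye_span_px * distance_m

def estimate_distance_m(eye_span_px: float, scale: float) -> float:
    """Invert the calibration: distance falls as the face's pixel span grows."""
    return scale / eye_span_px
```

For example, if the eyes span 120 pixels at calibration, a later measurement of 60 pixels implies the face has moved to roughly 1 m away.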
  • the method then includes continued processing of image data thereby to enable tracking of the human face, and the defining of control instructions to cause movement of the drone and/or camera module (i.e. camera control servomotor) thereby to retain the tracked human face in a defined portion and/or pose relative to captured image data.
  • the control system utilises a vision system that is able to detect a target that is assumed to be a human face. Once the target is detected and confirmed, the vision system continues to run several image processing functions to maintain lock onto the target face. As the face moves, the vision system provides error data to the control system that then responds with an output signal to the drive system to follow the target. This in some embodiments includes operating a tracking and control algorithm that is configured to:
  • define error data, being an error based on pose and/or distance; and
  • based on the error data, define a control instruction based on a predefined control protocol, wherein the control instruction is configured to apply a predefined degree of three-dimensional movement to the aerial drone thereby to reduce the error defined by the error data;
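The error-to-instruction step can be sketched as a simple proportional controller. The gain values, axis names, and clamping to a bounded command range are illustrative assumptions, not from the specification:

```python
def control_from_error(dx_px, dy_px, distance_err_m,
                       k_lateral=0.002, k_vertical=0.002, k_range=0.8):
    """Sketch of the tracking-and-control step described above: the vision
    system's error data (face offset from frame centre in pixels, with
    dy_px positive downward, and a distance error in metres) is mapped to
    a bounded three-dimensional movement command."""
    clamp = lambda v: max(-1.0, min(1.0, v))
    return {
        "roll":     clamp(k_lateral * dx_px),         # slide sideways toward face
        "throttle": clamp(-k_vertical * dy_px),       # climb if face is above centre
        "pitch":    clamp(k_range * distance_err_m),  # close/open the range gap
    }
```

Zero error yields a hover (all commands zero), while a large pixel offset saturates at the clamped maximum rather than commanding an unbounded movement.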
  • the approaches described above enable an aerial drone device to identify a human face, and track that human face as it moves, thereby to enable hands-free selfie photography.
  • a person may wish to take a "selfie" with their head tilted about a horizontal axis a little to the left or right; restricting control based on tracking is used to prevent the drone from continuously moving to capture a front-on view of the face in such a situation.
  • the drone control method includes configuring the tracking and control algorithm to:
  • the first and second sets of facial movement conditions are defined to enable a tracked face to adopt a facial pose without affecting aerial drone position.
  • the first and second sets of facial movement conditions include one or more of the following movement conditions: horizontal movement conditions; vertical axis facial rotation conditions; face-normal horizontal axis facial rotation conditions; and face parallel horizontal axis facial rotation conditions (and combinations thereof).
  • the first and second sets of facial movement conditions are temporally variable, such that at least one facial movement condition is transitioned from the first set to the second set (or vice versa) during a defined time period.
  • This is optionally implemented such that the drone initially adopts a "follow me" mode whereby a face is tracked in a centred neutral pose, and then transitions into a "pose for photo" mode whereby one or more facial movement conditions (such as looking upwards, downwards, or sideways) no longer result in flight control instructions.
  • the user positions the drone to take a photo based on a neutral pose, then instructs the drone to adopt the "pose for photo” mode and is free to move his/her face angle without causing drone movement.
  • threshold degrees of head movement remain in the first set of facial movement conditions, in effect allowing differentiation between facial tracking and general head position tracking. An example of this is illustrated in the method of FIG. 5A.
  • a user holds a drone at about arm's length from their body, initiates flight (for example by pressing a button) and releases the drone to fly at 501 .
  • the drone then executes a facial detection process, thereby to detect the user's face at 502. Facial tracking is then configured, in some cases including the capture of reference images of the identified face thereby to enhance tracking algorithms at 503.
  • Tracking-based drone control (e.g. flight and camera position) then commences at 504. This commences based on a first set of constraints (optionally a null set of constraints).
  • the control is such that the drone identifies errors in pose and distance offset, and is controlled thereby to keep the tracked face centred in the image frame in a front-on neutral pose at a distance of about 0.5m.
  • the user provides an instruction to implement a second set of tracking constraints. For example, this may cause the drone to stay in position, or to only track certain forms of movement (such as face height, but not facial pose direction). Tracking then continues at 506 based on that second set of tracking constraints, with images captured at 507.
  • the shift between tracking protocols at 505 is initiated by hand gestures, which may be additionally used to control other aspects of drone position, such as distance, height, and lateral position.
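The shift between the first and second sets of facial movement conditions can be sketched as a small state machine. The condition names follow those listed earlier; which conditions move to the ignored set in "pose for photo" mode is an illustrative choice:

```python
# Movement conditions named in the text; which set each lives in
# determines whether it generates control output.
ALL_CONDITIONS = {"horizontal_move", "vertical_axis_rotation",
                  "face_normal_rotation", "face_parallel_rotation"}

class TrackingModes:
    """Sketch of the FIG. 5A mode switch: in "follow-me" mode every
    condition drives control; in "pose-for-photo" mode facial rotations
    are moved to the ignored (second) set so the user can change facial
    pose without causing drone movement."""

    def __init__(self):
        self.active = set(ALL_CONDITIONS)   # first set: affects control
        self.ignored = set()                # second set: tracked but ignored

    def pose_for_photo(self):
        """Transition rotation conditions from the first set to the second."""
        for cond in ("vertical_axis_rotation", "face_normal_rotation",
                     "face_parallel_rotation"):
            self.active.discard(cond)
            self.ignored.add(cond)

    def affects_control(self, condition):
        return condition in self.active
```

After the transition, head rotations no longer generate control output, while gross horizontal movement (general head position) is still tracked.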
  • such commands are provided by way of a tethered IMU-enabled device.
  • a user initially holds a drone at arm's length at 511.
  • the drone then commences facial detection at 512, and a face is detected at 513.
  • the user holds the drone in one hand as if taking a selfie, and holds a smartphone in the other hand.
  • an event occurs thereby to inform the user of successful facial detection.
  • this may include: • A change in the status of the drone, such as a changing light colour or the like, and/or rotor assemblies accelerating to flight speed.
  • the smartphone beeps and/or vibrates to inform the user that facial detection has been completed, and the user then looks at the smartphone screen to confirm that the correct face has indeed been detected. The user then provides an instruction, which allows initiation of flight based on tracking of that face (see 513).
  • the drone accelerates rotors to a flight (hover) speed, and is released by the user (preferably a visual or audible signal informs a user that they can release the drone). From there, this example follows blocks 504-507 from FIG. 5A.
  • An additional/alternate functionality to restrict facial tracking based control in certain situations is to pre-program known photo poses.
  • the drone is programmed with popular photo pose positions (for a general user case and/or a specific user), and recognises transition into those poses as being separate from control-effecting facial movements. In some embodiments this is used to cause the drone to track a user based on a pose position other than a front-on neutral pose, for example where a user wishes to be photographed from a predefined offset relative to the front-on position.
  • processing in the context of facial recognition and tracking is shared between the drone and a connected device (such as a smartphone).
  • software to enable facial recognition and tracking is installed on a smartphone (along with a library of known faces and facial features), and processing at the drone device is focussed upon procuring and delivering video data (i.e. image frames) to the smartphone, with the more complex facial detection and tracking being performed at the smartphone.
  • the smartphone then defines and provides signals which are configured to enable flight control functionalities based on the facial recognition and tracking.
  • the drone remains self-sufficient in terms of other stabilization control and position control functionalities.
  • gesture-driven control whereby processing algorithms are configured to identify predefined human movements (such as hand movements, arm movements, and the like), and interpret those as control commands (for example “move up”, “move down”, “come closer”, “move back”, “take photos”, and so on).
  • this is implemented in combination with facial detection, such that only human movements belonging to the same person as a tracked face are identified as being gesture-driven controls.
  • a method includes receiving data representative of images captured by a camera device of the aerial drone device. The method then includes operating an image processing algorithm that is configured to detect the presence of one or more defined gesture-driven commands, wherein the gesture-driven commands relate to human body movements.
  • a facial detection and identification algorithm is operated, the facial detection and identification algorithm being configured to: (i) identify the presence of a human face; and (ii) determine whether the human face corresponds to a prescribed known human face.
  • the method then includes performing analysis thereby to determine whether a detected gesture-driven command corresponds to a prescribed known human face. In the case that a detected gesture-driven command corresponds to a prescribed known human face, a control instruction corresponding to the detected gesture-driven command is implemented in respect of the drone device.
  • Such an approach is especially useful in a situation where multiple moving objects (for example different people) are identifiable in image data, and there is a desire to prevent unauthorised/accidental gesture-driven controls from being recognised.
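One way to sketch the association between a detected gesture and the prescribed known face is spatial proximity of their bounding boxes. The box format and the gap threshold below are illustrative assumptions; a production system might instead use body-pose linkage:

```python
def gesture_is_authorised(gesture_box, face_box, max_gap_px=150):
    """Sketch of the authorisation check described above: a detected
    gesture is only accepted as a command if it plausibly belongs to the
    person whose face is the prescribed known (tracked) face. Here
    "belongs to" is approximated as the gesture's bounding box lying
    within max_gap_px of the face's box; boxes are (x0, y0, x1, y1)."""
    gx0, gy0, gx1, gy1 = gesture_box
    fx0, fy0, fx1, fy1 = face_box
    # Horizontal/vertical gaps between the two boxes (0 if they overlap).
    gap_x = max(fx0 - gx1, gx0 - fx1, 0)
    gap_y = max(fy0 - gy1, gy0 - fy1, 0)
    return gap_x <= max_gap_px and gap_y <= max_gap_px
```

A hand gesture next to the authorised face passes; the same gesture made by a bystander elsewhere in the frame is rejected, preventing unauthorised or accidental control.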
  • control commands may also be provided via a WiFi-connected device (such as a smartphone): the connected device's IMU is used to translate the device's relative motion into a control command. For example, by holding the smartphone or tablet in a landscape orientation and gripping it with thumbs free (similar to a steering wheel), a user is enabled to adjust the angle of the device to send a control command.
  • These control commands are, in a preferred embodiment:
  • the connected device is configured to provide a live video feed using image data captured by the drone's camera device.
  • image data is captured and transmitted over WiFi, optionally using compression and/or frame rate reduction for the purpose of experience optimisation.
  • control commands are reversed with respect to those shown above for left and right, thereby to provide intuitive control over a drone device that faces the user (for example when operating as a "selfie drone"). That is, left and right commands are mirrored, such that a tilt to the right instructs the drone to move left.
  • the drone device is configured to shift between regular and mirrored command schemas. That is optionally controlled by any one or more of:
  • a mode of operation, for example a "selfie" mode or a "pilot" mode.
  • an algorithm is configured to translate a degree of IMU-detected motion into a degree of control command responsive to a mode of drone operation. For example, this is in some embodiments implemented as a reduced proportional effect of IMU-initiated commands where a drone is locked onto a tracking target, as opposed to when the drone is being flown freely.
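The steering-wheel style IMU control, including the mirrored "selfie" schema, can be sketched as follows. The deadband, gain, and 45-degree full-scale angle are illustrative assumptions:

```python
def tilt_to_command(roll_deg, pitch_deg, mirrored=False, gain=1.0,
                    deadband_deg=5.0):
    """Sketch of the IMU-driven control described above: the connected
    device's roll/pitch angles become lateral and longitudinal commands.
    In the mirrored ("selfie") schema left/right are reversed, so a tilt
    to the right moves the user-facing drone left."""
    def shape(angle):
        if abs(angle) < deadband_deg:        # ignore small hand tremor
            return 0.0
        return max(-1.0, min(1.0, gain * angle / 45.0))

    lateral = shape(roll_deg)
    if mirrored:
        lateral = -lateral                   # mirrored left/right commands
    return {"lateral": lateral, "longitudinal": shape(pitch_deg)}
```

A proportional-gain reduction while locked onto a tracking target (as described above) would simply pass a smaller `gain` into this mapping.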
  • processor may refer to any device or portion of a device that processes electronic data, e.g., from registers and/or memory to transform that electronic data into other electronic data that, e.g., may be stored in registers and/or memory.
  • a "computer” or a “computing machine” or a “computing platform” may include one or more processors.
  • the methodologies described herein are, in one embodiment, performable by one or more processors that accept computer-readable (also called machine-readable) code containing a set of instructions that when executed by one or more of the processors carry out at least one of the methods described herein.
  • Any processor capable of executing a set of instructions (sequential or otherwise) that specify actions to be taken is included.
  • a typical processing system that includes one or more processors.
  • Each processor may include one or more of a CPU, a graphics processing unit, and a programmable DSP unit.
  • the processing system further may include a memory subsystem including main RAM and/or a static RAM, and/or ROM.
  • a bus subsystem may be included for communicating between the components.
  • the processing system further may be a distributed processing system with processors coupled by a network. If the processing system requires a display, such a display may be included, e.g., a liquid crystal display (LCD) or a cathode ray tube (CRT) display. If manual data entry is required, the processing system also includes an input device such as one or more of an alphanumeric input unit such as a keyboard, a pointing control device such as a mouse, and so forth. Input devices may also include audio/video input devices, and/or devices configured to derive information relating to characteristics/attributes of a human user.
  • the term memory unit as used herein, if clear from the context and unless explicitly stated otherwise, also encompasses a storage system such as a disk drive unit.
  • the processing system in some configurations may include a sound output device, and a network interface device.
  • the memory subsystem thus includes a computer-readable carrier medium that carries computer-readable code (e.g., software) including a set of instructions to cause performing, when executed by one or more processors, one or more of the methods described herein. Note that when the method includes several elements, e.g., several steps, no ordering of such elements is implied, unless specifically stated.
  • the software may reside in the hard disk, or may also reside, completely or at least partially, within the RAM and/or within the processor during execution thereof by the computer system.
  • the memory and the processor also constitute computer- readable carrier medium carrying computer-readable code.
  • a computer-readable carrier medium may form, or be included in a computer program product.
  • the one or more processors operate as a standalone device or may be connected, e.g., networked, to other processor(s). In a networked deployment, the one or more processors may operate in the capacity of a server or a user machine in a server-user network environment, or as a peer machine in a peer-to-peer or distributed network environment.
  • the one or more processors may form a personal computer (PC), a tablet PC, a set-top box (STB), a Personal Digital Assistant (PDA), a cellular telephone, a web appliance, a network router, switch or bridge, or any machine capable of executing a set of instructions (sequential or otherwise) that specify actions to be taken by that machine.
  • each of the methods described herein is in the form of a computer-readable carrier medium carrying a set of instructions, e.g., a computer program that is for execution on one or more processors, e.g., one or more processors that are part of web server arrangement.
  • a computer-readable carrier medium carrying computer readable code including a set of instructions that when executed on one or more processors cause the processor or processors to implement a method.
  • aspects of the present invention may take the form of a method, an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects.
  • the present invention may take the form of carrier medium (e.g., a computer program product on a computer-readable storage medium) carrying computer-readable program code embodied in the medium.
  • the software may further be transmitted or received over a network via a network interface device.
  • the carrier medium is shown in an exemplary embodiment to be a single medium, the term “carrier medium” should be taken to include a single medium or multiple media (e.g., a centralized or distributed database, and/or associated caches and servers) that store the one or more sets of instructions.
  • the term “carrier medium” shall also be taken to include any medium that is capable of storing, encoding or carrying a set of instructions for execution by one or more of the processors and that cause the one or more processors to perform any one or more of the methodologies of the present invention.
  • a carrier medium may take many forms, including but not limited to, non-volatile media, volatile media, and transmission media.
  • Non-volatile media includes, for example, optical disks, magnetic disks, and magneto-optical disks.
  • Volatile media includes dynamic memory, such as main memory.
  • Transmission media includes coaxial cables, copper wire and fiber optics, including the wires that comprise a bus subsystem. Transmission media also may also take the form of acoustic or light waves, such as those generated during radio wave and infrared data communications.
  • carrier medium shall accordingly be taken to include, but not be limited to, solid-state memories, a computer product embodied in optical and magnetic media; a medium bearing a propagated signal detectable by at least one processor of one or more processors and representing a set of instructions that, when executed, implement a method; and a transmission medium in a network bearing a propagated signal detectable by at least one processor of the one or more processors and representing the set of instructions.
  • "Coupled", when used in the claims, should not be interpreted as being limited to direct connections only.
  • the terms “coupled” and “connected,” along with their derivatives, may be used. It should be understood that these terms are not intended as synonyms for each other.
  • the scope of the expression a device A coupled to a device B should not be limited to devices or systems wherein an output of device A is directly connected to an input of device B. It means that there exists a path between an output of A and an input of B which may be a path including other devices or means.
  • Coupled may mean that two or more elements are either in direct physical or electrical contact, or that two or more elements are not in direct contact with each other but yet still co-operate or interact with each other.

Abstract

The present invention relates to drone technology, and in particular to aerial drone technology. This includes both drone hardware and hardware configuration, and drone software and software configuration. Embodiments are described by reference to a "selfie drone", which is an aerial drone device configured to take photos (and/or videos) of a user (for example by positioning itself at a defined location relative to the user). However, it will be appreciated that various aspects of the technology described herein have broader application.
PCT/AU2017/050307 2016-04-07 2017-04-07 Aerial devices, rotor assemblies for aerial devices, and frameworks and methodologies relating to devices configured for controlling aerial devices WO2017173502A1 (fr)

Applications Claiming Priority (12)

Application Number Priority Date Filing Date Title
AU2016901301 2016-04-07
AU2016901298 2016-04-07
AU2016901302A AU2016901302A0 (en) 2016-04-07 Frameworks and methodologies configured to enable drone control via a device having motion sensing components
AU2016901301A AU2016901301A0 (en) 2016-04-07 Frameworks and methodologies configured to enable drone control via a graphical user interface
AU2016901299A AU2016901299A0 (en) 2016-04-07 Printed circuit board configurations for an aerial drone device
AU2016901300 2016-04-07
AU2016901302 2016-04-07
AU2016901298A AU2016901298A0 (en) 2016-04-07 A rotor assembly for an aerial device
AU2016901295 2016-04-07
AU2016901295A AU2016901295A0 (en) 2016-04-07 Frameworks and methodologies configured to enable control of aerial drones based on object recognition
AU2016901299 2016-04-07
AU2016901300A AU2016901300A0 (en) 2016-04-07 Frameworks and methodologies configured to enable gesture-driven control of a drone device

Publications (1)

Publication Number Publication Date
WO2017173502A1 true WO2017173502A1 (fr) 2017-10-12

Family

ID=60000177

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/AU2017/050307 WO2017173502A1 (fr) 2016-04-07 2017-04-07 Aerial devices, rotor assemblies for aerial devices, and frameworks and methodologies relating to devices configured for controlling aerial devices

Country Status (1)

Country Link
WO (1) WO2017173502A1 (fr)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019079100A1 (fr) * 2017-10-20 2019-04-25 Amazon Technologies, Inc. Distributed and reconfigurable aerial vehicle configuration
US20220144429A1 (en) * 2020-11-06 2022-05-12 Yana SOS, Inc. Flight-enabled signal beacon
US11480958B2 (en) 2015-02-19 2022-10-25 Amazon Technologies, Inc. Collective unmanned aerial vehicle configurations
US11587181B1 (en) * 2019-07-31 2023-02-21 United Services Automobile Association (Usaa) Property damage assessment system

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5628620A (en) * 1991-09-30 1997-05-13 Arlton; Paul E. Main rotor system for helicopters
WO2009005875A2 (fr) * 2007-04-19 2009-01-08 Samuel Alan Johnson Aerial robot dispensing a conductive filament
WO2014025444A2 (fr) * 2012-05-21 2014-02-13 Arlton Paul E Rotary wing vehicle
US20140299708A1 (en) * 2011-05-23 2014-10-09 John Green Rocket or ballistic launch rotary wing vehicle
WO2015138217A1 (fr) * 2014-03-13 2015-09-17 Endurant Systems, Llc UAV configurations and battery augmentation for UAV internal combustion engines, and associated systems and methods
ES2549365A1 (es) * 2015-05-26 2015-10-27 Pablo MÁRQUEZ SERRANO Flying support for cameras
WO2016012790A1 (fr) * 2014-07-23 2016-01-28 Airbus Ds Limited Improvements in and relating to unmanned aerial vehicles
WO2016077278A1 (fr) * 2014-11-10 2016-05-19 Ascent Aerosystems Llc Unmanned flying device
WO2016078056A1 (fr) * 2014-11-20 2016-05-26 SZ DJI Technology Co., Ltd. Method for addressing functional modules of a movable object

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11480958B2 (en) 2015-02-19 2022-10-25 Amazon Technologies, Inc. Collective unmanned aerial vehicle configurations
WO2019079100A1 (fr) * 2017-10-20 2019-04-25 Amazon Technologies, Inc. Configuration distribuée et reconfigurable de véhicule aérien
US20190118944A1 (en) * 2017-10-20 2019-04-25 Amazon Technologies, Inc. Distributed and reconfigurable aerial vehicle configuration
US11066163B2 (en) 2017-10-20 2021-07-20 Amazon Technologies, Inc. Distributed and reconfigurable aerial vehicle configuration
US11587181B1 (en) * 2019-07-31 2023-02-21 United Services Automobile Association (Usaa) Property damage assessment system
US20220144429A1 (en) * 2020-11-06 2022-05-12 Yana SOS, Inc. Flight-enabled signal beacon

Similar Documents

Publication Publication Date Title
CN110692027B (zh) System and method for providing easy-to-use release and auto-positioning for drone applications
US11340606B2 (en) System and method for controller-free user drone interaction
US11423792B2 (en) System and method for obstacle avoidance in aerial systems
EP3494443B1 (fr) Systems and methods for controlling an image captured by an imaging device
WO2017181511A1 (fr) Terminal and control system for unmanned aerial vehicle
US20200346753A1 (en) Uav control method, device and uav
US20190335084A1 (en) System and method for providing autonomous photography and videography
WO2018098784A1 (fr) Unmanned aerial vehicle control method, apparatus, device and system
CN106164562A (zh) Cradle facilitating the positioning and orienting of a mobile computing device
WO2017173502A1 (fr) Aerial devices, rotor assemblies for aerial devices, and frames and methodologies relating to devices configured for controlling aerial devices
WO2018119578A1 (fr) Transformable apparatus
CN107450573B (zh) Flight photographing control system and method, intelligent mobile communication terminal, and aircraft
WO2021088684A1 (fr) Omnidirectional obstacle avoidance method and unmanned aerial vehicle
WO2017222664A1 (fr) Controlling media stream capture with physical user responses
CN105807783A (zh) Flying camera
US20200382696A1 (en) Selfie aerial camera device
US20230033760A1 (en) Aerial Camera Device, Systems, and Methods
KR20170019777A (ko) Apparatus for controlling photographing of a flying bot and control method therefor

Legal Events

Date Code Title Description
NENP Non-entry into the national phase

Ref country code: DE

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17778486

Country of ref document: EP

Kind code of ref document: A1

122 Ep: pct application non-entry in european phase

Ref document number: 17778486

Country of ref document: EP

Kind code of ref document: A1

32PN Ep: public notification in the ep bulletin as the address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 11/03/2019)
