WO2022134024A1 - Unmanned aerial vehicle having user-interactive components and a foldable structure


Info

Publication number: WO2022134024A1
Application number: PCT/CN2020/139505
Authority: WIPO (PCT)
Prior art keywords: aerial vehicle, unmanned aerial, user, uav, sub
Other languages: English (en)
Inventors: Yue Gu, Shaojun YAN, Xiaozheng TANG
Original Assignee: SZ DJI Technology Co., Ltd.
Application filed by SZ DJI Technology Co., Ltd.
Priority to PCT/CN2020/139505 (published as WO2022134024A1)
Priority to CN202080103877.7A (published as CN116194369A)
Priority to CN202023352348.7U (published as CN214356664U)

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B64: AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U: UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U 20/00: Constructional aspects of UAVs
    • B64U 20/50: Foldable or collapsible UAVs
    • B64U 10/00: Type of UAV
    • B64U 10/10: Rotorcrafts
    • B64U 10/13: Flying platforms
    • B64U 10/14: Flying platforms with four distinct rotor axes, e.g. quadcopters
    • B64U 2201/00: UAVs characterised by their flight controls

Definitions

  • the present disclosure generally relates to an unmanned movable object, and systems and methods associated with operation of the unmanned movable object, and, more particularly, to systems, devices, and methods associated with operating an unmanned movable object integrated with one or more interactive components, such as an interactive display, and/or user interface elements on a display.
  • Unmanned aerial vehicles include pilotless aircraft of various sizes and configurations that can be remotely operated by a user and/or programmed for automated flight. UAVs can be equipped with cameras to capture images and videos for various purposes including, but not limited to, recreation, surveillance, sports, and aerial photography.
  • a user is required to use a secondary device in communication with a UAV, such as a controller or a mobile phone, to operate the UAV and a camera on-board the UAV.
  • it may take the user extra effort and time to learn, practice, and master the controlling process.
  • the user often gets distracted from an ongoing activity (e.g., a hike, a conference, a work-out, a festivity, etc. ) as the user needs to transfer his or her attention to operation of the controller or the mobile phone to communicate with the UAV.
  • Although UAVs are becoming more intelligent and powerful for performing various autonomous functions, users may be frustrated by a cumbersome experience and even discouraged from using UAVs as much as they would like to.
  • users are not effectively taking full advantage of the UAV’s intelligence and powerful functions, and are missing opportunities to timely record subject matter of interest with the camera on-board the UAV.
  • an unmanned aerial vehicle includes one or more interactive components onboard the unmanned aerial vehicle configured to display information associated with operation of the unmanned aerial vehicle and detect an input of a user, received on the one or more interactive components, to interact with the displayed information.
  • the unmanned aerial vehicle further includes a propulsion system configured to propel the unmanned aerial vehicle; and one or more processors coupled to memory configured to store instructions for execution by the one or more processors to control operation of the unmanned aerial vehicle in accordance with the input.
  • according to some embodiments, a method for operating an unmanned aerial vehicle includes displaying information associated with operation of the unmanned aerial vehicle via one or more interactive components onboard the unmanned aerial vehicle.
  • the method further includes detecting an input of a user received on the one or more interactive components of the unmanned aerial vehicle, the input interacting with the displayed information.
  • the method also includes generating, in accordance with the input, instructions for operating the unmanned aerial vehicle, comprising at least one of: controlling movement or navigation of the unmanned aerial vehicle; or controlling operation of an imaging device onboard the unmanned aerial vehicle.
  • a foldable movable object including a body portion including one or more interactive components configured to display information associated with operation of the foldable movable object and detect an input of a user, received on the one or more interactive components, to interact with the displayed information.
  • the foldable movable object further includes a propulsion system connected to the body portion and including two sub-parts that are rotatably coupled to each other via a coupling portion, each sub-part including at least one set of rotary components configured to rotate to propel the foldable movable object.
  • FIG. 1A shows an example diagram illustrating a relationship between an unmanned aerial vehicle, a remote control device, and an operator, in accordance with embodiments of the present disclosure.
  • FIG. 1B shows an example diagram illustrating a relationship between an unmanned aerial vehicle, including one or more interactive components for providing direct interaction, and an operator, in accordance with embodiments of the present disclosure.
  • FIG. 1C shows an example block diagram illustrating various components of an unmanned aerial vehicle, in accordance with embodiments of the present disclosure.
  • FIGS. 2A-2C illustrate examples of non-foldable unmanned aerial vehicles including one or more interactive components, in accordance with embodiments of the present disclosure.
  • FIG. 2D illustrates an example of user interaction with an interactive component on a non-foldable unmanned aerial vehicle, in accordance with embodiments of the present disclosure.
  • FIG. 3A illustrates an example of a foldable unmanned aerial vehicle in an unfolded state, in accordance with embodiments of the present disclosure.
  • FIG. 3B illustrates an example of a foldable unmanned aerial vehicle in a folded state, in accordance with embodiments of the present disclosure.
  • FIGS. 3C-3F illustrate examples of user interaction with a foldable unmanned aerial vehicle, in accordance with embodiments of the present disclosure.
  • FIG. 4 is a flow diagram of an example process of operating an unmanned aerial vehicle, in accordance with embodiments of the present disclosure.
  • FIG. 5A illustrates an example of a foldable unmanned aerial vehicle in a folded state, in accordance with embodiments of the present disclosure.
  • FIG. 5B illustrates an example of the foldable unmanned aerial vehicle of FIG. 5A in a first state, in accordance with embodiments of the present disclosure.
  • FIG. 5C illustrates an example of the foldable unmanned aerial vehicle of FIG. 5A in a second state, in accordance with embodiments of the present disclosure.
  • FIG. 5D illustrates an example of an unfolded state of the foldable unmanned aerial vehicle of FIG. 5A, in accordance with embodiments of the present disclosure.
  • FIG. 6A illustrates an example of the foldable unmanned aerial vehicle of FIG. 5A in an unfolded state, viewed from a different perspective from FIG. 5D, in accordance with embodiments of the present disclosure.
  • FIG. 6B illustrates a cross-sectional view showing a coupling relationship between a body portion and a sub-section via coupling parts of the foldable unmanned aerial vehicle of FIG. 5A, in accordance with embodiments of the present disclosure.
  • FIGS. 6C and 6D illustrate a limiting mechanism connecting a body portion and a sub-section for limiting the folding state between the body portion and the sub-section of the foldable unmanned aerial vehicle of FIG. 5A, in accordance with embodiments of the present disclosure.
  • FIG. 6E illustrates a side view showing the relationship between a body portion and two sub-sections of a foldable unmanned aerial vehicle in an unfolded state of the foldable unmanned aerial vehicle of FIG. 5A, in accordance with embodiments of the present disclosure.
  • FIGS. 7A and 7B illustrate examples of a folding structure of a foldable unmanned aerial vehicle, in accordance with embodiments of the present disclosure.
  • FIG. 1A shows an example diagram 100’ illustrating a relationship between an unmanned aerial vehicle ( “UAV” ) , provided as a conventional UAV 102’, a remote control device 130 (e.g., a remote controller) , and an operator 152, in accordance with embodiments of the present disclosure.
  • To operate UAV 102’, remote control device 130 may need to first establish a connection, e.g., a wireless communication link, with UAV 102’, and then operator 152 can use remote control device 130 to operate UAV 102’. During this process, operator 152 interacts with remote control device 130. Operating instructions and information feedback are transmitted between UAV 102’ and remote control device 130 via the established wireless communication.
  • The use of remote control device 130 can increase the burden on operator 152.
  • operator 152 needs to additionally carry remote control device 130, turn on remote control device 130, wirelessly connect remote control device 130 to UAV 102’, and then input operating instructions through various input mechanisms, such as buttons and/or rockers, etc., on remote control device 130.
  • This conventional way of operation not only increases the burden on operator 152 to carry additional devices when traveling, but also increases the preparation time before use.
  • operating UAV 102’ using remote control device 130 can be complicated and present a certain learning curve for some operators.
  • Some UAVs allow operator 152 to perform basic operations via simple onboard components, such as buttons (e.g., for power on or off), buzzers, and/or indicators, without using remote control device 130.
  • Even so, the number and level of complexity of the commands input by operator 152, and interaction with UAV 102’, are limited.
  • information may be indicated by the color of the indicators, a flashing frequency, a combination of indicators being on and off, or variations in sound provided by the buzzers.
  • However, neither indicators, buzzers, nor simple buttons can provide visual status feedback or real-time camera images.
  • Simple buttons also make it challenging to provide more complicated and versatile control of UAV 102’.
  • operator 152 may only be able to control UAV 102’ by combinations of long and short presses of the limited number of buttons.
  • the visual feedback of aircraft information and the real-time image data captured by the camera cannot be presented to the user in real time.
  • a multi-rotor aircraft without blade protection can pose a safety risk to operator 152 and surrounding people during operation.
  • operator 152 usually places the aircraft on a surface and steps away from the surface before instructing the aircraft to take off. People surrounding the aircraft should also keep a distance during its takeoff, thus increasing the burden on users and resulting in a negative user experience.
  • various types of detachable blade protection covers are available, but they take extra space to carry and require extra effort and time to install on the aircraft.
  • Embodiments described herein provide UAVs with an improved user interface.
  • the disclosed embodiments include foldable and/or non-foldable UAVs integrated with one or more user-interactive components and/or user interface elements on a display to provide the user with a direct visual display and interaction for inputting user instructions to operate the UAV.
  • operator 152 can directly input instructions via the interactive components integrated on a UAV without using an additional device and allow the UAV to take off from the user’s hand.
  • FIG. 1B shows an example diagram 100 illustrating a relationship between an unmanned movable object, provided as an unmanned aerial vehicle ( “UAV” ) 102, and operator 152, with UAV 102 including one or more interactive components 180 for providing direct interaction with operator 152, in accordance with embodiments of the present disclosure.
  • interactive components 180 include an interactive display, such as a touch screen, integrated on UAV 102 for displaying various content to operator 152, including but not limited to, an operation menu or real-time visual feedback provided by an imaging device onboard UAV 102.
  • the interactive display can receive user input directly on the touch screen for controlling the imaging device and/or operating UAV 102.
  • operator 152 does not need an additional remote control device, such as remote controller 130 in FIG. 1A.
  • operator 152 can carry fewer devices.
  • operator 152 can conveniently input operation instructions to UAV 102 via interactive components 180 and allow UAV 102 to take off quickly and safely.
  • interactive components 180 can include multiple types of input mechanisms for receiving user commands, including but not limited to, a touchscreen, one or more buttons, an audio recording device for receiving audio commands, and/or an imaging device for capturing user body movement.
  • Interactive components 180 can also include multiple types of sensors onboard UAV 102 as disclosed herein.
  • UAV 102 can include blade protection mechanisms integrated in either a foldable or non-foldable structure, while leaving sufficient space for operator 152 to hold UAV 102, and view and interact with content displayed on the interactive display.
  • UAV 102 can provide operator 152 an area to conveniently hold UAV 102 and let UAV 102 directly take off from or land on a hand of the operator 152 safely. As a result, no additional remote control devices are needed to perform various types of operations.
  • the foldable structure disclosed herein can provide a compact volume for convenient storage and carrying.
  • FIG. 1C shows an example block diagram illustrating various components of UAV 102, in accordance with embodiments of the present disclosure. While the block diagram is configured for operation of a movable object provided as UAV 102, the movable object could instead be provided as any other suitable object, device, mechanism, system, or machine configured to travel on or within a suitable medium (e.g., surface, air, water, rails, space, underground, etc. ) .
  • the movable object may also be other types of movable objects (e.g., wheeled objects, nautical objects, locomotive objects, other aerial objects, etc.).
  • UAV 102 refers to an aerial device configured to be operated and/or controlled automatically or autonomously based on commands detected by one or more interactive components 180 onboard UAV 102.
  • the one or more interactive components 180 include an imaging sensor 107, an audio sensor 174, an interactive display 181, one or more input mechanisms 182 (e.g., buttons, etc. ) , and other types of sensors 183 onboard UAV 102 (e.g., an ultrasonic sensor, a range sensor, a thermal sensor, a biometric sensor, and/or a motion sensor, etc. ) .
  • UAV 102 may be configured to be operated and/or controlled by an off-board device, such as a remote controller or a mobile device (not shown) .
  • UAV 102 includes one or more propulsion devices 104.
  • image sensor 107 may be connected or attached to UAV 102 by a carrier (e.g., such as a gimbal) , to allow for one or more degrees of relative movement (e.g., rotation) between the image sensor 107 and UAV 102.
  • image sensor 107 may also be mounted directly to UAV 102 without the carrier.
  • UAV 102 also includes a sensing system 172, a communication system 178, and an onboard controller 176 in communication with the other components.
  • UAV 102 includes one or more (e.g., 1, 2, 3, 4, 5, 10, 15, 20, etc.) propulsion devices 104 positioned at various locations (for example, top, sides, front, rear, and/or bottom of UAV 102) for propelling and steering UAV 102.
  • Propulsion devices 104 are devices or systems operable to generate forces for sustaining controlled flight.
  • Propulsion devices 104 may share or may each separately include or be operatively connected to a motive component, such as a motor (e.g., an electric motor, hydraulic motor, pneumatic motor, etc. ) or an engine (e.g., an internal combustion engine, a turbine engine, etc. ) , and a power source, e.g., a battery bank, etc., or a combination thereof.
  • Each propulsion device 104 may also include one or more rotary components drivably connected to motive component (s) (not shown) and configured to be driven to generate forces for sustaining controlled flight.
  • the rotary components may include rotors, propellers, blades, nozzles, etc., which may be driven on or by a shaft, axle, wheel, hydraulic system, pneumatic system, or other component or system configured to transfer power from the motive component (s) .
  • Propulsion devices 104 and/or rotary components may be adjustable (e.g., tiltable) with respect to each other and/or with respect to UAV 102.
  • propulsion devices 104 and rotary components may have a fixed orientation with respect to each other and/or UAV 102.
  • each propulsion device 104 may be of the same type. In other embodiments, propulsion devices 104 may be of multiple different types. In some embodiments, all propulsion devices 104 may be controlled in concert (e.g., all at the same speed and/or angle) . In other embodiments, one or more propulsion devices may be independently controlled with respect to, e.g., speed and/or angle. In some embodiments as discussed herein, one or more protection mechanisms 110 may be provided to prevent the rotary components of propulsion devices 104 from contacting operator 152 when holding UAV 102. In some embodiments, protection mechanisms 110 may be provided to propulsion devices 104 one-to-one, respectively.
  • protection mechanisms 110 may be integrated in a continuous member configured to cover any number of propulsion devices 104. In some embodiments, protection mechanisms 110 may be removable. In some other embodiments, protection mechanisms 110 may be non-removable, e.g., connected to or a part of a body portion of UAV 102.
  • Propulsion devices 104 may be configured to propel UAV 102 in one or more vertical and horizontal directions and to allow UAV 102 to rotate about one or more axes. That is, propulsion devices 104 may be configured to provide lift and/or thrust for creating and maintaining translational and rotational movements of UAV 102. For instance, propulsion devices 104 may be configured to enable UAV 102 to achieve and maintain desired altitudes, provide thrust for movement in all directions, and provide for steering of UAV 102. In some embodiments, propulsion devices 104 may enable UAV 102 to perform vertical takeoffs and landings (i.e., takeoff and landing without horizontal thrust) . Propulsion devices 104 may be configured to enable movement of UAV 102 along and/or about multiple axes.
  • UAV 102 may further include one or more sensor devices.
  • the sensor devices may include devices for collecting or generating data or information, such as surveying, tracking, operation command, and capturing images or video of targets (e.g., objects, landscapes, subjects of photo or video shoots, etc. ) .
  • the sensor devices may also or alternatively include other suitable sensors for capturing visual, audio, and/or electromagnetic signals.
  • image sensor 107 is configured to gather data that may be used to generate images. As disclosed herein, image data obtained by image sensor 107 may be processed and analyzed to obtain commands and instructions from one or more users to operate UAV 102 and/or image sensor 107.
  • image sensor 107 may include photographic cameras, video cameras, infrared imaging devices, ultraviolet imaging devices, x-ray devices, ultrasonic imaging devices, radar devices, etc.
  • Interactive components 180 may include devices, such as audio sensor 174, for capturing audio data, such as microphones or ultrasound detectors. Audio sensor 174 may be included or integrated in image sensor 107.
  • Sensing system 172 of UAV 102 may include one or more onboard sensors (not shown) associated with one or more components or other systems.
  • sensing system 172 may include sensors for determining positional information, velocity information, and acceleration information relating to UAV 102 and/or targets.
  • Components of sensing system 172 may be configured to generate data and information for use (e.g., processed by onboard controller 176 or another device) in determining additional information about UAV 102, its components, and/or its targets.
  • Sensing system 172 may include one or more sensors for sensing one or more aspects of movement of UAV 102.
  • sensing system 172 may include sensory devices, such as a positioning sensor for a positioning system (e.g., GPS, GLONASS, Galileo, Beidou, GAGAN, RTK, etc. ) , motion sensors, inertial sensors (e.g., IMU sensors, MIMU sensors, etc. ) , proximity sensors, imaging device 107, etc.
  • Sensing system 172 may also include sensors configured to provide data or information relating to the surrounding environment, such as weather information (e.g., temperature, pressure, humidity, etc. ) , lighting conditions (e.g., light-source frequencies) , air constituents, or nearby obstacles (e.g., objects, structures, people, other vehicles, etc. ) .
  • Communication system 178 of UAV 102 may be configured to enable communication of data, information, commands, and/or other types of signals between onboard controller 176 and one or more off-board devices, such as a remote controller, a mobile device, a server 110 (e.g., a cloud-based server) , or another suitable entity.
  • Communication system 178 may include one or more onboard components configured to send and/or receive signals, such as receivers, transmitters, or transceivers, that are configured for one-way or two-way communication.
  • the onboard components of communication system 178 may be configured to communicate with off-board devices via one or more communication networks, such as radio, cellular, Bluetooth, Wi-Fi, RFID, and/or other types of communication networks usable to transmit signals indicative of data, information, commands, and/or other signals.
  • communication system 178 may be configured to enable communication with off-board devices for providing input for controlling UAV 102 during flight.
  • Onboard controller 176 of UAV 102 may be configured to communicate with various devices onboard UAV 102, such as interactive components 180, communication system 178, and sensing system 172. Controller 176 may also communicate with a positioning system (e.g., a global navigation satellite system, or GNSS) to receive data indicating the location of UAV 102. Onboard controller 176 may communicate with various other types of devices, including a barometer, an inertial measurement unit (IMU) , a transponder, or the like, to obtain positioning information and velocity information of UAV 102.
  • Onboard controller 176 may also provide control signals (e.g., in the form of pulsing or pulse width modulation signals) to one or more electronic speed controllers (ESCs) configured to control one or more of propulsion devices 104. Onboard controller 176 may thus control the movement of UAV 102 by controlling one or more ESCs. As disclosed herein, onboard controller 176 may further include circuits and modules configured to process user input received on one or more interactive components 180, speech recognition, image recognition, speaker identification, and/or other functions disclosed herein.
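  • To make the controller-to-ESC path above concrete, the following is a minimal sketch, assuming a conventional four-rotor "X" layout and standard 1000-2000 microsecond ESC pulses; the mixer gains, constants, and function names are illustrative assumptions, not taken from this disclosure.

```python
# Hypothetical sketch: mixing normalized flight commands into per-ESC PWM
# pulse widths. Layout, gains, and pulse range are assumptions.

PWM_MIN_US = 1000  # common ESC pulse width at zero throttle (microseconds)
PWM_MAX_US = 2000  # common ESC pulse width at full throttle

# Standard "X" quadcopter mixer: per-motor gains for (roll, pitch, yaw).
MIXER = [
    (+1, +1, -1),  # front-left,  spins counter-clockwise
    (-1, +1, +1),  # front-right, spins clockwise
    (-1, -1, -1),  # rear-right,  spins counter-clockwise
    (+1, -1, +1),  # rear-left,   spins clockwise
]

def mix_to_pwm(thrust: float, roll: float, pitch: float, yaw: float) -> list[int]:
    """Map thrust in [0, 1] and roll/pitch/yaw in [-1, 1] to one pulse per ESC."""
    pulses = []
    for r_gain, p_gain, y_gain in MIXER:
        cmd = thrust + 0.25 * (r_gain * roll + p_gain * pitch + y_gain * yaw)
        cmd = min(max(cmd, 0.0), 1.0)  # saturate before converting to a pulse
        pulses.append(int(PWM_MIN_US + cmd * (PWM_MAX_US - PWM_MIN_US)))
    return pulses

# Example: hover thrust with a slight roll command.
print(mix_to_pwm(thrust=0.5, roll=0.1, pitch=0.0, yaw=0.0))
```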
  • interactive components 180 onboard UAV 102 may be configured to receive input, such as input from a user (e.g., a manual input on interactive display 181 or input mechanisms 182, speech input captured by audio sensor 174, user gestures captured by image sensor 107, and/or information detected by various types of sensors 183), and communicate signals indicative of the input to controller 176.
  • controller 176 may be configured to generate corresponding signals indicative of one or more types of information, such as control data (e.g., signals) for moving or manipulating UAV 102 (e.g., via propulsion devices 104) , a payload (not shown) , and/or a carrier (not shown) .
  • Interactive components 180 may also be configured to receive data and information from UAV 102, such as data collected by or associated with image sensor 107, and operational data relating to, for example, positional data, velocity data, acceleration data, sensory data, visual feedback, and other data and information relating to UAV 102, its components, and/or its surrounding environment.
  • Interactive components 180 may further include input mechanisms 182, such as physical sticks, levers, rockers, switches, wearable apparatus, touchable display elements, and/or buttons configured to control flight parameters.
  • Interactive components 180 may further include different types of sensors 183, such as an ultrasonic sensor, a range sensor, a thermal sensor, a biometric sensor, a touch sensor, and/or a motion sensor, etc.
  • a touch sensor on a landing gear can detect whether UAV 102 is in contact with a user or an object.
  • a biometric sensor may be configured to support facial recognition, fingerprint recognition, voice recognition, and/or body temperature measurement.
  • Interactive components 180 can include hardware parts embedded on UAV 102. Further, as discussed herein, one or more user interface elements ( “UI” elements) can be provided on display 181 onboard UAV 102, and a user can interact with the UI elements for controlling movement or navigation of UAV 102, and/or controlling operation of image sensor 107 onboard UAV 102. User input received by interactive components 180 can be processed to generate operation instructions for controlling UAV 102.
  • interactive components 180 may be used to receive user inputs of other information, such as manual control settings, automated control settings, control assistance settings, and/or aerial photography settings. It is understood that different combinations or layouts of input devices for interactive components 180 are possible and within the scope of this disclosure.
  • Interactive display 181 may be configured to display information, such as signals indicative of information or data relating to movements of UAV 102 and/or data (e.g., imaging data) captured by UAV 102 (e.g., in conjunction with image sensor 107) .
  • interactive display 181 may be a multifunctional display device configured to display information as well as receive user input.
  • interactive display 181 may display an interactive graphical user interface (GUI) for receiving one or more user inputs.
  • interactive display 181 may be configured to work in conjunction with a computer application (e.g., an “app” ) to provide an interactive interface on the display device or multifunctional screen for displaying information received from UAV 102 and for receiving user inputs.
  • interactive display 181 may include interactive means, e.g., a touchscreen, for the user to identify or select a portion of the image of interest to the user.
  • interactive display 181 may be an integral component, e.g., attached or fixed, to UAV 102.
  • interactive display 181 may be electronically connectable to (and dis-connectable from) a part of UAV 102 (e.g., via a connection port or a wireless communication link) and/or otherwise connectable to the corresponding part via a mounting device, such as by a clamping, clipping, clasping, hooking, adhering, or other type of mounting device.
  • interactive display 181 may be located on a hand-held portion, an upper or lower side, a top or bottom surface, or any other suitable location on UAV 102.
  • Interactive display 181 may use a flip screen that can be opened or detached from a body portion when in use.
  • UAV 102 includes a memory 162 and at least one processor 160 which can be used to process various types of user input received by interactive components 180 onboard UAV 102.
  • memory 162 and processor 160 of UAV 102 are also configured to determine operation instructions corresponding to the received user input. The operation instructions are transmitted, e.g., substantially in real time with the flight of UAV 102, to related controlling and propelling components of UAV 102 and/or image sensor 107, and/or other devices for corresponding control and operations.
  • processor 160 is configured to execute modules, programs, and/or instructions stored in memory 162 and thereby perform predefined operations.
  • Processor 160 may be any suitable hardware processor, such as an image processor, an image processing engine, an image-processing chip, a graphics processing unit (GPU), a microprocessor, a micro-controller, a central processing unit (CPU), a network processor (NP), a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA), or another programmable logic device, discrete gate or transistor logic device, or discrete hardware component.
  • Memory 162 may include high-speed random access memory, such as DRAM, SRAM, or other random access solid state memory devices.
  • memory 162 includes non-volatile memory, such as one or more magnetic disk storage devices, optical disk storage devices, flash memory devices, or other non-volatile solid state storage devices.
  • memory 162 includes one or more storage devices remotely located from processor (s) 160.
  • memory 162 or the computer readable storage medium of memory 162 stores one or more computer program instructions (e.g., modules) and a database, or a subset thereof that are configured to perform one or more steps of one or more processes as disclosed herein, e.g., process 400 in FIG. 4.
  • Memory 162 may also store images captured by imaging sensor 107, for processing by processor 160, operation instructions for controlling UAV 102 and imaging sensor 107, and/or the like.
  • UAV 102 integrated with interactive components 180 can communicatively couple to one or more electronic devices, such as a remote controller, a mobile device, and/or a server (e.g., cloud-based server) via a network in order to exchange information with one another and/or other additional devices and systems.
  • the network may be any combination of wired and wireless local area network (LAN) and/or wide area network (WAN) , such as an intranet, an extranet, and the internet.
  • the network is capable of providing communications between the one or more electronic devices as disclosed in the present disclosure.
  • UAV 102 is capable of transmitting data (e.g., image data and/or motion data) detected by one or more sensors on-board (e.g., imaging sensor 107, and/or inertial measurement unit (IMU) sensors included in sensing system 172) in real-time during movement of UAV 102 to the remote controller, the mobile device, and/or the server that are configured to process the data.
  • operation instructions can be transmitted from the remote control, the mobile device, and/or the cloud-based server to UAV 102 in real-time to control the flight of UAV 102 and components thereof via any suitable communication techniques, such as local area network (LAN) , wide area network (WAN) (e.g., the Internet) , cloud environment, telecommunications network (e.g., 3G, 4G) , WiFi, Bluetooth, radiofrequency (RF) , infrared (IR) , or any other communications technique.
  • FIGS. 2A-2C illustrate various examples of non-foldable UAVs, such as UAV 200, 220, and 240, respectively, where each UAV includes one or more interactive components, in accordance with embodiments of the present disclosure.
  • UAVs 200, 220, and 240 may be similar to UAV 102 described above with reference to FIGS. 1B and 1C.
  • UAV 200 is non-foldable and includes multiple sets of multi-blade propellers 202 configured to propel UAV 200.
  • UAV 200 includes a body portion 203 and multiple protection portions 205 (similar to protection mechanisms 110) respectively configured to safely enclose propellers 202 respectively.
  • Body portion 203 and multiple protection portions 205 may form a continuous structure.
  • an inner wall of protection portion 205 (e.g., in FIGS. 2B and 2C) is perpendicular to the horizontal plane within which propeller 202 rotates, so as to enclose the corresponding propeller 202. A certain space exists between the blades and the inner wall of protection portion 205.
  • protection portions 205 may use any suitable form of blade protection mechanisms, including but not limited to, a protection well structure without covering when the protection well is deep enough to contain the corresponding propeller 202 (e.g., as shown in FIG. 2B) , or a covering structure, such as a net-like covering or a mesh-like covering (e.g., as shown in FIG. 3A) , to prevent propeller 202 from directly contacting an external object, such as a user or an object.
  • body portion 203 includes an interactive component, such as a display 206, corresponding to display 181, disposed between propellers 202.
  • Display 206 may be an interactive display, such as a touchscreen.
  • Display 206 may be embedded, e.g., fixedly integrated, in body portion 203. Display 206 may also be removably coupled, e.g., via magnetic attraction or any suitable coupling mechanism, to a receiving portion of body portion 203.
  • body portion 203 includes one or more hand-holding areas, such as an area 204 and an area 208, provided in spaces between propellers 202 or between propeller 202 and display 206.
  • UAV 220 in FIG. 2B includes a body portion 223 and propellers 222 enclosed by protection portions 225.
  • UAV 220 includes a display 226 placed closer to one edge of UAV 220.
  • UAV 220 further includes other types of interactive components, such as an imaging device 229 (similar to imaging device 107) , and/or an input mechanism, such as a button, for receiving user input.
  • Imaging device 229 may be placed on any suitable portion, e.g., on a side plane as shown in FIG. 2B, or next to display 226 on a top plane.
  • UAV 220 also includes hand-holding areas 224 and 228.
  • UAV 240 in FIG. 2C includes a body portion 243 and propellers 242 enclosed by protection portions 245.
  • UAV 240 includes a display 246 on a side plane of UAV 240.
  • UAV 240 further includes other types of interactive components, such as imaging device 249 (similar to imaging device 107) , and/or an input mechanism, such as a button, for receiving user input.
  • Imaging device 249 may be placed on any suitable portion, e.g., on a different side plane from display 246 as shown in FIG. 2C, or next to display 246 on the same side plane. Imaging device 249 may also be placed on a top plane, e.g., in hand-holding area 248.
  • UAV 240 can also include hand-holding areas 244 and 248.
  • FIG. 2D illustrates an example of user interaction with an interactive component, e.g., display 206, on non-foldable UAV 200, in accordance with embodiments of the present disclosure.
  • the operator can manually power on UAV 200, e.g., by touching display 206 or pressing a button.
  • UAV 200 may additionally or alternatively automatically turn on its power in response to information detected by one or more interactive components, such as a touch sensor configured to detect that the operator is holding hand-holding area 204, a position sensor configured to detect that an orientation of UAV 200 indicates that display 206 is facing the operator, and/or an imaging device configured to capture image data indicating that the operator is holding UAV 200 and/or intends to input a user instruction.
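  • As one illustration of combining such detections, the sketch below fuses a grip touch sensor, display orientation, and an operator-in-view flag into an auto power-on decision; the thresholds and field names are hypothetical, not specified by this disclosure.

```python
# Hypothetical auto power-on logic; readings and thresholds are assumptions.
from dataclasses import dataclass

@dataclass
class WakeInputs:
    grip_detected: bool       # touch sensor on hand-holding area 204
    display_pitch_deg: float  # orientation of display 206 relative to level
    operator_in_view: bool    # imaging device sees the operator holding the UAV

def should_power_on(inputs: WakeInputs) -> bool:
    """Power on when the UAV appears held with display 206 facing the operator."""
    display_facing_operator = 20.0 <= inputs.display_pitch_deg <= 70.0  # assumed range
    return inputs.grip_detected and (display_facing_operator or inputs.operator_in_view)

print(should_power_on(WakeInputs(grip_detected=True, display_pitch_deg=45.0, operator_in_view=False)))  # True
```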
  • the operator can use one hand 270 to hold hand-holding area 204 while using the other hand 260 to input instructions on display 206, e.g., for controlling navigation of UAV 200 or operating the imaging sensor onboard UAV 200.
  • the user can select, from an operation menu displayed on display 206, a user interface element out of a plurality of user interface elements for controlling navigation of UAV 200.
  • Because protection portions 205 are configured to prevent propellers 202 from directly contacting the operator, UAV 200 can directly take off from the hand of the operator after the operator completes inputting the command.
  • the operator can hold hand-holding area 204 and release UAV 200 when UAV 200 is ready to take off from the hand.
  • the operator can safely and conveniently receive the returning UAV 200 by allowing it to land in the hand.
  • the imaging device, e.g., imaging device 229 or 249, can be placed at any suitable location on the UAV, including but not limited to, on any side plane, a bottom plane, or a top plane.
  • the UAV may also include multiple imaging devices, e.g., pointing to different viewing angles.
  • the UAV may or may not include a carrier, e.g., a gimbal, for supporting the imaging device (s) .
  • FIG. 3A illustrates an example of a foldable unmanned aerial vehicle, provided as a UAV 300, in an unfolded state, in accordance with embodiments of the present disclosure.
  • FIG. 3B illustrates an example of the foldable unmanned aerial vehicle, e.g., UAV 300, in a folded state, in accordance with embodiments of the present disclosure.
  • UAV 300 is similar to UAV 102 as described with reference to FIGS. 1B and 1C.
  • UAV 300 includes a body portion 302, and a foldable propulsion system including two sub-sections 310 and 312 that are rotatably coupled to each other to provide the folding structure illustrated in FIGS. 3A and 3B.
  • body portion 302 includes one or more interactive components as discussed herein, such as a display 306 (e.g., an interactive display similar to interactive display 181), an imaging device 304 (similar to image sensor 107), one or more input mechanisms (e.g., buttons), and/or various types of sensors.
  • UAV 300 may include a carrier 305, such as a gimbal, for supporting imaging device 304 and providing one or more degrees of relative movement between imaging device 304 and UAV 300.
  • imaging device 304 may be mounted directly to UAV 300 without carrier 305.
  • each sub-section 310 or 312 of the propulsion system includes two sets of rotary components, such as multi-blade propellers 322. Further, each sub-section 310 or 312 includes two corresponding sets of protection portions 324 (similar to protection mechanisms 110) configured to cover or contain corresponding propellers 322 to prevent the rotating blades from contacting the user or an object. Protection portions 324 can include two mesh coverings that cover propellers from both top and bottom as shown in FIGS. 3A and 3B. Protection portions 324 can also include any other suitable structures as discussed herein, such as well-like structures containing the blades without any covering, as discussed with reference to FIGS. 2A-2C.
  • the two sets of propellers 322 on sub-section 310 can form a receiving portion 330, e.g., an empty area between the two sets of propellers 322 on one sub-section, when in an unfolded state as shown in FIG. 3A.
  • the two sets of propellers 322 on sub-section 312 can form a receiving portion 331.
  • Receiving portion 330 and/or receiving portion 331 can contain at least a portion of body portion 302 when UAV 300 is in a folded state as shown in FIG. 3B.
  • In some embodiments, when in the folded state shown in FIG. 3B, body portion 302 can provide the operator a comfortable gripping or hand-holding area, such that the operator can conveniently and safely hold UAV 300 while inputting user instructions via the one or more interactive components and/or the user interface elements displayed on display 306. Further, UAV 300 can directly take off from or land on the user’s hand.
  • one of the one or more interactive components configured to display the information, e.g., display 306, is visible in both a folded state and an unfolded state.
  • FIGS. 3C-3F illustrate examples of user interaction with an interactive component, e.g., display 306, on UAV 300, in accordance with embodiments of the present disclosure.
  • the operator may unfold UAV 300, using one hand 370 to hold body portion 302 while using a finger on the other hand 360 to input instructions on display 306.
  • the operator may also hold body portion 302 with one hand 360 while using a finger, e.g., a thumb, on the same hand 360 to input instructions on display 306. Because body portion 302 is positioned substantially away from sub-sections 310 and 312 of the propulsion system, and further due to the presence of protection portions 324 discussed with reference to FIGS. 3A and 3B, UAV 300 can directly take off from or land in the hand of the operator as shown in FIG. 3F.
  • body portion 302 when flying, taking images, and/or recording videos, body portion 302 can be maintained in a position substantially perpendicular to a propeller plane that is formed by sub-section 310 aligned to sub-section 312 in the unfolded state.
  • UAV 300 can fly with body portion 302 pointing down as shown in FIG. 3F, or pointing up in an upside-down position.
  • body portion 302 can adjust, e.g., automatically, its tilt angle relative to the propeller plane during flight in order to optimize the center of gravity of UAV 300 during flight, e.g., during acceleration, climbing, or descending.
  • the adjustment of the tilting angle of body portion 302 can also make it easier for the operator to observe the content on display 306.
  • a motive component such as a motor, may be provided to drive rotational movement of body portion 302 relative to the propeller plane, e.g., along a rotational axis 514 in FIG. 5D.
  • Processor 160 may control the motive component to adjust the tilt angle based on the motion status and/or positional state of UAV 300.
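  • The sketch below illustrates one way such a tilt adjustment could be computed: a target angle derived from the motion state, then slewed toward by the motive component driving axis 514. The gains and limits are assumptions for illustration only.

```python
# Hypothetical tilt-angle adjustment for body portion 302; gains, limits, and
# the motion-state mapping are illustrative assumptions.

def target_body_tilt_deg(forward_accel_mps2: float, climb_rate_mps: float) -> float:
    """Pick a body tilt (degrees from the propeller plane; 90 = perpendicular)
    that shifts the center of gravity with the current maneuver."""
    tilt = 90.0                         # default: perpendicular to propeller plane
    tilt -= 4.0 * forward_accel_mps2    # lean into forward acceleration (assumed gain)
    tilt += 2.0 * climb_rate_mps        # raise slightly while climbing (assumed gain)
    return min(max(tilt, 60.0), 120.0)  # assumed mechanical limits of the hinge

def step_tilt_motor(current_deg: float, target_deg: float, max_step_deg: float = 1.5) -> float:
    """Slew-rate-limited command toward the target, one control tick at a time."""
    delta = min(max(target_deg - current_deg, -max_step_deg), max_step_deg)
    return current_deg + delta

# Example: accelerating forward at 2 m/s^2 while level.
print(step_tilt_motor(90.0, target_body_tilt_deg(2.0, 0.0)))
```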
  • FIG. 4 is a flow diagram of an example process 400 of operating UAV 102, UAV 200, UAV 220, UAV 240, or UAV 300 in accordance with embodiments of the present disclosure.
  • process 400 may be performed by one or more hardware devices and/or software modules of UAV 102 shown in FIG. 1C.
  • In step 410, information associated with operation of a UAV, e.g., UAV 102, UAV 200, UAV 220, UAV 240, or UAV 300, is displayed via one or more interactive components including a display.
  • the display such as display 181, 206, 226, 246, or 306, is onboard the UAV.
  • the display may be an interactive display, e.g., a touchscreen, located on a body portion, e.g., body portion 302, of the UAV.
  • the interactive display can detect user input received thereon.
  • In step 420, an input of a user received via the one or more interactive components of the UAV is detected.
  • the input may interact with the displayed information in step 410.
  • the one or more interactive components configured to receive the input include one or more input mechanisms, e.g., input mechanisms 182 (such as a button, a key, a knob, a lever, a switch, or a rocker, etc. ) in FIG. 1C, onboard the UAV and separate from the display.
  • the interactive display e.g., interactive display 181 in FIG. 1C, used for displaying the information is configured to receive the input.
  • the one or more interactive components configured to receive the input include one or more sensors, e.g., audio sensor 174, imaging sensor 107, and/or sensors 183 in FIG. 1C, onboard the UAV.
  • the operator can control the UAV using speech command and/or body movement.
  • For example, audio sensor 174, e.g., a microphone, can detect an audio command from the operator.
  • UAV 102 may further include a language processing module for speech recognition.
  • the language processing module may include instructions stored in memory 162 that can be processed by processor 160 to process the detected audio command to obtain user instructions for controlling the UAV.
  • controller 176 can generate operation commands based on the user instructions for controlling the UAV and/or the imaging device onboard the UAV.
  • imaging device 107 can capture images including body movement of the user indicating a user instruction for operating the UAV.
  • an image processing module, supported by any suitable image processing algorithms (e.g., a machine learning model), includes instructions stored in memory 162 that can be processed by processor 160 to process the captured images of user body movement to obtain user instructions for controlling the UAV. Controller 176 can then generate operation commands based on the user instructions. Such operation commands may include, but are not limited to, controlling the UAV to move, triggering the imaging device shooting, or recalling the UAV to return through voice commands or hand gestures, etc.
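  • A minimal sketch of this command path is shown below: recognized speech phrases and gesture labels are mapped to a shared token vocabulary and dispatched to UAV operations. The vocabulary, command set, and method names are assumptions; the disclosure does not specify them.

```python
# Hypothetical dispatch of recognized speech/gesture tokens to UAV commands.

COMMANDS = {
    "take_off": lambda uav: uav.take_off(),
    "return":   lambda uav: uav.return_to_operator(),
    "shoot":    lambda uav: uav.camera.capture(),
}

# Both recognizers are assumed to emit tokens from the shared vocabulary above.
SPEECH_TO_TOKEN = {"take off": "take_off", "come back": "return", "cheese": "shoot"}
GESTURE_TO_TOKEN = {"palm_up": "take_off", "wave": "return", "frame_fingers": "shoot"}

def dispatch(uav, speech: str | None = None, gesture: str | None = None) -> bool:
    """Route a recognized speech phrase or gesture label to a UAV operation."""
    token = SPEECH_TO_TOKEN.get(speech) or GESTURE_TO_TOKEN.get(gesture)
    if token is None:
        return False  # unrecognized input: ignore rather than act on it
    COMMANDS[token](uav)
    return True
```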
  • the input is detected while the UAV is being held by the operator with at least one hand and the information associated with operating the unmanned aerial vehicle is displayed to the operator on the display.
  • the input is received on the one or more interactive components of the UAV while the UAV hovers within a reachable distance from the operator. For example, when the UAV is hovering in the air and within the reachable distance from the operator, the user can view the content on the display while tapping on the interactive display to select UI elements on the operation menu to operate the UAV or the imaging device, such as updating operating instructions, and/or camera parameters for shooting pictures.
  • the input of the user instructs the UAV to take off. In response to the input, the UAV takes off from a support surface, such as directly from a hand of the operator as discussed with reference to FIGS. 2D and 3F.
  • In step 430, including substep 440 or 450, instructions for operating the UAV can be generated in accordance with the input of the user.
  • In substep 440, generating the instructions includes controlling movement or navigation of the UAV.
  • In substep 450, generating the instructions includes controlling operation of an imaging device, e.g., imaging device 107, onboard the UAV.
  • the generated instructions can be used to control the propulsion system (e.g., propulsion system 104), select a pre-programmed flying mode, or control navigation of the UAV (e.g., selecting a destination, performing target tracking, avoiding obstacles, etc.).
  • the generated instructions can also be used to control the imaging device, e.g., to select a tracking target, select a photography mode, or adjust a parameter of the imaging device.
  • the operator can choose a mode or a format of photo or video to be captured, choose a composition, an angle of view, a distance, a lens movement, and then select a target to be tracked from a photo captured by the imaging device and displayed on the interactive display. The operator can then allow the UAV to take off from his or her hand.
  • the operator can also edit content displayed on the interactive display, preview a preset video effect on the interactive display, or can view, edit, and delete, via the interactive display, the content that has been captured by the imaging device.
  • the generated instructions can be used to control one or more sensors onboard the UAV, such as adjusting parameters, calibrating, and/or switching working modes, etc.
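  • Putting steps 410-430 together, the sketch below renders process 400 as a simple event loop that branches into substeps 440 (movement/navigation) and 450 (imaging). All object and method names are hypothetical placeholders, not APIs from this disclosure.

```python
# Hypothetical event-loop rendering of process 400.

def process_400(uav):
    while uav.powered_on:
        # Step 410: display operation information on the onboard display.
        uav.display.show(uav.operation_menu(), uav.live_image())
        # Step 420: detect user input on any interactive component.
        event = uav.interactive_components.poll_input()
        if event is None:
            continue
        # Step 430: generate operating instructions from the input.
        if event.kind == "navigation":    # substep 440
            uav.controller.apply_movement(event.payload)
        elif event.kind == "imaging":     # substep 450
            uav.camera.apply_setting(event.payload)
```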
  • the information displayed on the display includes an operation menu including one or more user interface ( “UI” ) elements corresponding to instructions for operating the UAV.
  • the detected input corresponds to a selection of one of the UI elements from the operation menu.
  • the detected input corresponds to an input via a soft keyboard for inputting information.
  • the information displayed on the display includes one or more UI elements on the interactive display.
  • the input received from the user corresponds to customizing the one or more UI elements.
  • the user can edit, rearrange, add, create, delete, bookmark, and/or favorite one or more UI elements and the associated features on the interactive display or via one or more input mechanisms.
  • the UI elements may also be displayed according to different use scenarios. For example, different sets of photography parameters may be displayed and provided for user selection according to the target of interest, e.g., portrait photography or scenery photography.
  • UI elements associated with selected operation modes may be provided for user selection in accordance with corresponding use scenes; such operation modes include but are not limited to a snapshot mode, a short video mode, a slow-motion video mode, and a “QuickShots” mode (which further includes sub-modes such as flying the UAV backward and upward with the camera facing toward the identified operator, circling the UAV around the operator, automatically adjusting the UAV and camera to take a panorama view including the environment surrounding the operator, etc.).
  • one or more UI elements displayed on the display may be associated with guiding the user to calibrate one or more devices onboard the UAV, such as one or more sensors of sensory system 172 and/or sensors 183. For example, based on the current state of the sensors, the UAV can generate the UI elements on the display associated with calibration instruction to guide the user to perform the calibration step by step via the displayed UI elements.
  • In different use scenarios, the UAV may display different UI elements, use different interactive components, and generate different responses.
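  • One way to organize such scenario-dependent UI elements is sketched below, where each element declares the use scenarios in which it appears; the element labels, command strings, and scenario names are invented for illustration.

```python
# Hypothetical scenario-dependent operation menu.
from dataclasses import dataclass

@dataclass(frozen=True)
class UIElement:
    label: str
    command: str
    scenarios: tuple[str, ...]  # use scenarios in which this element is shown

MENU = [
    UIElement("Snapshot", "mode/snapshot", ("portrait", "scenery")),
    UIElement("Face auto-exposure", "camera/face_ae", ("portrait",)),
    UIElement("Panorama", "mode/quickshots/panorama", ("scenery",)),
    UIElement("Slow motion", "mode/slow_motion", ("portrait", "scenery")),
]

def menu_for(scenario: str) -> list[UIElement]:
    """Filter the operation menu by the detected use scenario."""
    return [element for element in MENU if scenario in element.scenarios]

print([element.label for element in menu_for("portrait")])
```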
  • the information displayed on the display includes an image captured by imaging sensor 107.
  • the detected input corresponds to a user interaction with at least a portion of the captured image to cause generation of an instruction for operating the UAV in accordance with the user interaction.
  • the user may circle a target area on the captured image for the UAV to adjust its navigation to track, zoom-in, and/or capture an image of a certain object in the target area.
  • the displayed information is updated in real-time in response to the detected user input received on the one or more interactive components of the UAV.
  • the user may select a region on a captured image for zooming in the selected region.
  • the display can update the displayed information in real time as a zoomed-in view of the selected region.
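  • The real-time zoom update described above amounts to cropping the selected region and rescaling it to the display; a minimal nearest-neighbor version is sketched below, with the frame represented as a nested list of pixels for illustration.

```python
# Hypothetical region-select digital zoom: crop the selection, then
# nearest-neighbor scale it back to the full display resolution.

def zoom_region(frame, x0, y0, x1, y1):
    """Return a frame-sized view of the selected rectangle (x0, y0)-(x1, y1)."""
    height, width = len(frame), len(frame[0])
    crop = [row[x0:x1] for row in frame[y0:y1]]
    crop_h, crop_w = len(crop), len(crop[0])
    return [
        [crop[r * crop_h // height][c * crop_w // width] for c in range(width)]
        for r in range(height)
    ]

# Example on a tiny 4x4 "image": zoom into its top-left 2x2 quadrant.
frame = [[(r, c) for c in range(4)] for r in range(4)]
print(zoom_region(frame, 0, 0, 2, 2))
```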
  • the information displayed on the display includes a map.
  • the detected input corresponds to a user interaction with at least a portion of the map, e.g., pressing or circling a location, to cause generation of an instruction for operating the UAV in accordance with the user interaction.
  • imaging device 107 of the UAV is configured to capture image data identifying a body movement of the operator, e.g., a registered user, to cause the UAV to operate according to the body movement.
  • the body movement may include a hand gesture, a movement of a portion of the body, eye gaze movement indicating an object of interest, etc.
  • the captured body movement can be from another person other than the operator or the registered user in the field of view of imaging device 107.
  • the one or more interactive components comprise a projector, e.g., onboard the UAV, configured to project virtual reality (VR) /augmented reality (AR) content associated with operation of the UAV onto another object.
  • Imaging device 107 is then configured to detect an action of the user, e.g., a hand gesture or an eye-gaze movement, indicating interacting with at least a portion of the projected VR/AR content. For example, the user may point a finger toward the displayed VR/AR content indicating a selection from the projected interactable content.
  • an operation instruction can be generated based on the detected action for operating the UAV, e.g., navigating the UAV or controlling imaging device 107 to capture images.
  • the VR/AR content projected from the interactive components onboard the UAV can provide an immersive experience to the user.
  • the projected VR/AR content can be generated based on a captured image, e.g., a bird’s-eye view.
  • the projected VR/AR content can include an immersive and interactive map that allows user interaction with the AR/VR content.
  • various features as described in the present disclosure can be achieved via machine learning and/or artificial intelligence (AI) algorithms.
  • FIG. 5A illustrates an example of a foldable unmanned aerial vehicle (provided as UAV 300) in a folded state 500, in accordance with embodiments of the present disclosure.
  • UAV 300 includes body portion 302 including one or more interactive components, such as interactive display 306 (similar to interactive display 181) configured to display information associated with operation of UAV 300.
  • interactive display 306 is also configured to detect an input of an operator of UAV 300 received on interactive display 306 to interact with the displayed information.
  • the input of the operator may be detected on one or more input mechanisms 182.
  • UAV 300 may also include imaging device 304 configured to capture one or more images (e.g., photos and/or videos) .
  • the displayed information may include an operation menu including one or more UI elements, a map, one or more images captured by imaging device 304, or any other type of interactable content.
  • UAV 300 includes a propulsion system including two sub-sections 310 and 312 as described with reference to FIGS. 3A and 3B.
  • the propulsion system can be foldable with the two sub-sections 310 and 312 rotatably coupled to each other along an axis 514 (e.g., in FIG. 5B) via a coupling portion 502, e.g., a rotatable connector such as a hinge, to provide the folding mechanism.
  • each sub-section includes at least one set of rotary components 322, e.g., rotary blades, configured to rotate to propel UAV 300.
  • Each sub-section may further include at least one protection portion 324, e.g., a mesh covering, lines, or a protection well, etc., configured to cover rotary components 322 and prevent the rotating blades from contacting the operator or other objects.
  • As shown in FIG. 5A, when in folded state 500, opposing surfaces of the two sub-sections 310 and 312 are in close contact to provide a compact folded structure.
  • body portion 302 is disposed in receiving portion 330 and/or receiving portion 331. Fitting body portion 302 in receiving portion 330 and/or receiving portion 331 when in folded state 500 can further provide the compact structure for easy storage and carrying.
  • FIG. 5B illustrates an example of UAV 300 in a first state 520, in accordance with embodiments of the present disclosure.
  • the propulsion system of UAV 300 is in a first state 520, e.g., a half-open or half-folded state.
  • first state 520 can be obtained by separating sub-section 312 from sub-section 310 in FIG. 5A and rotating sub-section 312 along a folding axis (or a rotating axis) 514.
  • body portion 302 remains coupled in receiving portion 330 of sub-section 310 (e.g., fixed via a release structure 516).
  • release structure 512 or 516 includes a magnetic member, a thread fastening member, a locking member, or other suitable structure.
  • the rotational movement between sub-sections 310 and 312 can be driven by a manual force, e.g., a pulling force on one of the sub-sections such as sub-section 312 in FIG. 5B, applied by the operator.
  • the rotational movement may be supported by actuator members that are configured to drive the two sub-sections 310 and 312 to separate with respect to each other.
  • Actuator members may be or may include suitable actuators and/or force transmission components.
  • actuator members may include electric motors configured to provide rotational motion to sub-sections 310 and/or 312 in conjunction with axles, shafts, rails, belts, chains, gears, and/or other components.
  • UAV 300 further includes one or more sensors configured to measure, sense, detect, or determine the folding state information of UAV 300.
  • Folding state information may include positional information (e.g., relative location, orientation, attitude, linear displacement, angular displacement, etc.), velocity information (e.g., linear velocity, angular velocity, etc.), acceleration information (e.g., linear acceleration, angular acceleration, etc.), and/or other information relating to the rotational motion between sub-section 310 and sub-section 312.
  • the sensors may include one or more types of suitable sensors, such as potentiometers, optical sensors, vision sensors, magnetic sensors, motion or rotation sensors (e.g., gyroscopes, accelerometers, inertial sensors, etc.).
  • the sensors may be associated with or attached to various components of UAV 300, such as components of sub-sections 310 and/or 312, or body portion 302.
  • the sensors may be configured to communicate data related to the measured state information to onboard controller 176 of UAV 300 (similar to UAV 102) via a wired or wireless connection (e.g., RFID, Bluetooth, Wi-Fi, radio, cellular, etc.).
  • Data and information generated by the sensors and communicated to onboard controller 176 may be used by onboard controller 176 for further processing, such as for determining folding state information of UAV 300.
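As an illustrative sketch of how onboard controller 176 might derive folding state information from such sensor data: the state names echo folded state 500, first state 520, second state 540, and unfolded state 560 discussed herein, while the threshold values and smoothing scheme are assumptions for illustration.

```python
def classify_folding_state(angle_deg: float) -> str:
    """Classify the propulsion system's folding state from the measured
    angle between the two sub-sections (0 = folded, 180 = unfolded).

    Threshold values are invented for illustration.
    """
    if angle_deg < 10:
        return "FOLDED"        # state 500
    if angle_deg < 95:
        return "FIRST_STATE"   # state 520, body still seated in sub-section 310
    if angle_deg < 170:
        return "SECOND_STATE"  # state 540, body detached from both receiving portions
    return "UNFOLDED"          # state 560

def smoothed(readings, alpha=0.3):
    """Exponentially smooth noisy angle readings (e.g., from a potentiometer)."""
    estimate = readings[0]
    for r in readings[1:]:
        estimate = alpha * r + (1 - alpha) * estimate
    return estimate

samples = [178.0, 181.5, 179.2, 180.4]  # degrees, illustrative values
print(classify_folding_state(smoothed(samples)))  # -> UNFOLDED
```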
  • the angle between sub-sections 310 and 312 may be around 90° in first state 520. It will be understood that various components of UAV 300 may have similar structural relationships when the angle between sub-sections 310 and 312 is between 0° and 90°. For example, when the angle between sub-sections 310 and 312 is between 0° and 90°, body portion 302 remains coupled in receiving portion 330 on sub-section 310, while body portion 302 is detached from receiving portion 331 on sub-section 312. In some embodiments as shown in FIG. 5B, folding axis 514 passes through an upper edge 303 of body portion 302.
  • a ratio of the length of upper edge 303 of the body portion 302 to the total length of sub-section 310 is about 1/4, 1/3, 1/2, or any other suitable ratio.
  • FIG. 5C illustrates an example of UAV 300 in a second state 540, in accordance with embodiments of the present disclosure.
  • the propulsion system of UAV 300 is in second state 540, e.g., a state that is more than halfway open.
  • state 540 can be obtained by continuing to rotate sub-section 312 away from sub-section 310 beyond state 520, such that the angle between sub-sections 310 and 312 is greater than 90°.
  • this angle can be between 90° and 180°.
  • body portion 302 is detached from receiving portion 330 of sub-section 310, and is also detached from receiving portion 331 on sub-section 312.
  • the angle between body portion 302 and a sub-section may have a maximum value of 90°. That is, when sub-section 312 is rotated away from sub-section 310 until the angle between body portion 302 and sub-section 312 is substantially 90°, continuing to rotate sub-section 312 detaches (e.g., pulls out) body portion 302 from receiving portion 330.
  • FIG. 5D illustrates an example of an unfolded state 560 of UAV 300, in accordance with embodiments of the present disclosure.
  • the propulsion system of UAV 300 is in unfolded state 560, e.g., where sub-sections 310 and 312 are substantially aligned with each other, with the angle therebetween being substantially 180°.
  • the wings of UAV 300, e.g., sub-sections 310 and 312, are substantially in a common plane, and body portion 302 is substantially perpendicular to that plane.
  • the angle between body portion 302 and sub-section 312 remains at substantially 90° until sub-section 312 is fully open and substantially aligned with sub-section 310.
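The angular relationship described above can be summarized in a small illustrative sketch, assuming the 90° cap imposed by the limiting mechanism described with reference to FIGS. 6C and 6D below; this is a simplified model, not a formula from the disclosure.

```python
def body_angles(hinge_angle_deg: float, cap_deg: float = 90.0):
    """Split the hinge angle between the two body-to-sub-section angles.

    The body first pivots with respect to sub-section 312 until the cap
    imposed by the limiting mechanism; beyond that, the body detaches
    from sub-section 310 and the remainder accrues there.
    """
    angle_to_312 = min(hinge_angle_deg, cap_deg)
    angle_to_310 = max(hinge_angle_deg - cap_deg, 0.0)
    return angle_to_312, angle_to_310

for hinge in (45.0, 90.0, 135.0, 180.0):
    to_312, to_310 = body_angles(hinge)
    print(f"hinge={hinge:5.1f}  body-312={to_312:5.1f}  body-310={to_310:5.1f}")
```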
  • FIG. 6A illustrates an example of UAV 300 in unfolded state 560, viewed from a different perspective from FIG. 5D, in accordance with embodiments of the present disclosure.
  • UAV 300 includes a coupling part 602 on at least one of the sub-sections, e.g., sub-section 312.
  • coupling part 602 may be disposed on an inner surface of receiving portion 330.
  • sub-section 310 may also include coupling part 602 on the opposing inner surface at a location facing coupling part 602 on sub-section 312 (not seen in FIG. 6A).
  • coupling part 602 is configured to be engaged by at least a portion of body portion 302, e.g., a corresponding coupling part 604, to maintain body portion 302 coupled to sub-section 310 in folded state 500 in FIG. 5A, or first state 520 in FIG. 5B.
  • FIG. 6B illustrates a cross-sectional view 600 showing the coupling relationship between body portion 302 and sub-section 310 via coupling parts 602 and 604, in accordance with embodiments of the present disclosure.
  • Coupling parts 602 and 604 include a concave structure and a protrusion structure that can be coupled, matched, mated, or fitted with each other.
  • coupling part 602 may include a protrusion structure and may include one or more components disposed on sub-section 310.
  • coupling part 602 includes a resilient part or an elastic part 602a, such as a spring, coupled to a filling part 602b, such as a steel ball.
  • Coupling part 604 includes a concave structure, such as a groove, disposed on body portion 302.
  • elastic part 602a can apply a force to filling part 602b so that filling part 602b couples to, e.g., projects into and is retained in, the concave structure of coupling part 604.
  • filling part 602b cooperates with the concave structure of coupling part 604 to form a coupling reinforced by the force applied by elastic part 602a. Further, a certain amount of force is required to release coupling part 602 from coupling part 604, as estimated in the sketch below.
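For a rough sense of that release force, a standard ball-detent model may be applied; this is an illustrative textbook estimate rather than an equation from the present disclosure, with assumed symbols: spring rate k, spring compression x, groove ramp angle θ, and friction coefficient μ.

```latex
% Illustrative ball-detent estimate; symbols are assumptions, not from
% the disclosure.
F_{\mathrm{spring}} = k\,x,
\qquad
F_{\mathrm{release}} \approx F_{\mathrm{spring}}\,
  \frac{\sin\theta + \mu\cos\theta}{\cos\theta - \mu\sin\theta}
```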
  • coupling part 602 can include a concave structure, such as a groove, on inner surfaces of sub-section 310 or 312, and coupling part 604 can include a protrusion on body portion 302, and the concave coupling part 602 can be removably coupled to the protruding coupling part 604.
  • the coupling structures illustrated in FIGS. 6A and 6B are examples and are not limiting.
  • Other suitable coupling structures can also be used.
  • coupling parts 602 and 604 can include a magnetic member and an iron member that couple body portion 302 to sub-section 310 by magnetic attraction. It will be understood that similar coupling parts may be disposed on sub-section 312 and corresponding locations on body portion 302, to provide similar coupling relationships between body portion 302 and sub-section 312.
  • FIGS. 6C and 6D illustrate a limiting mechanism 620 connecting body portion 302 and one of the sub-sections, e.g., sub-section 312, to limit the folding state between body portion 302 and sub-section 312, in accordance with embodiments of the present disclosure.
  • limiting mechanism 620 can prevent sub-section 312 from rotating or unfolding to an angle between sub-section 312 and body portion 302 greater than a predetermined threshold (e.g., 90°).
  • When in folded state 500, sub-sections 310 and 312 and body portion 302 can be fixedly coupled together by coupling parts 602 and 604. Then, as shown in FIG. 5B, the operator can hold one sub-section, e.g., sub-section 310, of UAV 300 in one hand, and use the other hand to pull the other sub-section, e.g., sub-section 312, to rotate around folding axis 514 and relative to body portion 302 to unfold UAV 300.
  • limiting mechanism 620 can prevent sub-section 312 from further extending the angle between sub-section 312 and body portion 302 beyond the predetermined threshold.
  • the position between sub-section 310 and body portion 302 can be fixed at the predetermined threshold angle by limiting mechanism 620, magnetic attraction, and/or any other suitable coupling structure (e.g., concave and convex coupling structures) between sub-section 310 and body portion 302.
  • sub-section 310 can be pulled to separate from body portion 302 when there is sufficient pulling force to cause the detachment between coupling parts 602 and 604.
  • coupling parts 602 and 604 can be disengaged to release a sub-section, e.g., sub-section 310 in FIG. 5C, from body portion 302 to unfold UAV 300.
  • The other sub-section, e.g., sub-section 312, can continue to rotate, e.g., unfold, until its rotation relative to body portion 302 is limited by limiting mechanism 620 at the predetermined threshold.
  • At that point, the angle between sub-section 312 and body portion 302 will remain at the predetermined threshold, e.g., the maximum of 90°, and further pulling, rotating, or unfolding triggers sub-section 310 to disengage from body portion 302, increasing the angle between body portion 302 and sub-section 310 (as shown in FIG. 6E).
  • sub-sections 310 and 312 can extend all the way to unfolded state 560 as shown in FIG. 5D, with the angle therebetween being substantially 180° (e.g., the angles between body portion 302 and each of sub-sections 310 and 312 being substantially 90°).
  • FIG. 6E illustrates a side view 640 showing the relationship between body portion 302 and sub-sections 310 and 312 when UAV 300 is in unfolded state 560, in accordance with embodiments of the present disclosure.
  • when in unfolded state 560, sub-sections 310 and 312 may be substantially aligned with each other, e.g., with the angle therebetween being substantially 180°.
  • Sub-sections 310 and 312 may be fixed by a magnetic attraction or any other suitable coupling force.
  • UAV 300 as described in the present disclosure may include more folding states than the above-discussed folding states.
  • the angle between sub-sections 310 and 312 may be beyond 180°, and body portion 302 may be at any suitable position relative to sub-section 310 or 312, e.g., with the angle between body portion 302 and either sub-section being in any suitable range.
  • FIGS. 7A and 7B illustrate examples of a folding structure in which motors of two opposing propellers are staggered, in accordance with embodiments of the present disclosure.
  • a first motor 702 on a first sub-section 710 and a second motor 704 on a second sub-section 712 can be disposed in a staggered arrangement.
  • Sub-sections 710 and 712 may be similar to sub-sections 310 and 312, respectively.
  • when folded, opposing surfaces of the two sub-sections 710 and 712 are in close contact.
  • stacked protection portions 324, such as those on sub-sections 310 and 312, and stacked opposing motors can increase the thickness of UAV 300 in folded state 500.
  • opposing sides of propellers on sub-sections 710 and 712 that are in contact when folded may not include protection portions. That is, sub-sections 710 and 712 each may have protection portion 324 on only one side (facing outward when folded, as shown in FIG. 7A).
  • opposing propellers may be asymmetrical, such that when folded, motor 702 does not overlap with motor 704, and propeller blades from sub-sections 710 and 712 respectively can be closely packed as shown in FIG. 7B.
  • motor 702 can be closer to folding axis 514, while motor 704 can be closer to the opening edge of sub-section 712.
  • motors 702 and 704 can be staggered to further reduce the thickness.
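A toy calculation illustrates the potential thickness saving; the dimensions and the simplified stacking model are assumptions for illustration, not values from the present disclosure.

```python
def folded_thickness_mm(motor_mm: float, blade_mm: float, staggered: bool) -> float:
    """Toy model of folded thickness for two face-to-face propeller decks.

    Aligned motors stack on top of each other; staggered motors sit side
    by side, so each nests into the clearance beside the opposing blades.
    """
    if staggered:
        return motor_mm + blade_mm
    return 2 * motor_mm

# Dimensions are invented for illustration only.
print(folded_thickness_mm(motor_mm=14.0, blade_mm=5.0, staggered=False))  # 28.0
print(folded_thickness_mm(motor_mm=14.0, blade_mm=5.0, staggered=True))   # 19.0
```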

Landscapes

  • Engineering & Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Remote Sensing (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • User Interface Of Digital Computer (AREA)
  • Toys (AREA)

Abstract

Apparatuses, systems, and methods for operating an unmanned aerial vehicle (102) are provided. The unmanned aerial vehicle (102) comprises one or more interactive components (180) onboard the unmanned aerial vehicle (102), configured to display information associated with operation of the unmanned aerial vehicle (102) and to detect an input of a user, received on the one or more interactive components (180), to interact with the displayed information; a propulsion system configured to propel the unmanned aerial vehicle (102); and one or more processors (160) coupled to memory (162) configured to store instructions to be executed by the one or more processors (160) to control operation of the unmanned aerial vehicle (102) in accordance with the input.
PCT/CN2020/139505 2020-12-25 2020-12-25 Unmanned aerial vehicle having user interactive components and a foldable structure WO2022134024A1 (fr)

Priority Applications (3)

Application Number Priority Date Filing Date Title
PCT/CN2020/139505 WO2022134024A1 (fr) Unmanned aerial vehicle having user interactive components and a foldable structure
CN202080103877.7A CN116194369A (zh) Unmanned aerial vehicle having user interaction components and foldable structure
CN202023352348.7U CN214356664U (zh) Multi-rotor unmanned aerial vehicle

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2020/139505 WO2022134024A1 (fr) Unmanned aerial vehicle having user interactive components and a foldable structure

Publications (1)

Publication Number Publication Date
WO2022134024A1 true WO2022134024A1 (fr) 2022-06-30

Family

ID=77952863

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/139505 WO2022134024A1 (fr) Unmanned aerial vehicle having user interactive components and a foldable structure

Country Status (2)

Country Link
CN (2) CN116194369A (fr)
WO (1) WO2022134024A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104580606A (zh) * 2015-02-02 2015-04-29 金陵科技学院 Mobile phone shell convertible into an unmanned aerial vehicle
CN105035303A (zh) * 2015-08-19 2015-11-11 无锡觅睿恪科技有限公司 Foldable aerial-photography aircraft
CN106794896A (zh) * 2014-10-08 2017-05-31 韩华泰科株式会社 Unmanned aerial vehicle
US9977434B2 (en) * 2016-06-23 2018-05-22 Qualcomm Incorporated Automatic tracking mode for controlling an unmanned aerial vehicle
CN110475717A (zh) * 2017-01-19 2019-11-19 维趣斯有限公司 Indoor mapping and modular control for UAVs and other autonomous vehicles, and associated systems and methods

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106794896A (zh) * 2014-10-08 2017-05-31 韩华泰科株式会社 Unmanned aerial vehicle
CN104580606A (zh) * 2015-02-02 2015-04-29 金陵科技学院 Mobile phone shell convertible into an unmanned aerial vehicle
CN105035303A (zh) * 2015-08-19 2015-11-11 无锡觅睿恪科技有限公司 Foldable aerial-photography aircraft
US9977434B2 (en) * 2016-06-23 2018-05-22 Qualcomm Incorporated Automatic tracking mode for controlling an unmanned aerial vehicle
CN110475717A (zh) * 2017-01-19 2019-11-19 维趣斯有限公司 Indoor mapping and modular control for UAVs and other autonomous vehicles, and associated systems and methods

Also Published As

Publication number Publication date
CN116194369A (zh) 2023-05-30
CN214356664U (zh) 2021-10-08

Similar Documents

Publication Publication Date Title
US11572196B2 (en) Methods and systems for movement control of flying devices
US20210116944A1 (en) Systems and methods for uav path planning and control
JP6816156B2 (ja) System and method for adjusting a UAV trajectory
US9493232B2 (en) Remote control method and terminal
US20190291864A1 (en) Transformable apparatus
JP6234679B2 (ja) Method for piloting a rotary-wing drone to shoot with an onboard camera while minimizing disturbance-inducing movements
WO2018098704A1 (fr) Control method, apparatus and system, unmanned aerial vehicle, and movable platform
US20200148352A1 (en) Portable integrated uav
US20220137647A1 (en) System and method for operating a movable object based on human body indications
JP2013144539A (ja) Method for intuitively piloting a drone by means of a remote control device
WO2017166723A1 (fr) Unmanned aerial vehicle system and flight control method thereof
US20200382696A1 (en) Selfie aerial camera device
WO2022134024A1 (fr) Unmanned aerial vehicle having user interactive components and a foldable structure
WO2022056683A1 (fr) Field of view determination method, device and system, and medium
US20230033760A1 (en) Aerial Camera Device, Systems, and Methods
CN114641743A (zh) Unmanned aerial vehicle and control method and system therefor, handheld control device, and head-mounted device
JP7289152B2 (ja) Flight control system
WO2022134301A1 (fr) Unmanned aerial vehicle, control method and system therefor, handheld control device, and head-mounted device
JP2021036452A (ja) System and method for adjusting a UAV trajectory

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20966589

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20966589

Country of ref document: EP

Kind code of ref document: A1