CN110615095A - Hand-held remote control device and flight system set

Info

Publication number: CN110615095A (application); granted as CN110615095B
Application number: CN201910897267.8A
Authority: CN (China)
Original language: Chinese (zh)
Inventors: 王兆喆, 张通
Applicant/Assignee: Hangzhou Zero Zero Technology Co Ltd
Related application: PCT/CN2019/107716 (published as WO2020063631A1)
Legal status: Granted; active (the legal status is an assumption, not a legal conclusion)

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B64: AIRCRAFT; AVIATION; COSMONAUTICS
    • B64C: AEROPLANES; HELICOPTERS
    • B64C 27/00: Rotorcraft; Rotors peculiar thereto
    • B64C 27/04: Helicopters
    • B64C 27/08: Helicopters with two or more rotors
    • B64U: UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U 10/00: Type of UAV
    • B64U 10/10: Rotorcrafts
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 7/00: Television systems
    • H04N 7/18: Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • B64U 2201/00: UAVs characterised by their flight controls
    • B64U 2201/20: Remote controls

Abstract

The invention relates to a hand-held remote control device and a flight system kit. The hand-held remote control device includes: a support structure; a quick capture and release coupling mechanism disposed on the support structure and having a first magnetic component; and a controller for controlling the first magnetic component such that it has at least a first operating state, in which the first magnetic component produces magnetic attraction with a second magnetic component of the flight system, and a second operating state, in which it does not. The flight system kit includes: the hand-held remote control device; and a flight system including the second magnetic component, which is capable of magnetic attraction with the first magnetic component of the hand-held remote control device.

Description

Hand-held remote control device and flight system set
Technical Field
The present disclosure relates generally to the field of flight systems, and more particularly, to a handheld remote control device for a flight system and a flight system kit including the handheld remote control device and the flight system.
Background
The initiation of flight (i.e., "takeoff") and landing are key steps in the autonomous or remote control of a flight system. Current takeoff and landing operations are performed with respect to the ground or the user's hand, and both are relatively complex. Performing these operations from the ground can be difficult because ground surfaces are often uneven and unpredictable. Performing them from or to the user's hand introduces other problems, such as an unstable supporting surface, and can raise additional concerns for the user's safety.
The present disclosure is directed to one or more of the problems set forth above.
Disclosure of Invention
In a first aspect of the present disclosure there is provided a hand-held remote control device for use with a flight system, the hand-held remote control device comprising: a support structure; a quick capture and release coupling mechanism disposed on the support structure and having a first magnetic component; and a controller for controlling the first magnetic member such that the first magnetic member can have at least a first operating state in which the first magnetic member produces magnetic attraction with the second magnetic member of the flight system and a second operating state in which the first magnetic member does not produce magnetic attraction with the second magnetic member of the flight system.
According to a first aspect, the first magnetic component is an electromagnet, in the first operating state the controller switches on the current through the first magnetic component so that the electromagnet magnetically attracts the second magnetic component of the flight system, and in the second operating state the controller switches off the current through the first magnetic component so that the electromagnet does not magnetically attract the second magnetic component of the flight system.
According to the first aspect, the controller may be configured to control a direction of current passing through the first magnetic member so that the first magnetic member has a third operating state different from the first and second operating states, and in the third operating state, the magnetism of the first magnetic member is the same as that of the second magnetic member so that the first magnetic member and the second magnetic member repel each other.
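The three operating states above can be sketched as a simple current command: switching the coil current on with attracting polarity, off, or reversed. The following is a minimal illustration, assuming a signed-current convention and a drive current value that are not specified in the patent.

```python
from enum import Enum

class MagnetState(Enum):
    ATTRACT = "attract"  # first operating state: current on, opposite poles face
    OFF = "off"          # second operating state: current off, no attraction
    REPEL = "repel"      # third operating state: current reversed, like poles face

class ElectromagnetController:
    """Sketch of the controller's current control. The drive current value and
    the sign convention (positive = attracting polarity) are assumptions for
    illustration, not values from the patent."""

    def __init__(self, drive_current_a: float = 1.5):
        self.drive_current_a = drive_current_a
        self.state = MagnetState.OFF

    def set_state(self, state: MagnetState) -> float:
        """Return the signed coil current commanded for the requested state."""
        self.state = state
        if state is MagnetState.ATTRACT:
            return +self.drive_current_a
        if state is MagnetState.REPEL:
            return -self.drive_current_a
        return 0.0
```

In a real device the signed current would drive an H-bridge around the electromagnet coil; here it is returned directly so the mapping of state to current direction is explicit.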
According to the first aspect, the first magnetic component is a permanent magnet, and the hand-held remote control device further includes a first magnetic component driving mechanism disposed on the support structure. The driving mechanism can move the first magnetic component between a first position and a second position: in the first position, the first magnetic component and the second magnetic component can produce a first magnetic attraction force; in the second position, no magnetic attraction force is produced between the first and second magnetic components, or a second magnetic attraction force smaller than the first magnetic attraction force is produced.
According to the first aspect, the first magnetic member driving mechanism includes a motor, a guide rail, a movable tray connected to the guide rail, and a screw connected to the motor, the first magnetic member being fixed to the movable tray, the screw being screwed through a threaded hole of the movable tray, the motor being capable of driving the screw to rotate so that the movable tray is capable of moving up and down along the guide rail to move the first magnetic member between the first position and the second position.
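For a lead-screw drive like the one described, the tray's linear travel is simply the screw lead (axial advance per revolution) times the motor revolutions. A small sketch of that arithmetic, with illustrative numbers since the patent gives none:

```python
def revolutions_for_travel(travel_mm: float, screw_lead_mm: float) -> float:
    """Revolutions the motor must turn so the movable tray (and the permanent
    magnet fixed to it) advances `travel_mm` along the guide rail.
    `screw_lead_mm` is the axial distance the threaded tray advances per
    screw revolution; both values are illustrative assumptions."""
    if screw_lead_mm <= 0:
        raise ValueError("screw lead must be positive")
    return travel_mm / screw_lead_mm
```

For example, moving the magnet 10 mm between the first and second positions with a 2 mm lead screw takes five motor revolutions; a finer lead trades speed for holding force and positioning resolution.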
According to a first aspect, the support structure comprises an elongate tube within which a battery is housed.
According to a first aspect, the hand-held remote control device comprises a display screen arranged on the support structure and connected to the controller, the display screen being capable of receiving and displaying image signals from the flight system or the hand-held remote control device.
According to a first aspect, the hand-held remote control device comprises one or more cameras.
According to a first aspect, the hand-held remote control device comprises a camera driving mechanism for driving the camera to translate and/or rotate.
According to a first aspect, the hand-held remote control device comprises one or more microphones and/or one or more loudspeakers.
According to a first aspect, the support structure comprises a recess for accommodating at least a part of a flight system.
According to a first aspect, the recess of the support structure is capable of accommodating the flight system such that the profile of the flight system is enclosed within the recess.
According to a first aspect, the support structure comprises a catch into which a corresponding latch provided on the flight system can releasably snap.
According to the first aspect, the direction in which the latch on the flight system snaps into the catch is the same as the direction of magnetic attraction between the first magnetic component and the second magnetic component.
According to a first aspect, the support structure further comprises a catch drive mechanism capable of driving the catch to release the latch of the flight system.
According to a first aspect, the catch drive mechanism is controlled by the controller, and the controller is configured to control the catch drive mechanism and the first magnetic component such that the operation of the catch releasing the latch of the flight system is substantially synchronized with the first magnetic component entering the second operating state.
According to a first aspect, the support structure is provided with a sensor capable of detecting whether the catch is in a locked state in which it locks the latch, the controller being configured to control the first magnetic component to enter the second operating state upon detecting that the catch is in the locked state.
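The sensor-gated behavior described above amounts to an interlock: the magnet is only allowed into its non-attracting state once the mechanical latch is confirmed locked, so the flight system is always retained either magnetically or mechanically. A minimal sketch, with state labels that are illustrative rather than from the patent:

```python
def magnet_state_for(latch_locked: bool) -> str:
    """Interlock sketch: command the first magnetic component into its second
    (non-attracting) operating state only when the sensor reports the catch
    has locked the latch; otherwise keep attracting."""
    return "second_state_off" if latch_locked else "first_state_attract"
```

This ordering avoids a hand-off gap in which neither the magnet nor the latch holds the flight system against the support structure.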
According to a first aspect, the support structure is provided with a flight system controller coupling mechanism for mounting the flight system controller on the support structure.
According to a first aspect, the hand-held remote control device is capable of charging a flight system controller connected to the flight system controller coupling mechanism by wired or wireless means, and/or the flight system controller connected to the flight system controller coupling mechanism is capable of charging the hand-held remote control device by wired or wireless means.
According to a first aspect, the hand-held remote control device is capable of charging a flight system in a wired or wireless manner, and/or the flight system is capable of charging the hand-held remote control device in a wired or wireless manner.
According to a second aspect of the present disclosure, there is provided a flight system kit comprising: the hand-held remote control device; and a flight system including a second magnetic component capable of magnetic attraction with the first magnetic component of the handheld remote control device.
According to a second aspect, the flight system further comprises a latch that mates with the catch.
According to a second aspect, the second magnetic component is a permanent magnet or an electromagnet.
According to a second aspect, the flight system comprises a flight system body provided with the second magnetic component and magnetically attachable to the hand-held remote control device, and a power pack detachably connected to the flight system body, the power pack comprising at least one rotor and a motor driving the rotor.
According to a second aspect, the flight system body can be accommodated in the recess of the support structure.
Drawings
FIG. 1 is a schematic view of a flight system and a system for controlling a flight system including a handheld remote control device according to an embodiment of the present disclosure.
FIG. 2 is a view of an exemplary flight system according to an embodiment of the present disclosure.
Fig. 3 is a view of an exemplary optical system according to an embodiment of the present disclosure.
FIG. 4 is a second schematic view of a flight system according to an embodiment of the present disclosure.
FIG. 5 is a third schematic view of a flight system and a system for controlling a flight system according to an embodiment of the present disclosure.
FIG. 6 is a schematic view of a flight system including an obstacle detection and avoidance system, according to an embodiment of the present disclosure.
FIG. 7 is a schematic view of the hand-held remote control device of FIG. 1 along with an exemplary flight system according to a first embodiment of the present disclosure.
Fig. 8 is an exploded state diagram of fig. 7.
FIG. 9 is an exploded state diagram of the hand-held remote control device of FIG. 1 along with an exemplary flight system, wherein the quick-capture and release coupling mechanism is different from that of the first embodiment, according to a second embodiment of the present disclosure.
Fig. 10 is an enlarged view of section a of fig. 9, showing the permanent magnet or magnetizable material of the quick capture and release coupling mechanism in a raised position capable of magnetic attraction with the coupler member.
FIG. 11 is another exploded state diagram of the hand-held remote control device of FIG. 1 along with an exemplary flight system according to a second embodiment of the present disclosure.
Fig. 12 is an enlarged view of section B of fig. 11, showing the permanent magnet or magnetizable material of the quick capture and release coupling mechanism in a lowered position unable to magnetically attract the coupler member.
FIG. 13 is another schematic view of the hand-held remote control device of FIG. 1 in conjunction with an exemplary flight system according to a third embodiment of the present disclosure.
Fig. 14 is an exploded state view of fig. 13.
FIG. 15 is another schematic view of the hand-held remote control device of FIG. 1 in conjunction with an exemplary flight system according to a fourth embodiment of the present disclosure.
Fig. 16 is an exploded state view of fig. 15.
Fig. 17 and 18 are schematic views of the power module of the flight system removed from fig. 15.
Detailed Description
The following description of the embodiments of the present disclosure is not intended to limit the present disclosure to these embodiments, but is provided to enable any person skilled in the art to make and use the present disclosure. Referring to the drawings and in operation, a system 10 for controlling a flight system 12 (e.g., a drone) is provided. The system 10 includes a robot or flight system controller 14 having a control client 16. Control client 16 provides a user interface (see below) that allows user 18 to send instructions to flight system 12 to control the operation of flight system 12. As discussed in more depth below, the flight system 12 includes one or more cameras (see below) for obtaining pictures and/or video that may be sent to the flight system controller 14 and/or stored in memory on the flight system 12.
In one aspect of the present disclosure, a hand-held remote control device 8 is provided. As discussed in further detail below, the hand-held remote control device 8 provides support for the flight system 12 and the flight system controller 14. The hand-held remote control device 8 includes a quick capture and release mechanism (see below) for controllably releasing and capturing the flight system 12 during takeoff and landing operations, respectively.
The flight system 12 may include an obstacle detection and avoidance system 50. The obstacle detection and avoidance system 50 may include a pair of ultra-wide angle lens cameras 52A, 52B (see below) for providing obstacle detection and avoidance.
Flight system 12 may include one or more sensors (see below) for detecting or sensing an operation or action (e.g., expression) performed by user 18 to control the operation of flight system 12 (see below) without direct or physical interaction with flight system controller 14. In a controllerless embodiment, the entire control loop from start (release and hover) to end (catch and leave), and the triggering of the movements and events (e.g., taking pictures and videos) that control the flight system 12, are performed solely on the flight system 12 without involving the flight system controller 14. In some such embodiments or systems 10, the flight system controller 14 may not be provided or included.
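The controllerless control loop described above, from start (release and hover) to end (catch and leave) with event-triggered actions in between, can be sketched as an on-board state machine. The state and event names below are illustrative assumptions, not terms from the patent:

```python
class OnboardControlLoop:
    """Minimal sketch of the controllerless loop: every transition runs on the
    flight system itself, with no flight system controller involved."""

    TRANSITIONS = {
        ("captured", "release"): "hovering",     # start: release and hover
        ("hovering", "take_photo"): "hovering",  # triggered event; state unchanged
        ("hovering", "catch"): "captured",       # end: catch and leave
    }

    def __init__(self):
        self.state = "captured"

    def on_event(self, event: str) -> str:
        """Apply a recognized event; unknown (state, event) pairs are ignored."""
        self.state = self.TRANSITIONS.get((self.state, event), self.state)
        return self.state
```

Events such as "take_photo" would be produced by the on-board sensing described above (e.g., gesture or expression recognition) rather than by a separate controller.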
In some embodiments, flight system controller 14 includes one or more sensors that detect or sense operations or actions performed by user 18 in order to control the operation of flight system 12 without physically interacting with flight system controller 14 under certain conditions (e.g., when flight system 12 is too far from user 18).
Overview of System 10 and flight System 12
An exemplary flight system 12 and control system 10 are shown in fig. 1-5. The control client 16 of the flight system 12 is used to receive data, including images and/or video, from the flight system 12 and to control the visual display on the flight system controller 14. The control client 16 may also receive operational instructions and facilitate remote control of the flight system 12 based on the operational instructions. The control client 16 is preferably configured to run on the flight system controller 14, but can alternatively be configured to run on the flight system 12 or on any other suitable system. As discussed above, and discussed more fully below, the flight system 12 may be controlled by itself without direct or physical interaction with the flight system controller 14.
The control client 16 can be a native application (e.g., a mobile application), a browser application, an operating system application, or any other suitable construct.
The flight system controller 14 executing the control client 16 is used to display data (e.g., data is displayed as indicated by the control client 16), receive user input, calculate operational instructions based on the user input (e.g., operational instructions are calculated based on the user input as indicated by the control client 16), send the operational instructions to the flight system 12, store information for the control client (e.g., an associated flight system identifier, a security key, user account information, user account preferences, etc.), or perform any other suitable function. Flight system controller 14 can be a user device (e.g., a smartphone, tablet, laptop, etc.), a networked server system, or any other suitable remote computing system. The flight system controller 14 can include one or more of: an output, an input, a communication system, a sensor, a power supply, a processing system (e.g., CPU, memory, etc.), or any other suitable component. The output can include: a display (e.g., an LED display, an OLED display, an LCD, etc.), an audio speaker, a light (e.g., an LED), a tactile output (e.g., a tactile pixel (tile) system, a vibrating motor, etc.), or any other suitable output. The input can include: a touch screen (e.g., capacitive touch screen, resistive touch screen, etc.), a mouse, a keyboard, a motion sensor, a microphone, a biometric input, a camera, or any other suitable input. The communication system can include wireless connections, such as radios that support: a long-range system (e.g., Wi-Fi, cellular, WLAN, WiMAX, microwave, IR, radio frequency, etc.), a short-range system (e.g., BLE long-range, NFC, ZigBee, RF, audio, optical, etc.), or any other suitable communication system. The sensor can include: an orientation sensor (e.g., accelerometer, gyroscope, etc.), an ambient light sensor, a temperature sensor, a pressure sensor, an optical sensor, an acoustic sensor, or any other suitable sensor. 
In one variation, the flight system controller 14 can include a display (e.g., a touch sensitive display including a touch screen that overlaps the display), a set of radios (e.g., Wi-Fi, cellular, BLE, etc.), and a set of orientation sensors. However, flight system controller 14 can include any suitable collection of components.
The flight system 12 is used to fly, capture video within a physical space, stream the video to the flight system controller 14 in near real-time, and operate based on operational instructions received from the flight system controller 14.
The flight system 12 can additionally process video (e.g., video frames) and/or process audio received from on-board audio sensors prior to streaming the video to the flight system controller 14; generate its own operation instructions and automatically operate based on the instructions (e.g., to automatically follow an object); or perform any other suitable function. The flight system 12 can additionally be used to move the field of view of the optical sensor within the physical space. For example, flight system 12 can control macroscopic movement (e.g., large field of view (FOV) changes, meter-level adjustments), microscopic movement (e.g., small field of view (FOV) changes, millimeter or centimeter-level adjustments), or any other suitable movement.
The flight system 12 is capable of performing certain functions based on on-board processing of sensor data from on-board sensors. Such functions may include, but are not limited to:
- take-off and landing;
- owner identification;
- face recognition;
- speech recognition;
- facial expression and gesture recognition; and
- controlling the flight system, e.g. controlling its movements, based on owner identification, face, expression, and gesture recognition, and speech recognition.
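The last item in the list above implies a dispatch from recognition results to flight commands. A hypothetical sketch of such a mapping; the labels and command names are invented for illustration and do not appear in the patent:

```python
def command_for(recognition_result: str) -> str:
    """Hypothetical dispatch from an on-board recognition result (owner,
    face, gesture, or speech) to a flight command."""
    table = {
        "owner_confirmed": "unlock_controls",
        "gesture_palm_up": "hover_hold",
        "voice_land": "land",
    }
    # Unrecognized or low-confidence results are safely ignored.
    return table.get(recognition_result, "ignore")
```

Keeping the mapping in a single table makes the recognized vocabulary easy to audit, which matters when recognition alone can command a flying vehicle.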
As shown in fig. 2-5, the flight system 12 (e.g., drone) can include a body 20, a processing system 22, a communication system 24, an optical system 26, and an actuation mechanism 28 that mounts the optical system 26 to the body 20. The flight system 12 can additionally or alternatively include a lift mechanism, a sensor, a power system, or any other suitable component (see below).
The body 20 of the flight system 12 is configured to mechanically protect and/or maintain the flight system components. The body 20 can define a lumen (an interior cavity), be a platform, or have any suitable configuration. The body 20 can be closed, open (e.g., truss), or have any suitable configuration. The body 20 can be made of metal, plastic (e.g., polymer), carbon composite, or any other suitable material. The body 20 can define a longitudinal axis, a transverse axis, a lateral axis, a front end, a rear end (e.g., opposite the front end along the longitudinal axis), a top, a bottom (e.g., opposite the top along the lateral axis), or any other suitable reference. In one variation, while in flight, the lateral axis of the body 20 can be substantially parallel to the gravity vector (e.g., perpendicular to the ground), and the longitudinal and transverse axes of the body can be substantially perpendicular to the gravity vector (e.g., parallel to the ground). However, the body 20 can be configured in other ways.
The processing system 22 of the flight system 12 is used to control flight system operations. The processing system 22 is capable of: receive operating instructions from the communication system 24, interpret the operating instructions as machine instructions, and control flight system components based on the machine instructions (individually or as a group). The processing system 22 can additionally or alternatively process images recorded by the cameras, stream images to the flight system controller 14 (e.g., in real-time or near real-time), or perform any other suitable function. The processing system 22 may include one or more of: a processor 32 (e.g., CPU, GPU, etc.), memory (e.g., flash memory, RAM, etc.), or any other suitable processing component. In one variation, the processing system 22 can additionally include dedicated hardware that automatically processes (e.g., restores, filters, crops, etc.) the images prior to transmission to the flight system controller 14. The processing system 22 is preferably connected to the movable components of the flight system 12 and mounted to the body 20, but can alternatively be otherwise associated with the flight system components.
The flight system communication system 24 is used to transmit and/or receive information from the flight system controller 14. Communication system 24 is preferably connected to processing system 22 such that communication system 24 transmits data to processing system 22 and/or receives data from processing system 22, but can alternatively be connected to any other suitable component. The flight system 12 can include one or more communication systems 24 of one or more types. The communication system 24 can include wireless connections, such as radios that support the following systems: a long-range system (e.g., Wi-Fi, cellular, WLAN, WiMAX, microwave, IR, radio frequency, etc.), a short-range system (e.g., BLE long-range, NFC, ZigBee, RF, audio, optical, etc.), or any other suitable communication system 24. The communication system 24 preferably shares at least one system protocol (e.g., BLE, RF, etc.) with the flight system controller 14, but can instead communicate with the flight system controller 14 via an intermediate communication system (e.g., a protocol conversion system). However, the communication system 24 can be configured in other ways.
The optical system 26 of the flying system 12 is used to record images of the physical space proximal to the flying system 12. The optical system 26 is preferably mounted to the body 20 via an actuation mechanism 28, but can alternatively be statically mounted to the body 20, removably mounted to the body 20, or otherwise mounted to the body 20. The optical system 26 is preferably mounted to the front end of the body 20, but can optionally be mounted to the bottom (e.g., near the front), the top, the back end, or any other suitable portion of the body 20. The optical system 26 is preferably connected to the processing system 22, but can alternatively be connected to the communication system 24 or to any other suitable system. The optical system 26 can additionally include dedicated image processing hardware that automatically processes images recorded by the camera before the images are transmitted to a processor or other endpoint. The flying system 12 can include one or more optical systems 26 of the same or different types mounted to the same or different locations. In one variation, the flying system 12 includes a first optical system 26 mounted to the front end of the body 20 and a second optical system 26 mounted to the bottom of the body 20. The first optical system 26 can be actuated about a pivot support, while the second optical system 26 can be held substantially statically with respect to the body 20, with the respective active surface substantially parallel to the body bottom. The first optical sensor 36 can be high resolution, while the second optical sensor 36 can be low resolution. However, the optical system 26 can be configured in other ways.
The optical system 26 can include one or more optical sensors 36 (see fig. 5). The one or more optical sensors 36 can include: a single lens camera (e.g., a CCD camera, a CMOS camera, etc.), a stereo camera, a hyperspectral camera, a multispectral camera, or any other suitable image sensor. However, the optical system 26 can be any other suitable optical system 26. The optical system 26 can define one or more active surfaces that receive light, but can alternatively include any other suitable components. For example, the active surface of the camera can be the active surface of a camera sensor (e.g., a CCD sensor, a CMOS sensor, etc.), preferably comprising a regular array of sensor pixels. The camera sensor or other active surface is preferably substantially planar and rectangular (e.g., having a first sensor edge, a second sensor edge opposite the first sensor edge, and third and fourth sensor edges each perpendicular to and extending from the first sensor edge to the second sensor edge), but can alternatively have any suitable shape and/or topography. The optical sensor 36 is capable of generating image frames. The image frames preferably correspond to the shape of the active surface (e.g., rectangular, having first and second frame edges opposite each other, etc.), more preferably define a regular array of pixel locations, each pixel location corresponding to a sensor pixel of the active surface and/or a pixel of the image sampled by the optical sensor 36, but can alternatively have any suitable shape. The image frames preferably define various aspects of the image sampled by the optical sensor 36 (e.g., image size, resolution, pixel size and/or shape, etc.). The optical sensor 36 can optionally include a zoom lens, digital zoom, fisheye lens, filter, or any other suitable active or passive optical adjustment.
The application of the optical adjustment can be actively controlled by the controller, manually controlled by the user 18 (e.g., where the user manually sets the adjustment), controlled by the flight system controller 14, or otherwise controlled. In one variation, the optical system 26 can include a housing that encloses the remainder of the optical system assembly, where the housing is mounted to the body 20. However, the optical system 26 can be configured in other ways.
Actuation mechanism 28 of flying system 12 is used to movably mount optical system 26 to body 20. The actuation mechanism 28 can additionally be used to dampen optical sensor vibrations (e.g., mechanically stabilize the resulting image), accommodate roll of the flight system, or perform any other suitable function. The actuation mechanism 28 can be active (e.g., controlled by a processing system), passive (e.g., controlled by a set of weights, spring elements, magnetic elements, etc.), or otherwise controlled. The actuation mechanism 28 can rotate the optical system 26 relative to the body about one or more axes, translate the optical system 26 relative to the body along one or more axes, or otherwise actuate the optical system 26. The one or more optical sensors 36 can be mounted to the support along a first end, along a rear of the optical sensor (e.g., opposite the active surface), through the optical sensor body, or along any other suitable portion of the optical sensor 36.
In one variation, the actuation mechanism 28 can include a motor (not shown) connected to a single pivotal support (e.g., a gimbal), wherein the motor pivots the support about a rotational (or gimbal) axis 34 based on instructions received from a controller. The support is preferably arranged such that the axis of rotation is substantially parallel to the transverse axis of the body 20, but can alternatively be arranged such that the axis of rotation is in any other suitable orientation relative to the body 20. The support is preferably arranged within a cavity defined by the body 20, wherein the cavity also surrounds the optical sensor 36, but could alternatively be arranged along the exterior of the body or at any other suitable portion of the body 20. The optical sensor 36 is preferably mounted to a support with the active surface substantially parallel to the axis of rotation (e.g., such that the transverse axis of the body 20 or an axis parallel to the transverse axis is substantially parallel to the axis of rotation), but can alternatively be arranged such that the active surface is arranged at any suitable angle relative to the axis of rotation.
The motor is preferably an electric motor, but can alternatively be any other suitable motor. Examples of motors that can be used include: a DC motor (e.g., a brushed motor), an EC motor (e.g., a brushless motor), an induction motor, a synchronous motor, a magneto (magnetic motor), or any other suitable motor. The motor is preferably mounted to the body 20 (e.g., inside the body), electrically connected to the processing system 22 and controlled by the processing system 22, electrically connected to a power source or system 38 and powered by the power source or system 38. However, the motor can be connected in other ways. The actuation mechanism 28 preferably comprises a single motor support, but can alternatively comprise a plurality of motor supports, wherein the auxiliary motor support can be arranged orthogonal to (or at any other suitable angle to) the first motor support.
In a second variation, the actuation mechanism 28 can include a set of pivoting supports and weights connected to the optical sensor 36 that are offset from the center of gravity of the optical sensor, wherein the actuation mechanism 28 passively stabilizes the optical sensor 36.
The lift mechanism 40 of the flight system 12 is used to enable the flight system to fly. The lift mechanism 40 preferably includes a set of propeller blades 42 driven by a motor (not shown), but can alternatively include any other suitable propulsion mechanism. The lift mechanism 40 is preferably mounted to the body 20 and controlled by the processing system 22, but can alternatively be otherwise mounted to the flight system 12 and/or otherwise controlled. The flight system 12 can include a plurality of lift mechanisms 40. In one example, the flight system 12 includes four lift mechanisms 40 (e.g., two pairs of lift mechanisms 40), wherein the lift mechanisms 40 are substantially evenly distributed about a perimeter of the flight system 12 (e.g., wherein the lift mechanisms 40 of each pair oppose each other across the body 20). However, the lift mechanism 40 can be configured in other ways.
Additional sensors 44 of the flight system are used to record signals indicative of flight system operation, the ambient environment surrounding the flight system 12 (e.g., the physical space proximate the flight system 12), or any other suitable parameter. The sensor 44 is preferably mounted to the body 20 and controlled by the processing system 22, but can alternatively be mounted to any other suitable component and/or otherwise controlled. The flight system 12 can include one or more sensors 36, 44. Examples of sensors that can be used include: an orientation sensor (e.g., accelerometer, gyroscope, etc.), an ambient light sensor, a temperature sensor, a pressure sensor, an optical sensor, an acoustic sensor (e.g., microphone), a voltage sensor, a current sensor, or any other suitable sensor.
The power supply 38 of the flight system 12 is used to power the active components of the flight system 12. The power source 38 is preferably mounted to the body 20 and is electrically connected (e.g., directly or indirectly) to all active components of the flight system 12, but can be otherwise arranged. The power source 38 can be a primary battery, a secondary battery (e.g., a rechargeable battery), a fuel cell, an energy harvester (e.g., solar, wind, etc.), or any other suitable power source. Examples of secondary batteries that can be used include: lithium chemistry (e.g., lithium ion polymer, etc.), nickel chemistry (e.g., NiCad, NiMH, etc.), or a battery having any other suitable chemistry.
One or more flight systems 12 can optionally be used with a remote computing system or with any other suitable system. The flight system 12 is used for flight, and can additionally be used to take pictures, transport loads, and/or relay wireless communications. The flight system 12 is preferably a rotary wing aircraft (e.g., a quadcopter, a helicopter, a cyclocopter, etc.), but can alternatively be a fixed wing aircraft, an aerostat, or any other suitable flight system 12. The flight system 12 can include: a lift mechanism 40; a power source 38; the sensors 36, 44; a processing system 22; a communication system 24; a main body 20; and/or any other suitable components.
The lift mechanism 40 of the flight system is used to provide lift and preferably comprises a set of rotors driven (individually or collectively) by one or more motors. Each rotor is preferably configured to rotate about a corresponding rotor axis, define a corresponding rotor plane perpendicular to its rotor axis, and sweep out a swept area in its rotor plane. The motor is preferably configured to provide sufficient power to the rotor to enable flight of the flight system, and more preferably is operable in two or more modes, at least one of which includes providing sufficient power for flight, and at least one of which includes providing less power than is required for flight (e.g., providing zero power, providing 10% of minimum flight power, etc.). The power provided by the motors preferably affects the angular speed at which the rotors rotate about their rotor axes. During flight of the flight system, the set of rotors is preferably configured to cooperatively or individually generate (e.g., by rotating about their rotor axes) almost all (e.g., more than 99%, more than 95%, more than 90%, more than 75%) of the total aerodynamic force generated by the flight system 12 (possibly excluding drag forces generated by the body 20 during flight, such as at high airspeed). Alternatively or additionally, flight system 12 can include any other suitable flight components for generating forces for flight of the flight system, such as jet engines, rocket engines, wings, solar sails, and/or any other suitable force generating components.
In one variation, flight system 12 includes four rotors, each rotor disposed at one corner of the flight system body. The four rotors are preferably substantially evenly distributed about the flight system body, and each rotor plane is preferably substantially parallel (e.g., within 10 degrees) to a transverse plane (e.g., encompassing the longitudinal axis and the transverse axis) of the flight system body. The rotors preferably occupy a relatively large portion of the entire flight system 12 (e.g., 90%, 80%, 75%, or a majority of the flight system footprint, or any other suitable proportion of the flight system 12). For example, the sum of the squares of the diameters of the rotors may be greater than a threshold fraction (e.g., 10%, 50%, 75%, 90%, 110%, etc.) of the area of the convex hull of flight system 12 projected onto a main plane (e.g., a transverse plane) of the flight system. However, the rotors can be arranged in other ways.
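The rotor-sizing relationship above can be sketched as a short check. This is purely illustrative: the rotor diameters, hull area, and 50% threshold below are assumed example values, not values fixed by this disclosure.

```python
# Illustrative check of the rotor-sizing relationship described above:
# the sum of the squared rotor diameters is compared against a threshold
# fraction of the flight system's projected convex-hull area.

def rotors_satisfy_footprint(diameters_m, hull_area_m2, threshold=0.5):
    """Return True if sum(d^2) exceeds `threshold` times the hull area."""
    return sum(d ** 2 for d in diameters_m) > threshold * hull_area_m2

# Example: four 76 mm rotors on a body whose projected convex hull is 0.03 m^2.
print(rotors_satisfy_footprint([0.076] * 4, 0.03, threshold=0.5))  # True
```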
The flight system power supply 38 is used to power active components of the flight system 12 (e.g., the lift mechanism motors, etc.). The power source 38 can be mounted to the body 20 and connected to the active components, or otherwise arranged. The power source 38 can be a rechargeable battery, a secondary battery, a primary battery, a fuel cell, or any other suitable power source.
The sensors 36, 44 of the flight system are used to acquire signals indicative of the surroundings of the flight system and/or the operation of the flight system. The sensors 36, 44 are preferably mounted to the body 20, but can alternatively be mounted to any other suitable component. The sensors 36, 44 are preferably powered by the power source 38 and controlled by the processor, but can be connected to and interact with any other suitable component. The sensors 36, 44 can include one or more of: cameras (e.g., CCD cameras, CMOS cameras, multispectral cameras, visual range cameras, hyperspectral cameras, stereo cameras, etc.), orientation sensors (e.g., inertial measurement sensors, accelerometers, gyroscopes, altimeters, magnetometers, etc.), audio sensors (e.g., transducers, microphones, etc.), barometers, light sensors, temperature sensors, current sensors (e.g., Hall effect sensors), air flow meters, voltmeters, touch sensors (e.g., resistive touch sensors, capacitive touch sensors, etc.), proximity sensors, force sensors (e.g., strain gauges, load cells), vibration sensors, chemical sensors, sonar sensors, position sensors (e.g., GPS position sensors, GNSS position sensors, triangulation position sensors, etc.), or any other suitable sensor. In one variant, the flight system 12 comprises: a first camera mounted (e.g., statically or rotatably) along a first end of the body of the flight system, a field of view of the first camera intersecting a transverse plane of the body; a second camera mounted along the bottom of the flight system body, the field of view of the second camera being substantially parallel to the transverse plane; and a set of orientation sensors, such as altimeters and accelerometers. However, the system can include any suitable number of any sensor types.
The processing system 22 of the flight system is used to control the operation of the flight system. The processing system 22 is capable of performing the following: stabilizing flight system 12 during flight (e.g., selectively operating the rotors to minimize wobble of the flight system in flight); receiving and interpreting remote control commands and operating the flight system 12 based on the remote control commands; or otherwise controlling the operation of the flight system. The processing system 22 is preferably configured to receive and interpret the measurements sampled by the sensors 36, 44, more preferably by combining the measurements sampled by different sensors (e.g., combining camera and accelerometer data). The flight system 12 can include one or more processing systems, wherein different processors can perform the same function (e.g., function as a multi-core system) or can be specialized. The processing system 22 can include one or more of: a processor (e.g., CPU, GPU, microprocessor, etc.), memory (e.g., flash memory, RAM, etc.), or any other suitable component. The processing system 22 is preferably mounted to the body 20, but can alternatively be mounted to any other suitable component. The processing system 22 is preferably powered by the power source 38, but can be otherwise powered. Processing system 22 is preferably connected to and controls the sensors 36, 44, the communication system 24, and the lift mechanism 40, although processing system 22 can additionally or alternatively be connected to and interact with any other suitable component.
The communication system 24 of the flight system is used to communicate with one or more remote computing systems. The communication system 24 can be a long-range communication module, a short-range communication module, or any other suitable communication module. The communication system 24 can facilitate wired and/or wireless communication. Examples of communication systems 24 include 802.11x, Wi-Fi, Wi-Max, NFC, RFID, Bluetooth Low energy, ZigBee, cellular communications (e.g., 2G, 3G, 4G, LTE, etc.), Radio (RF), wired connections (e.g., USB), or any other suitable communication system 24 or combination thereof. The communication system 24 is preferably powered by the power source 38, but can be powered in other ways. The communication system 24 is preferably connected to the processing system 22, but can additionally or alternatively be connected to and interact with any other suitable component.
The body 20 of the flight system is used to support the components of the flight system. The body can additionally be used to protect components of the flight system. The body 20 preferably substantially encloses the communication system 24, the power supply 38, and the processing system 22, but can be otherwise configured. The body 20 can include a platform, a housing, or have any other suitable configuration. In one variation, the main body 20 includes a main portion housing the communication system 24, the power source 38, and the processing system 22, and first and second frames (e.g., holders) extending parallel to the rotor rotation plane and disposed along first and second sides of the main portion. These frames can serve as an intermediate assembly between the rotating rotors and a retention mechanism (e.g., a user's hand). The frames can extend along a single side of the body 20 (e.g., along the bottom of the rotors, along the top of the rotors), along first and second sides of the body 20 (e.g., along the top and bottom of the rotors), enclose the rotors (e.g., extend along all sides of the rotors), or otherwise be configured. The frames can be statically mounted to the body 20 or can be actuatably mounted to the body 20.
The frame can include one or more apertures (e.g., airflow apertures) that fluidly connect one or more rotors to the surrounding environment, which can be used to enable air and/or other suitable fluids to flow between the surrounding environment and the rotors (e.g., thereby enabling the rotors to generate aerodynamic forces that move the flight system 12 throughout the surrounding environment). The airflow apertures can be elongated or can have a substantial length and width. The airflow apertures can be substantially identical or can differ from one another. Each airflow aperture is preferably small enough to prevent components of the retention mechanism (e.g., fingers of a hand) from passing through it. The geometric transparency (e.g., ratio of open area to total area) of the frame near the rotors is preferably large enough to enable flight of the flight system, more preferably to enable high-performance maneuvers. For example, each airflow aperture can be smaller than a threshold size (e.g., having all dimensions smaller than the threshold size; or being an elongated slot narrower than the threshold size but significantly longer than it; etc.). In one particular example, the frame has a geometric transparency of 80-90%, and each airflow aperture (e.g., circular, polygonal such as a regular hexagon, etc.) defines a circumscribed circle having a diameter of 12-16 millimeters. However, the body can be configured in other ways.
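The geometric-transparency example above can be illustrated numerically. The aperture count and frame area below are assumed values chosen for illustration; the 14 mm circumscribed-circle diameter falls within the 12-16 mm range stated above.

```python
import math

# Illustrative calculation of the frame's geometric transparency
# (ratio of open area to total area) for regular-hexagon apertures.

def hexagon_area(circumscribed_diameter):
    """Area of a regular hexagon from its circumscribed-circle diameter."""
    r = circumscribed_diameter / 2.0            # circumradius
    return 3.0 * math.sqrt(3.0) / 2.0 * r ** 2

def geometric_transparency(n_apertures, aperture_area, total_frame_area):
    return n_apertures * aperture_area / total_frame_area

a = hexagon_area(14.0)                          # ~127.3 mm^2 per aperture
t = geometric_transparency(64, a, 9600.0)       # 64 apertures, 9600 mm^2 frame
print(round(t, 2))                              # 0.85, within the 80-90% range
```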
The body 20 (and/or any other suitable component of the flight system) can define a holding area that can be held by a holding mechanism (e.g., a human hand, a flight system dock, a pawl, etc.). The holding area preferably surrounds a portion of one or more rotors, and more preferably completely surrounds all rotors, thereby preventing any inadvertent interaction between the rotors and the holding mechanism or other objects proximate flight system 12. For example, a projection of the holding area onto a flight system plane (e.g., a transverse plane, a rotor plane, etc.) can overlap (e.g., partially, completely, mostly, at least 90%, etc.) a projection of the swept area of one or more rotors (e.g., the swept area of one rotor, the total swept area of the set of rotors, etc.) onto the same flight system plane.
The flight system 12 can additionally include inputs (e.g., microphone, camera, etc.), outputs (e.g., display, speaker, light emitting elements, etc.), or any other suitable components.
The remote computing system is used to receive auxiliary user input and can additionally be used to automatically generate and transmit control instructions for one or more flight systems 12 to the one or more flight systems 12. Each flight system 12 can be controlled by one or more remote computing systems. The remote computing system preferably controls the flight system 12 through a client (e.g., a local application, a browser application, etc.), but can otherwise control the flight system 12. The remote computing system can be a user device, a remote server system, a connected appliance, or any other suitable system. Examples of user devices include a tablet, a smartphone, a mobile phone, a laptop, a watch, a wearable device (e.g., glasses), or any other suitable user device. The user device can include a power storage device (e.g., a battery), a processing system (e.g., a CPU, GPU, memory, etc.), user output (e.g., a display, speakers, vibration mechanism, etc.), user input (e.g., a keyboard, a touchscreen, a microphone, etc.), a positioning system (e.g., a GPS system), sensors (e.g., optical sensors such as light sensors and cameras, orientation sensors such as accelerometers, gyroscopes, and altimeters, audio sensors such as microphones, etc.), a data communication system (e.g., a Wi-Fi module, BLE, cellular module, etc.), or any other suitable component.
The system 10 may be configured for controllerless user-drone interaction. Typically, the flight system or drone 12 requires a separate device, such as a flight system controller 14. The flight system controller 14 may be implemented in different types of devices, including, but not limited to, a ground station, a remote control, or a mobile phone, among others. In some embodiments, control of flight system 12 may be accomplished by the user through a user expression, without use of the flight system controller 14. The user expressions may include, but are not limited to, any activity performed by the user that does not include physical interaction with the flight system controller 14, including thoughts (measured by brain waves), facial expressions (including eye movements), gestures, and/or speech. In such embodiments, user instructions are received directly via at least some of the optical sensors 36 and other sensors 44, and are processed by the on-board processing system 22 to control the flight system 12.
In some embodiments, flight system 12 may alternatively be controlled via flight system controller 14.
In at least one embodiment, flight system 12 may be controlled without interacting with flight system controller 14; however, a display of flight system controller 14 may be used to display images and/or video relayed from flight system 12, which may assist user 18 in controlling flight system 12. Further, sensors 36, 44 associated with flight system controller 14 (e.g., one or more cameras and/or microphones (not shown)) may forward data to flight system 12, for example, when flight system 12 is too far from user 18. The sensor data forwarded from the flight system controller 14 to the flight system 12 is used in the same manner as the sensor data from the on-board sensors 36, 44 to control the flight system 12 using user expressions.
In this manner, flight system 12 can be fully controlled from start to finish (1) without the use of flight system controller 14 or (2) without physical interaction with flight system controller 14. Control of the flight system 12 is based on user instructions received at the various on-board sensors 36, 44. It should be noted that in the following discussion, the use of on-board sensors 36, 44 may also include the use of corresponding or similar sensors on flight system controller 14.
In general, the user 18 may use certain gestures and/or voice controls to control takeoff, landing, movement of the flight system 12 during flight, and other features, such as triggering of a photo and/or video capture. As discussed above, the flight system 12 may provide the following features without the use of the flight system controller 14 or without processing by the flight system controller 14:
-take-off and landing;
- owner identification;
-face recognition;
-speech recognition;
-facial expression and gesture recognition; and
- controlling the flight system, e.g., controlling movements of the flight system, based on owner recognition, facial expression and gesture recognition, and speech recognition.
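One way the recognized user expressions above could be mapped to flight commands is a simple dispatch table gated by owner identification. The expression names and command strings below are hypothetical illustrations; this disclosure does not fix any particular vocabulary.

```python
# Hypothetical dispatch from recognized user expressions (gestures,
# speech, facial expressions) to flight commands, gated by owner
# identification as described above. All names are assumed examples.

EXPRESSION_COMMANDS = {
    "palm_open": "hover",
    "palm_push": "move_away",
    "voice_take_off": "take_off",
    "voice_land": "land",
    "smile": "capture_photo",
}

def command_for(expression, owner_verified):
    """Only act on expressions from the verified owner."""
    if not owner_verified:
        return "ignore"
    return EXPRESSION_COMMANDS.get(expression, "ignore")

print(command_for("voice_take_off", owner_verified=True))  # take_off
```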
As described in detail above, the flight system 12 includes an optical system 26, and the optical system 26 includes one or more optical sensors 36, such as cameras. At least one on-board camera is configured for real-time video streaming and computer vision analysis. Optionally, the flight system 12 can have at least one depth sensor (or stereo-vision pair) for multi-pixel depth sensing.
Generally, to provide full control of the flight system 12, a variety of user/drone interactions or activities are provided from the beginning to the end of the flight period. User/drone interactions include, but are not limited to, takeoff and landing, owner recognition, gesture recognition, facial expression recognition, and voice control.
Referring to FIG. 6, in another aspect of the present disclosure, flight system 12 may include an obstacle detection and avoidance system 50. In one embodiment, the obstacle detection and avoidance system 50 includes a pair of ultra-wide angle lens cameras 52A, 52B. As will be described more fully below, the pair of cameras 52A, 52B are coaxially mounted at the center of the top of the body and the center of the bottom of the body (see below).
The method and/or system can provide several advantages over conventional systems. First, the images recorded by the cameras are processed on-board in real time or near real time. This allows the robot to navigate using images recorded by the cameras.
A pair of cameras 52A, 52B are typically mounted or statically secured to the housing of the body 20. A memory 54 and a vision processor 56 are connected to the pair of cameras 52A, 52B. The system is used to sample images of a monitored area for real-time or near real-time image processing, such as depth analysis. The system can additionally or alternatively generate 3D video, generate maps of monitored areas, or perform any other suitable function.
The housing serves to hold the pair of cameras 52A, 52B in a predetermined configuration. The system preferably includes a single housing that holds the pair of cameras 52A, 52B, but can alternatively include multiple housing pieces or any other suitable number of housings.
The pair of cameras 52A, 52B may be used to sample signals of the ambient environment near the flight system 12. The pair of cameras 52A, 52B are arranged such that the respective viewing cone of each camera overlaps the viewing cone of the other camera (see below).
Each camera 52A, 52B can be a CCD camera, a CMOS camera, or any other suitable type of camera. The camera can be sensitive in the visible spectrum, the IR spectrum or any other suitable spectrum. The camera can be hyperspectral, multispectral, or capture any suitable subset of the frequency bands. The camera can have a fixed focal length, an adjustable focal length, or any other suitable focal length. However, the camera can have any other suitable set of parameter values. The multiple cameras can be the same or different.
Each camera is preferably associated with a known position relative to a reference point (e.g., on the housing, on one of the plurality of cameras, on the host robotic instrument, etc.), but can be associated with an estimated, calculated, or unknown position. The pair of cameras 52A, 52B are preferably statically mounted to the housing (e.g., through holes in the housing), but can alternatively be actuatably mounted to the housing (e.g., by a joint). The cameras can be mounted to a surface, edge, or vertex of the housing, or to any other suitable housing feature. The cameras can be aligned with, centered along, or otherwise arranged relative to the housing feature. Each camera can be arranged such that its active surface is perpendicular to the housing radius or surface tangent, parallel to the surface of the housing, or otherwise arranged. Adjacent camera active surfaces can be parallel to each other, at a non-zero angle to each other, coplanar, angled with respect to a reference plane, or otherwise arranged. Adjacent cameras preferably have a baseline of 6.35 cm (e.g., inter-camera distance or axial distance, distance between lenses, etc.), but can be spaced farther apart or closer together.
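Given the 6.35 cm baseline noted above, a stereo pair can in principle recover pixel depth from disparity via the standard relation Z = f·B/d. The focal length and disparity values below are hypothetical; only the baseline matches the value stated above.

```python
# Depth from stereo disparity: Z = f * B / d, where f is the focal
# length in pixels, B the baseline between the two cameras, and d the
# disparity in pixels. The 0.0635 m baseline matches the value noted
# above; the focal length and disparity are assumed example values.

def depth_from_disparity(focal_px, baseline_m, disparity_px):
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

z = depth_from_disparity(focal_px=700.0, baseline_m=0.0635, disparity_px=35.0)
print(round(z, 2))  # 1.27 (metres)
```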
The cameras 52A, 52B may be connected to the same vision processing system and memory, but can be connected to different vision processing systems and/or memories. Preferably, the cameras are sampled at the same clock, but the cameras can be connected to different clocks (e.g., where the clocks can be synchronized or otherwise related). The cameras are preferably controlled by the same processing system, but can be controlled by different processing systems. The cameras are preferably powered by the same power source (e.g., rechargeable batteries, solar panel arrays, etc.; host robotic device power source, separate power source, etc.), but can be powered by different power sources or otherwise.
The obstacle detection and avoidance system 50 may also include a transmitter 58, the transmitter 58 being used to illuminate the physical area monitored by the cameras 52A, 52B. The obstacle detection and avoidance system 50 can include one emitter 58 for each of one or more of the cameras 52A, 52B, multiple emitters 58 for one or more of the cameras 52A, 52B, or any other suitable number of emitters 58 in any other suitable configuration. The one or more emitters 58 can emit modulated light, structured light (e.g., having a known pattern), collimated light, diffused light, or light having any other suitable property. The emitted light can include wavelengths in the visible range, the UV range, the IR range, or any other suitable range. The transmitter location (e.g., relative to a given camera) is preferably known, but can alternatively be estimated, calculated, or otherwise determined.
In a second variation, the obstacle detection and avoidance system 50 operates as a contactless active 3D scanner. In a first example of this variation, the non-contact obstacle detection and avoidance system is a time-of-flight sensor that includes a camera and a transmitter, wherein the camera records reflections (of signals transmitted by the transmitter) off obstacles in the monitored area and determines the distance between the obstacle detection and avoidance system 50 and the obstacle based on the reflected signals. The camera and emitter are preferably mounted within a predetermined distance (e.g., a few millimeters) of each other, but can be mounted in other ways. The emitted light can be diffuse light, structured light, modulated light, or light having any other suitable parameters. In a second example of this variation, the non-contact obstacle detection and avoidance system is a triangulation system that also includes a camera and a transmitter. The emitter is preferably mounted beyond a threshold distance from the camera (e.g., a few millimeters beyond the camera) and oriented at a non-parallel angle relative to the active surface of the camera (e.g., mounted to the apex of the housing), but can be mounted in other ways. The emitted light can be collimated light, modulated light, or light having any other suitable parameters. However, the obstacle detection and avoidance system 50 can define any other suitable non-contact active system, and the pair of cameras can form any other suitable optical ranging system.
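The time-of-flight ranging described in the first example above reduces to distance = c·t/2 for a round-trip time t. The 40 ns round-trip time below is an assumed example value, not a parameter of this disclosure.

```python
# Time-of-flight ranging as described above: the camera records the
# reflection of the emitter's signal, and the distance follows from
# the round-trip time, d = c * t / 2. The 40 ns round-trip time is a
# hypothetical example.

C = 299_792_458.0  # speed of light, m/s

def tof_distance(round_trip_s):
    return C * round_trip_s / 2.0

print(round(tof_distance(40e-9), 3))  # ~5.996 metres
```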
The memory 54 of the obstacle detection and avoidance system 50 is used to store camera measurements. The memory can additionally be used to store: settings; maps (e.g., calibration maps, pixel maps); camera positions or indices; transmitter positions or indices; or any other suitable collection of information. The obstacle detection and avoidance system 50 can include one or more memory devices. The memory is preferably non-volatile (e.g., flash memory, SSD, eMMC, etc.), but may alternatively be volatile (e.g., RAM). In one variation, the cameras 52A, 52B write to the same buffer, with each camera assigned a different portion of the buffer. In a second variation, the cameras 52A, 52B write to different buffers in the same memory or in different memories. However, the cameras 52A, 52B can write to any other suitable memory. The memory 54 is preferably accessible to all processing systems of the system (e.g., the vision processor and the application processor), but can instead be accessible to a subset of the processing systems (e.g., a single vision processor, etc.).
The vision processing system 56 of the obstacle detection and avoidance system 50 is used to determine the distance of the physical point from the system. The vision processing system 56 preferably determines the pixel depth of each pixel from the subset of pixels, but can additionally or alternatively determine the object depth or determine any other suitable parameter of the physical point or set thereof (e.g., object). Vision processing system 56 preferably processes the sensor data streams from cameras 52A, 52B.
The vision processing system 56 may process each sensor data stream at a predetermined frequency (e.g., 30FPS), but can process the sensor data streams at a variable frequency or at any other suitable frequency. The predetermined frequency can be received from the application processing system 60, retrieved from storage, automatically determined based on camera scores or classifications (e.g., front, side, back, etc.), determined based on available computing resources (e.g., available cores, remaining battery power, etc.), or otherwise determined. In one variation, the vision processing system 56 processes multiple sensor data streams at the same frequency. In a second variation, the vision processing system 56 processes the multiple sensor data streams at different frequencies, where the frequencies are determined based on the classification assigned to each sensor data stream (and/or source camera), where the classification is assigned based on the direction of the source camera relative to the travel vector of the host robotic instrument.
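The second variation above, in which per-stream processing frequencies follow from a classification of each source camera relative to the travel vector, could be sketched as a lookup. The class names and frame rates below are hypothetical examples; only the 30 FPS figure appears in the text above.

```python
# Sketch of assigning per-stream processing frequencies from camera
# classification (direction relative to the travel vector), as in the
# second variation above. Class names and rates are assumed examples.

STREAM_RATES_FPS = {"front": 30, "side": 15, "back": 5}

def processing_rate(camera_class):
    """Frames per second allotted to a stream, by camera classification."""
    return STREAM_RATES_FPS.get(camera_class, 15)  # default to the side rate

print([processing_rate(c) for c in ("front", "side", "back")])  # [30, 15, 5]
```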
The application processing system 60 of the obstacle detection and avoidance system 50 is used to determine time division multiplexing parameters for the sensor data streams. Application processing system 60 can additionally or alternatively perform object detection, classification, tracking (e.g., optical flow), or any other suitable processing using the sensor data streams. Application processing system 60 can additionally or alternatively generate control instructions based on a sensor data stream (e.g., based on the vision processor output). For example, a sensor data stream can be used to perform a navigation or visual odometry process (e.g., using SLAM, RRT, etc.), wherein the system and/or host robotic instrument is controlled based on the navigation output.
The application processing system 60 can additionally or alternatively receive control commands and operate the flight system 12 and/or the host robotic instrument based on the commands. Application processing system 60 can additionally or alternatively receive external sensor information and selectively operate the system and/or host robotic instrument based on the commands. The application processing system 60 can additionally or alternatively determine system kinematics (e.g., position, orientation, velocity, acceleration) based on sensor measurements (e.g., using sensor fusion). In one example, application processing system 60 can use measurements from an accelerometer and a gyroscope to determine the travel vector (e.g., direction of travel) of the system and/or host robotic instrument. The application processing system 60 can optionally automatically generate control instructions based on the system kinematics. For example, the application processing system 60 can determine the position of the system (in physical space) based on images from the cameras 52A, 52B, wherein the relative position (from the orientation sensors) and the actual position and velocity (determined from the images) can be fed to the flight control module. In this example, images from a downward-facing camera subset can be used to determine the system translation (e.g., using optical flow), wherein the system translation can be further fed into the flight control module. In one particular example, the flight control module can synthesize these signals to maintain the position of the flight system (e.g., hover the drone).
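One conventional way to synthesize the accelerometer-derived and optical-flow-derived signals mentioned above is a complementary filter; the sketch below is an assumed illustration of that general technique, not the patent's specified flight control module, and the gain, bias, and time step are hypothetical.

```python
# Hypothetical sketch of the fusion step described above: a complementary
# filter blends velocity propagated from the accelerometer (smooth but
# drifting) with velocity measured by optical flow from the downward
# camera (noisy but drift-free), one way a flight control module might
# hold a hover.

def fuse_velocity(v_prev, accel, dt, v_flow, alpha=0.98):
    """Blend accelerometer-propagated velocity with optical-flow velocity."""
    v_imu = v_prev + accel * dt        # propagate with the accelerometer
    return alpha * v_imu + (1.0 - alpha) * v_flow

v = 0.0
for _ in range(100):                   # stationary drone: flow reads ~0 m/s,
    v = fuse_velocity(v, 0.05, 0.01, 0.0)  # accelerometer has a 0.05 m/s^2 bias
print(v < 0.05)                        # True: drift stays bounded by the flow term
```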
The application processing system 60 can include one or more application processors. The application processor can be a CPU, GPU, microprocessor, or any other suitable processing system. The application processing system 60 can be implemented as part of the vision processing system 56, or can be separate from and distinct from the vision processing system 56. The application processing system 60 may be connected to the vision processing system 56 by one or more interface bridges. The interface bridge can be a high-throughput and/or high-bandwidth connection and can use the MIPI protocol (e.g., a 2-input to 1-output camera aggregation bridge, extending the number of cameras that can be connected to the vision processor), the LVDS protocol, the DisplayPort protocol, the HDMI protocol, or any other suitable protocol. Alternatively or additionally, the interface bridge can be a low-throughput and/or low-bandwidth connection and can use the SPI protocol, the UART protocol, the I2C protocol, the SDIO protocol, or any other suitable protocol.
The system can optionally include an image signal processing unit (ISP) 62, the image signal processing unit 62 being used to pre-process the camera signal (e.g., image) prior to passing the camera signal to the vision processing system and/or the application processing system. The image signal processing unit 62 can process signals from all cameras, signals from a subset of cameras, or signals from any other suitable source. The image signal processing unit 62 can automatically white balance, correct field shading, correct lens distortion (e.g., fisheye correction (dewarp)), crop, select a subset of pixels, apply a Bayer transform, demosaic, apply noise reduction, sharpen an image, or otherwise process a camera signal. For example, the image signal processing unit 62 can select pixels associated with an overlapping physical region between two cameras from the images of the respective streams (e.g., crop each image to include only pixels associated with the overlapping region shared between the cameras of a stereo camera pair). The image signal processing unit 62 can be a system on a chip with a multi-core processor architecture, an ASIC, a processor with an ARM architecture, part of the vision processing system, part of the application processing system, or any other suitable processing system.
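The stereo-overlap cropping example above can be sketched as follows. This is illustrative only; the function name `crop_to_overlap`, the row-list image representation, and the use of a single maximum-disparity margin are assumptions, and a real ISP would operate on rectified images in hardware.

```python
# Illustrative sketch only: cropping a stereo pair to its shared field of view.
# The row-list image format and single disparity margin are assumptions.

def crop_to_overlap(image, disparity_max, side):
    """Keep only the pixel columns that both cameras of a horizontal
    stereo pair can see; `image` is a list of rows (lists of pixels)."""
    if side == "left":
        # The left camera's leftmost columns have no match in the right image.
        return [row[disparity_max:] for row in image]
    if side == "right":
        # Symmetrically, drop the right camera's rightmost columns.
        return [row[:-disparity_max] for row in image]
    raise ValueError("side must be 'left' or 'right'")
```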
The system can optionally include a sensor 64, the sensor 64 being used to sample signals indicative of the operation of the system. The sensor output can be used to determine system kinematics, process images (e.g., for image stabilization), or otherwise be used. The sensor 64 can be a peripheral of the vision processing system 56, the application processing system 60, or any other suitable processing system. The sensor 64 is preferably statically mounted to the housing, but can alternatively be mounted to the host robotic machine or to any other suitable system. The sensor 64 can include: orientation sensors (e.g., Inertial Measurement Units (IMUs), gyroscopes, accelerometers, altimeters, magnetometers), acoustic sensors (e.g., microphones, transducers), optical sensors (e.g., cameras, ambient light sensors), touch sensors (e.g., force sensors, capacitive touch sensors, resistive touch sensors), position sensors (e.g., GPS systems, beacon systems, trilateration systems), or any other suitable set of sensors.
The system can optionally include inputs (e.g., keyboard, touch screen, microphone, etc.), outputs (e.g., speaker, light, screen, vibration mechanism, etc.), a communication system (e.g., WiFi module, BLE, cellular module, etc.), a power storage device (e.g., battery), or any other suitable component.
The system is preferably used with a host robotic machine that is used to navigate within a physical space. The host robotic machine can additionally or alternatively receive and operate according to remote control commands. The host robotic machine can additionally generate remote content or perform any other suitable function. The host robotic machine can include one or more of: a communication module, a motion mechanism, a sensor, a content generation mechanism, a processing system, a reset mechanism, or any other suitable set of components. The host robotic machine can be a drone, a vehicle, a robot, a security camera, or any other suitable remotely controllable system. The motion mechanism 66 can include a transmission system, rotors, jets, pedals, a rotary connection, or any other suitable motion mechanism. The application processing system is preferably the processing system of the host robotic machine, but can alternatively be connected to or otherwise associated with the processing system of the host robotic machine. In one particular example, the host robotic machine includes a flight system (e.g., a drone) having a WiFi module, a camera, and an application processing system. The system can be mounted to the top of the host robotic machine (e.g., as determined based on the gravity vector during typical operation), the bottom of the host robotic machine, the front of the host robotic machine, centered within the host robotic machine, or otherwise mounted to the host robotic machine. The system can be integrally formed with, removably coupled to, or otherwise attached to the host robotic machine. One or more systems can be used with one or more host robotic machines.
Hand-held remote control device
Referring to fig. 7-10, an exemplary embodiment of a handheld remote control device 8 is shown. In the illustrated embodiment, the handheld remote control device 8 includes a support structure 72, a quick capture and release coupling mechanism 74, a controller coupling mechanism 76, and a battery 78. As shown in fig. 8, in the illustrated embodiment, the support structure 72 may comprise an elongated tube 72A. The battery 78 may be replaceable and disposed within one end of the elongate tube 72A.
A controller coupling mechanism 76 couples the flight system controller 14 to the support structure 72. In one aspect of the present disclosure, the flight system controller 14 may be embedded in the support structure 72 or a portion of the flight system controller 14 may be integrally formed with the support structure 72. In another aspect of the present disclosure, the controller coupling mechanism 76 allows the flight system controller 14 to be removably coupled to the support structure 72. For example, the controller coupling mechanism 76 may include a magnet (not shown) and may utilize magnetic force to hold the flight system controller 14 in place. Alternatively, the controller coupling mechanism 76 may include one or more flexible attachments (not shown) for frictionally holding the flight system controller 14 in a desired position. The controller coupling mechanism 76 may alternatively include a clamping mechanism or a plurality of fasteners or screws for releasably coupling the flight system controller 14 to the support structure 72.
As described above, the flight system controller 14 may include a user device (e.g., a smart phone, tablet, laptop, etc.), a networked server system, or any other suitable remote computing system. In general, the flight system controller 14 may include a touch screen display and a microphone.
The elongated tube 72A may form a handle 72B. The function button 72C may be located within the handle 72B. As discussed in further detail below, the user may use the function button 72C to control the quick capture and release coupling mechanism 74 during takeoff and landing operations. Alternatively or additionally, the quick capture and release coupling mechanism 74 may be controlled by the user through the use of the flight system controller 14.
In one embodiment, the quick capture and release coupling mechanism 74 includes a magnetic clutch device 80.
Referring to fig. 7 and 8, in the first embodiment, the magnetic clutch device 80 includes an electromagnet 82 (first magnetic member) and a coupler member 84 (second magnetic member). The current may be controllably provided to the electromagnet 82 via the function button 72C and/or the flight system controller 14. With particular reference to FIG. 8, the coupler member 84 is secured to the bottom of the flight system 12. The coupler member 84 may be a permanent magnet or made of magnetizable material (e.g., an electromagnet) or may be made of magnetically soft material (e.g., pure iron or silicon steel). When current is applied to the electromagnet 82 and the coupler member 84 is within a sufficient distance from the electromagnet 82, the flight system 12 is magnetically coupled to the hand-held remote control device 8. If the flight system 12 is magnetically coupled to the hand-held remote control device 8 and the current to the electromagnet 82 is terminated, the flight system 12 is released from the hand-held remote control device 8. Accordingly, takeoff and landing operations may be accomplished by manually or automatically controlling the current applied to the electromagnets 82 and the control instructions sent to the flight system 12.
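The takeoff and landing sequencing described above can be sketched as a simple state model. This is illustrative only; the class and function names (`MagneticClutch`, `takeoff`) and the command string are assumptions, not part of the disclosure.

```python
# Illustrative sketch only: sequencing the magnetic clutch for takeoff/landing.

class MagneticClutch:
    """Models the electromagnet 82: energized means coupler member 84 is held."""
    def __init__(self):
        self.energized = False

    def capture(self):
        # Landing: apply current so the electromagnet attracts the coupler.
        self.energized = True

    def release(self):
        # Takeoff: terminate the current to free the flight system.
        self.energized = False

    @property
    def coupled(self):
        return self.energized

def takeoff(clutch, send_command):
    """Release the flight system, then issue the flight control instruction."""
    clutch.release()
    send_command("throttle_up")  # hypothetical command name
```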
It should be understood that the term "electromagnet" as used in this disclosure refers to any electromagnetic component that can generate a magnetic field when energized and/or change the direction of its magnetic field when the direction of the current through it is reversed.
After takeoff, the flight of the flight system 12 and/or its onboard camera can be controlled by any one or combination of the function button 72C, the microphone, or inertial-sensor-based control on the hand-held remote control device 8 or the flight system controller 14 (see above).
As discussed above, the flight system controller 14 is releasably coupled to the support structure 72. Thus, the flight system controller 14 may be used when separate from the support structure 72. The hand-held remote control device 8 may be designed to include one or more quick control buttons (not shown) to perform quick takeoff and landing operations, or other operations or programmable operations.
The battery 78 provides power to the quick capture and release coupling mechanism 74. It may additionally power the flight system 12 and/or the flight system controller 14, or charge their batteries, via a charging cable or via a wireless charging module (not shown).
Referring to fig. 9-12, a second embodiment of a quick capture and release coupling mechanism 74 is shown. In a second embodiment, the quick capture and release coupling mechanism 74 uses a permanent magnet or magnetizable material 86. The magnetic force between the permanent magnet 86 and the coupler member 84 is controlled by controlling the distance between the permanent magnet 86 and the coupler member 84.
With particular reference to fig. 10, the permanent magnet 86 is secured to a movable tray 88. The movable tray 88 is slidably coupled to one or more guide rails 90. The quick capture and release coupling mechanism 74 also includes a bracket 92 and a motor 98, the bracket 92 forming a housing 92A of the quick capture and release coupling mechanism 74. The motor 98 and the one or more guide rails 90 are secured to a base 92B of the bracket 92. The motor 98 drives a screw 94 to rotate. The screw 94 is threaded through a threaded hole 96 in the movable tray 88. The motor 98 may be any suitable type of motor, including, but not limited to, a gear motor or a servo motor. The screw 94 may be controllably rotated by the motor 98 to drive the permanent magnet 86 linearly up and down, thereby varying the distance between the permanent magnet 86 and the coupler member 84 secured to the flight system 12. The quick capture and release coupling mechanism 74 may utilize other suitable mechanisms to vary the distance between the permanent magnet 86 and the coupler member 84, including but not limited to a crank and a linkage.
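The lead-screw arrangement above converts motor rotation into linear tray travel: each full revolution of the screw 94 advances the threaded hole 96 by one screw pitch. A minimal sketch follows (illustrative only; the function names and the example pitch values are assumptions):

```python
# Illustrative sketch only: lead-screw kinematics of the movable tray 88.

def tray_travel_mm(motor_revolutions, screw_pitch_mm):
    """Each full turn of screw 94 advances threaded hole 96 by one pitch."""
    return motor_revolutions * screw_pitch_mm

def magnet_gap_mm(initial_gap_mm, motor_revolutions, screw_pitch_mm, lowering=True):
    """Gap between permanent magnet 86 and coupler member 84 after driving
    the tray down (weakening the force) or up (strengthening it)."""
    delta = tray_travel_mm(motor_revolutions, screw_pitch_mm)
    if lowering:
        return initial_gap_mm + delta
    return max(0.0, initial_gap_mm - delta)
```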
Fig. 9 and 10 illustrate the permanent magnet or magnetizable material 86 of the quick capture and release coupling mechanism 74 in a raised position, in which the distance between the permanent magnet or magnetizable material 86 and the coupler member 84 is at a minimum, creating the maximum magnetic attraction force so that the permanent magnet or magnetizable material 86 and the coupler member 84 are effectively magnetically attracted.
Fig. 11 and 12 illustrate the permanent magnet or magnetizable material 86 of the quick capture and release coupling mechanism 74 in a lowered position, in which the distance between the permanent magnet or magnetizable material 86 and the coupler member 84 is at a maximum; the magnetic force is thereby weakened so that the permanent magnet or magnetizable material 86 and the coupler member 84 can no longer effectively attract each other, releasing the coupler member 84 (i.e., releasing the flight system 12).
FIG. 13 is another schematic view of the hand-held remote control device of FIG. 1 in conjunction with an exemplary flight system according to a third embodiment of the present disclosure. Fig. 14 is an exploded state view of fig. 13. The third embodiment shown in fig. 13 and 14 provides a more compact, portable structure.
In particular, the flight system 12, flight system controller 14, and support structure 72 shown in fig. 13 and 14 may have the structure and functionality described above in the summary and description of the hand-held remote control device with respect to the system 10 and flight system 12. Specifically, flight system 12 includes four rotors and optical system 26, flight system controller 14 is integrally formed with support structure 72, and flight system 12 is capable of being inserted into a recess 721 at the forward end of support structure 72 such that flight system 12 is substantially contained within recess 721 without protruding beyond the profile of support structure 72 to achieve a more compact, portable structure. The flight system controller 14 includes a display 141, function knobs 142, and buttons 143. The display 141 may be used as a touch screen, and may be used to display a video taken by the optical system 26 in real time, display a flight trajectory, preview a photograph or video stream, and the user may also input various control instructions, such as selecting a flight trajectory, etc., through the display 141. Function knobs 142 and buttons 143 may be used to manually adjust a flight status (e.g., altitude, orientation, etc.) of flight system 12.
In the third embodiment shown in fig. 13 and 14, similar to the first and second embodiments described above, the quick capture and release coupling mechanism 74 uses an electromagnet 82 or a permanent magnet or magnetizable material (the electromagnet 82 is shown as an example in fig. 14) to magnetically attract a coupler component (not shown) of the flight system 12 to secure the flight system 12.
In addition to securing the flight system 12 via magnetic attraction, in the third embodiment shown in fig. 13 and 14 a catch 722 is provided for cooperating with a correspondingly provided latch (not shown) on the flight system 12 to more securely fasten the flight system 12 to the support structure 72. In this manner, after the flight system 12 is initially positioned and secured to the support structure 72 via the magnetic attraction between the electromagnet 82 and the coupler member 84 of the flight system 12, the flight system 12 may be further secured via the catch 722 (e.g., a user may snap the latch of the flight system 12 into the catch 722 by pressing the flight system 12 down). A smaller electromagnet 82 can thereby be used. Furthermore, after the latch of the flight system 12 is snapped into the catch 722, the electromagnet 82 need no longer be powered, thereby saving power.
It should be understood that, in the third embodiment, a sensor or the like may be provided to detect whether the latch of the flight system 12 is caught in the catch 722, and the power supply to the electromagnet 82 may be turned off automatically once the latch of the flight system 12 is detected to be caught in the catch 722 (i.e., once the catch 722 is in the locked state holding the latch). This achieves further power savings.
It should also be appreciated that the catch 722 may be driven by a suitable motor to release the latch of the flight system 12, similar to the magnetic attraction and release achieved by switching the power to the electromagnet 82 on and off. This simplifies the operation of releasing the flight system 12. For example, one-touch release of the flight system 12 can be achieved by pressing a release button (e.g., button 143) that simultaneously disconnects power to the electromagnet 82 and controls the catch 722 to release the latch of the flight system 12. As a possible embodiment, the direction in which the latch of the flight system 12 snaps into the catch 722 is the same as the direction of magnetic attraction between the electromagnet 82 and the coupler member 84, so as to simplify the capture and release operations of the flight system 12.
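The latch-aware power logic and the one-touch release described above can be sketched as follows. This is illustrative only; the function names and the dictionary-based state representation are assumptions, not part of the disclosure.

```python
# Illustrative sketch only: latch-aware electromagnet power and one-touch release.

def electromagnet_power(latch_locked, capturing):
    """Keep the electromagnet energized only while capturing and before the
    latch has locked into catch 722; once locked, the mechanical catch
    holds the flight system and the current can be cut to save power."""
    return capturing and not latch_locked

def one_touch_release(state):
    """Release button (e.g., button 143): cut the electromagnet current and
    open the catch in a single action. `state` holds two boolean flags."""
    state["electromagnet_on"] = False
    state["catch_locked"] = False
    return state
```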
Although the electromagnet 82 and the catch 722 are shown as separate components in fig. 13 and 14, it should be understood that the electromagnet 82 and the catch 722 may also be provided as a unitary component, e.g., at least a portion of the catch 722 may be provided as an electromagnet.
Additionally, in the third embodiment shown in fig. 13 and 14, the support structure 72 may be provided with one or more cameras (not shown) so that the support structure 72 itself may also be used to take pictures or video, for example, synchronously with the optical system 26 of the flight system 12, and the pictures or video taken by the cameras of the support structure 72 and the pictures or video taken by the optical system 26 of the flight system 12 may be displayed on the display 141 simultaneously. It should be understood that the camera of the support structure 72 may be fixedly arranged, may also be translatably and/or rotatably arranged, and that an actuating device (e.g. a motor) for driving the camera of the support structure 72 in translation and/or rotation may be provided. Such actuating devices are well known in the art and therefore will not be described in detail herein.
FIG. 15 is another schematic view of the hand-held remote control device of FIG. 1 in conjunction with an exemplary flight system according to a fourth embodiment of the present disclosure. Fig. 16 is an exploded state view of fig. 15. Fig. 17 and 18 are schematic views of the power module of the flight system removed from fig. 15. The fourth embodiment is similar in construction to the third embodiment, except that the flight system 12 of the fourth embodiment includes a flight system body 121 and a removable power module 122.
In particular, the flight system 12, flight system controller 14, and support structure 72 shown in figs. 15-18 have the structure and functionality described above in the summary and in the description of the hand-held remote control device with respect to the system 10 and the flight system 12. The detachable power module 122 of the flight system 12 includes two rotors and a motor for driving the rotors, and is detachably mounted to the flight system body 121. Therefore, when the flight system is not needed for flight, the power module 122 can be detached from the flight system body 121 and stored separately, reducing the space occupied by the flight system and improving portability. It should be understood that the above-described components of the flight system 12, such as the sensors, optical system, and power supply, may be provided in the flight system body 121, and that the power module 122 may also be provided with such components, for example sensors.
The flight system controller 14 and the support structure 72 are integrally formed, and the flight system body 121 of the flight system 12 can be inserted into a recess 721 at the front end of the support structure 72, so that the flight system body 121 of the flight system 12 is substantially accommodated in this recess 721 without protruding outside the contour of the support structure 72, to achieve a more compact, portable structure. The flight system controller 14 includes a plurality of control buttons.
In the fourth embodiment, similar to the first, second, and third embodiments described above, the quick capture and release coupling mechanism 74 uses an electromagnet 82 or a permanent magnet or magnetizable material 86 (the electromagnet 82 is shown as an example in fig. 16) to magnetically attract a coupler component (not shown) on the flight system body 121 of the flight system 12 to secure the flight system 12.
It should be understood that in the fourth embodiment shown in fig. 15-18, the support structure 72 itself may be provided with one or more cameras 722 (only one shown in fig. 17) so that the support structure 72 may also be used to take pictures or video, for example, in synchronization with the optical system 26 of the flight system 12.
The structure of the fourth embodiment is particularly advantageous when the flight system 12 is provided with large rotors. The reason is that the power module 122, which includes the rotors, can be detached from the flight system body 121 and stored separately, while the flight system body 121 and the support structure 72 form a compact whole, so the portability of the flight system 12 and the support structure 72 as a whole is significantly improved. In addition, since the power module 122 can be easily snapped onto and detached from the flight system body 121 (such a snap structure is well known in the art and thus will not be described in detail), the takeoff preparation time and the storage time can be significantly reduced.
It should be appreciated that, when the quick capture and release coupling mechanism 74 uses the electromagnet 82, controlling the current through the electromagnet 82 provides not only a first operating state, in which the coupler member 84 is magnetically attracted, and a second operating state, in which the coupler member 84 is released, but also a third operating state: by reversing the direction of the current, the polarity of the electromagnet 82 is made the same as that of the coupler member 84, so that a repulsive force is generated between the electromagnet 82 and the coupler member 84 (i.e., between the support structure 72 and the flight system 12), which can assist takeoff of the flight system 12.
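The three operating states described above map naturally onto the sign of the coil current. A minimal sketch follows (illustrative only; the enum and function names and the unit current magnitude are assumptions):

```python
# Illustrative sketch only: the three operating states as signed coil current.
from enum import Enum

class ClutchState(Enum):
    ATTRACT = 1  # first operating state: current in the attracting direction
    RELEASE = 2  # second operating state: no current
    REPEL = 3    # third operating state: current reversed, like poles face

def coil_current(state, magnitude=1.0):
    """Signed coil current for each state; the sign encodes direction."""
    if state is ClutchState.ATTRACT:
        return +magnitude
    if state is ClutchState.RELEASE:
        return 0.0
    return -magnitude  # REPEL: reversed current gives a repulsive takeoff assist
```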
Although omitted for simplicity, the preferred embodiments include each combination and permutation of the various system components and the various method processes, which can be performed in any suitable order, sequentially or simultaneously.
As those skilled in the art will recognize from the foregoing detailed description and from the accompanying drawings and claims, modifications and variations can be made to the preferred embodiments of the present disclosure without departing from the scope of the disclosure as defined in the following claims.

Claims (10)

1. A hand-held remote control device for use with a flight system, the hand-held remote control device comprising:
a support structure;
a quick capture and release coupling mechanism disposed on the support structure and having a first magnetic component; and
a controller for controlling the first magnetic component such that the first magnetic component can have at least a first operating state and a second operating state, the first magnetic component producing magnetic attraction with the second magnetic component of the flight system in the first operating state, the first magnetic component not producing magnetic attraction with the second magnetic component of the flight system in the second operating state.
2. The handheld remote control device of claim 1, wherein the first magnetic component is an electromagnet, and in the first operating state the controller switches on current through the first magnetic component to cause the electromagnet to magnetically attract with a second magnetic component of the flight system, and in the second operating state the controller switches off current through the first magnetic component such that the electromagnet does not magnetically attract with the second magnetic component of the flight system.
3. The hand-held remote control device of claim 2, wherein the controller is capable of controlling a direction of the current through the first magnetic component to cause the first magnetic component to have a third operating state different from the first and second operating states, wherein in the third operating state the magnetic polarity of the first magnetic component is the same as the magnetic polarity of the second magnetic component such that the first and second magnetic components repel each other.
4. The handheld remote control device of claim 1, wherein the first magnetic component is a permanent magnet, the handheld remote control device further comprising a first magnetic component drive mechanism disposed on the support structure, the first magnetic component drive mechanism capable of moving the first magnetic component between a first position in which the first magnetic component and the second magnetic component are capable of producing a first magnetic attraction force and a second position in which no magnetic attraction force or a second magnetic attraction force less than the first magnetic attraction force is produced between the first magnetic component and the second magnetic component.
5. The hand-held remote control device of claim 4, wherein the first magnetic component drive mechanism comprises a motor, a guide rail, a movable tray coupled to the guide rail, and a screw coupled to the motor, the first magnetic component being secured to the movable tray, the screw being threaded through a threaded hole of the movable tray, the motor being capable of driving the screw to rotate such that the movable tray is capable of moving up and down along the guide rail to move the first magnetic component between the first and second positions.
6. The handheld remote control device of claim 1, wherein the support structure comprises an elongated tube within which a battery is housed.
7. The hand-held remote control device of claim 1, wherein the hand-held remote control device comprises a display screen disposed on the support structure and connected to the controller, the display screen capable of receiving and displaying image signals from the flight system or the hand-held remote control device.
8. The handheld remote control device of claim 1, wherein the handheld remote control device comprises one or more cameras.
9. The hand-held remote control device according to claim 8, wherein the hand-held remote control device comprises a camera drive mechanism for driving the camera in translation and/or rotation.
10. A flight system kit, the flight system kit comprising:
a hand-held remote control device according to any one of claims 1 to 9; and
a flying system comprising a second magnetic component capable of producing a magnetic attraction with the first magnetic component of the handheld remote control device.
CN201910897267.8A 2018-09-28 2019-09-23 Hand-held remote control device and flight system set Active CN110615095B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/CN2019/107716 WO2020063631A1 (en) 2018-09-28 2019-09-25 Handheld remote control device and flight system kit

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201862738251P 2018-09-28 2018-09-28
US62/738,251 2018-09-28

Publications (2)

Publication Number Publication Date
CN110615095A true CN110615095A (en) 2019-12-27
CN110615095B CN110615095B (en) 2021-04-13

Family

ID=68923831

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910897267.8A Active CN110615095B (en) 2018-09-28 2019-09-23 Hand-held remote control device and flight system set

Country Status (2)

Country Link
CN (1) CN110615095B (en)
WO (1) WO2020063631A1 (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111268126A (en) * 2020-01-31 2020-06-12 武汉大学 Wireless charging relay station, charging flight control system and method for power line inspection unmanned aerial vehicle
CN113148216A (en) * 2021-06-02 2021-07-23 南京联汇智能科技有限公司 Ejection type unmanned aerial vehicle
CN113859569A (en) * 2021-10-25 2021-12-31 成都飞机工业(集团)有限责任公司 Portable swarm unmanned aerial vehicle system and use method thereof

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114063496A (en) * 2021-11-02 2022-02-18 广州昂宝电子有限公司 Unmanned aerial vehicle control method and system and remote controller for remotely controlling unmanned aerial vehicle

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104722080A (en) * 2015-03-31 2015-06-24 尚平 Aircraft
CN105059558A (en) * 2015-07-16 2015-11-18 珠海云洲智能科技有限公司 Take-off and landing system for unmanned ship-borne unmanned aerial vehicle
CN105836148A (en) * 2016-05-19 2016-08-10 重庆大学 Wearable rotor craft
CN106218909A (en) * 2016-09-23 2016-12-14 珠海格力电器股份有限公司 Aircraft and system of taking photo by plane
US20180084173A1 (en) * 2016-09-16 2018-03-22 Gopro, Inc. Vibration Damping Gimbal Sleeve for an Aerial Vehicle
WO2018119578A1 (en) * 2016-12-26 2018-07-05 SZ DJI Technology Co., Ltd. Transformable apparatus
CN208021754U (en) * 2018-03-21 2018-10-30 北京中科遥数信息技术有限公司 A kind of unmanned plane delivery device

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN2476318Y (en) * 2001-03-27 2002-02-13 美昌玩具制品厂有限公司 Toy car remote-controller and controllable car
US10054939B1 (en) * 2012-09-22 2018-08-21 Paul G. Applewhite Unmanned aerial vehicle systems and methods of use
CN205498797U (en) * 2016-03-21 2016-08-24 普宙飞行器科技(深圳)有限公司 Air park and unmanned aerial vehicle landing system
CN205582253U (en) * 2016-05-10 2016-09-14 南京乐乐飞电子科技有限公司 Multi -functional unmanned aerial vehicle remote controller
CN205801541U (en) * 2016-05-31 2016-12-14 南昌理工学院 SUAV electromagnetic ejection system
CN206437233U (en) * 2016-09-08 2017-08-25 厦门九星天翔航空科技有限公司 A kind of vehicle-mounted unmanned aerial vehicle launching apparatus
CN106516149A (en) * 2017-01-03 2017-03-22 上海量明科技发展有限公司 Electromagnetic aircraft launch system and method
US10293957B2 (en) * 2017-01-30 2019-05-21 Hanhui Zhang Rotary wing unmanned aerial vehicle and pneumatic launcher
CN206885335U (en) * 2017-05-09 2018-01-16 昊翔电能运动科技(昆山)有限公司 A kind of unmanned plane undercarriage and unmanned plane
CN207389580U (en) * 2017-07-28 2018-05-22 中交遥感载荷(北京)科技有限公司 A kind of unmanned plane landing following device


Also Published As

Publication number Publication date
WO2020063631A1 (en) 2020-04-02
CN110615095B (en) 2021-04-13

Similar Documents

Publication Publication Date Title
US11649052B2 (en) System and method for providing autonomous photography and videography
CN110692027B (en) System and method for providing easy-to-use release and automatic positioning of drone applications
US11423792B2 (en) System and method for obstacle avoidance in aerial systems
CN110687902B (en) System and method for controller-free user drone interaction
CN110615095B (en) Hand-held remote control device and flight system set
CN110733624B (en) Unmanned flight system and control system for unmanned flight system
CN111596649B (en) Single hand remote control device for an air system
US11530038B2 (en) Detachable protection structure for unmanned aerial systems
US11530025B2 (en) Foldable rotor blade assembly and aerial vehicle with a foldable rotor blade assembly
US11008095B2 (en) Foldable rotor blade assembly and aerial vehicle with a foldable rotor blade assembly
US20210362877A1 (en) Two-axis gimbal system for supporting a camera
WO2020143029A1 (en) Two-axis gimbal system for supporting a camera

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CP02 Change in the address of a patent holder

Address after: 311121 Room 401, 4th floor, building 2, Wudi center, Cangqian street, Yuhang District, Hangzhou City, Zhejiang Province

Patentee after: HANGZHOU ZERO ZERO TECHNOLOGY Co.,Ltd.

Address before: 310000 building 13, 1399 liangmu Road, Cangqian street, Yuhang District, Hangzhou City, Zhejiang Province

Patentee before: HANGZHOU ZERO ZERO TECHNOLOGY Co.,Ltd.
