WO2018112848A1 - Flight control method and apparatus (Procédé de commande de vol et appareil) - Google Patents


Publication number
WO2018112848A1
Authority
WO
WIPO (PCT)
Prior art keywords
target
horizontal plane
drone
flying
angle
Prior art date
Application number
PCT/CN2016/111564
Other languages
English (en)
Chinese (zh)
Inventor
郭灼
苏冠华
Original Assignee
深圳市大疆创新科技有限公司 (SZ DJI Technology Co., Ltd.)
Priority date
Filing date
Publication date
Application filed by 深圳市大疆创新科技有限公司 (SZ DJI Technology Co., Ltd.)
Priority to CN202110169187.8A priority Critical patent/CN112987782A/zh
Priority to CN201680076224.8A priority patent/CN108450032B/zh
Priority to PCT/CN2016/111564 priority patent/WO2018112848A1/fr
Publication of WO2018112848A1 publication Critical patent/WO2018112848A1/fr

Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/10Simultaneous control of position or course in three dimensions
    • G05D1/101Simultaneous control of position or course in three dimensions specially adapted for aircraft
    • G05D1/106Change initiated in response to external conditions, e.g. avoidance of elevated terrain or of no-fly zones

Definitions

  • Embodiments of the present invention relate to the field of UAV technologies, and in particular, to a flight control method and apparatus.
  • An existing drone captures a picture through a camera device disposed on it and displays the picture to the user in real time through a display interface. If the user is interested in an object in the picture, the drone can be controlled to enter the pointing flight mode: the user specifies a position on the screen, and the aircraft flies toward that position. However, for safety reasons, the aircraft is considered unable to enter the pointing flight mode when the camera is facing the ground.
  • Embodiments of the present invention provide a flight control method and apparatus that prevent the unmanned aerial vehicle from easily touching an obstacle, ensure the flight safety of the drone, and expand the target position range of the drone's pointing flight.
  • an embodiment of the present invention provides a flight control method, including:
  • an embodiment of the present invention provides a flight control apparatus, including:
  • a target determining module configured to determine a first target according to a specified position in an image;
  • a flight mode determining module configured to determine, when the angle between the horizontal plane and the line connecting the first target and the current position of the drone is greater than a first preset angle, the flight mode of the drone according to the size of the angle;
  • a control module configured to control the drone to fly to the second target according to the determined flight mode, wherein a distance between the second target and the first target is not less than a preset distance.
  • an embodiment of the present invention provides a flight control apparatus, including: a memory and a processor;
  • the memory for storing code for executing a flight control method
  • the processor configured to invoke the code stored in the memory to perform: determining a first target according to a specified position in an image; when the angle between the horizontal plane and the line connecting the first target and the current position of the drone is greater than the first preset angle, determining the flight mode of the drone according to the size of the angle; and controlling the drone to fly to the second target according to the determined flight mode, where the distance between the second target and the first target is not less than a preset distance.
  • an embodiment of the present invention provides a flight control system for a drone, including: a drone; and a flight control device according to the second or third aspect of the present invention.
  • According to the flight control method and apparatus, and the flight control system of the drone, provided by the embodiments of the present invention, when the angle between the horizontal plane and the line connecting the current position of the drone and the first target determined from the specified position in the image is greater than the first preset angle, the drone is controlled to fly toward a second target whose distance from the first target is not less than a preset distance. The drone therefore does not easily touch an obstacle, which ensures the safety of the drone's flight and also expands the target range of the drone's pointing flight.
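As a rough illustration of the angle check and mode selection just summarized, the sketch below computes the angle between the horizontal plane and the drone-to-target line and picks a flight mode from it. The threshold values and mode names are illustrative assumptions, not values given in this publication:

```python
import math

def elevation_angle_deg(drone_pos, target_pos):
    """Angle, in degrees, between the horizontal plane and the line
    connecting the drone's current position to the target.
    Positions are (x, y, z) tuples with z as altitude."""
    dx = target_pos[0] - drone_pos[0]
    dy = target_pos[1] - drone_pos[1]
    dz = target_pos[2] - drone_pos[2]
    horizontal = math.hypot(dx, dy)
    return abs(math.degrees(math.atan2(dz, horizontal)))

def select_flight_mode(angle_deg, first_preset=45.0, second_preset=70.0):
    """Pick a flight mode from the size of the angle, per the two preset
    thresholds (hypothetical threshold values and mode names)."""
    if angle_deg <= first_preset:
        return "pointing"              # target inside the obstacle-avoidance range
    elif angle_deg <= second_preset:
        return "oblique-then-level"    # fly at an angle, then level off
    else:
        return "descend-above-target"  # fly above the target, then descend
```

For a target directly below the drone the angle is 90 degrees, which selects the most conservative mode in this sketch.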
  • FIG. 1 is a schematic architectural diagram of an unmanned flight system 100 in accordance with an embodiment of the present invention
  • FIG. 2 is a flowchart of a flight control method according to an embodiment of the present invention.
  • FIG. 3 is a schematic diagram of determining a first target by using multiple imaging devices according to an embodiment of the present invention
  • FIG. 4 is a schematic diagram of an angle between a first target and a current position of a drone and a horizontal plane according to an embodiment of the present invention
  • FIG. 5 is a schematic diagram, according to an embodiment of the present invention, in which the angle between the horizontal plane and the line connecting the first target and the current position of the drone is greater than a first preset angle;
  • FIG. 6 is a schematic diagram of a flight mode of a drone according to an embodiment of the present invention.
  • FIG. 7 is a schematic diagram of a flight mode of a drone according to an embodiment of the present invention.
  • FIG. 8 is a schematic diagram, according to an embodiment of the present invention, in which the angle between the horizontal plane and the line connecting the first target and the current position of the drone is greater than the first preset angle and less than a second preset angle;
  • FIG. 9 is a schematic diagram of a flight mode of a drone according to an embodiment of the present invention.
  • FIG. 10 is a schematic diagram of a flight mode of a drone according to an embodiment of the present invention.
  • FIG. 11 is a schematic diagram, according to an embodiment of the present invention, in which the angle between the horizontal plane and the line connecting the first target and the current position of the drone is greater than the second preset angle;
  • FIG. 12 is a schematic diagram of a flight mode of a drone according to an embodiment of the present invention.
  • FIG. 13 is a schematic diagram of a ground control device for controlling flight of a drone according to an embodiment of the present invention.
  • FIG. 14 is a schematic diagram of displaying a preset icon according to an embodiment of the present invention.
  • FIG. 15 is a schematic diagram of displaying a preset icon according to an embodiment of the present invention.
  • FIG. 16 is a schematic structural diagram of a flight control device according to Embodiment 1 of the present invention.
  • FIG. 17 is a schematic structural diagram of a flight control device according to Embodiment 2 of the present invention.
  • FIG. 18 is a schematic structural diagram of a flight control system of a drone according to an embodiment of the present invention.
  • Embodiments of the present invention provide flight control methods and apparatus, as well as flight control systems for drones.
  • The following description of the invention uses a drone as an example of an aircraft.
  • the drone can be a small or large drone.
  • The drone may be a rotorcraft, for example, a multi-rotor drone propelled through the air by a plurality of propulsion devices; embodiments of the invention are not limited thereto, and the drone can also be another type of drone.
  • FIG. 1 is a schematic architectural diagram of an unmanned flight system 100 in accordance with an embodiment of the present invention. This embodiment is described by taking a rotor drone as an example.
  • the unmanned flight system 100 can include a drone 110, a pan/tilt head 120, a display device 130, and a steering device 140.
  • the drone 110 can include a power system 150, a flight control system 160, and a rack 170.
  • the drone 110 can be in wireless communication with a ground control device, which can include the steering device 140 and/or the display device 130.
  • Rack 170 can include a fuselage and a stand (also known as a landing gear).
  • the fuselage may include a center frame and one or more arms coupled to the center frame, the one or more arms extending radially from the center frame.
  • The stand is coupled to the fuselage and is used to support the drone 110 when it lands.
  • The power system 150 may include an electronic governor (referred to as an ESC) 151, one or more propellers 153, and one or more motors 152 corresponding to the one or more propellers 153, wherein the motor 152 is coupled between the electronic governor 151 and the propeller 153, and the motor 152 and the propeller 153 are disposed on the corresponding arm; the electronic governor 151 is configured to receive a driving signal generated by the flight control system 160 and to provide a driving current to the motor 152 according to the driving signal, so as to control the rotational speed of the motor 152.
  • Motor 152 is used to drive the propeller to rotate to power the flight of drone 110, which enables drone 110 to achieve one or more degrees of freedom of motion.
  • the drone 110 can be rotated about one or more axes of rotation.
  • The above-described rotation axes may include a roll axis, a yaw axis, and a pitch axis.
  • the motor 152 can be a DC motor or an AC motor.
  • the motor 152 may be a brushless motor or a brush motor.
  • Flight control system 160 may include flight controller 161 and sensing system 162.
  • the sensing system 162 is used to measure the attitude information of the drone, that is, the position information and state information of the drone 110 in space, for example, three-dimensional position, three-dimensional angle, three-dimensional speed, three-dimensional acceleration, and three-dimensional angular velocity.
  • the sensing system 162 may include, for example, at least one of a gyroscope, an electronic compass, an Inertial Measurement Unit (IMU), a vision sensor, a global navigation satellite system, and a barometer.
  • The global navigation satellite system can be, for example, the Global Positioning System (GPS) or another satellite navigation system.
  • the flight controller 161 is used to control the flight of the drone 110, for example, the flight of the drone 110 can be controlled based on the attitude information measured by the sensing system 162. It should be understood that the flight controller 161 may control the drone 110 in accordance with pre-programmed program instructions, or may control the drone 110 in response to one or more control commands from the steering device 140.
  • the pan/tilt 120 can include an ESC 121 and a motor 122.
  • the pan/tilt is used to carry the photographing device 123.
  • the flight controller 161 can control the motion of the platform 120 through the ESC 121 and the motor 122.
  • the platform 120 may further include a controller for controlling the movement of the platform 120 by controlling the ESC 121 and the motor 122.
  • the platform 120 can be independent of the drone 110 or a portion of the drone 110.
  • the motor 122 can be a DC motor or an AC motor.
  • the motor 122 may be a brushless motor or a brush motor.
  • the gimbal can be located at the top of the drone or at the bottom of the drone.
  • the photographing device 123 may be, for example, a device for capturing an image such as a camera or a video camera, and the photographing device 123 may communicate with the flight controller and perform photographing under the control of the flight controller.
  • Display device 130 can communicate with the drone 110 wirelessly and can be used to display attitude information of the drone 110.
  • an image taken by the photographing device can also be displayed on the display device 130.
  • the display device 130 may be a stand-alone device or may be disposed in the manipulation device 140.
  • the display device can include a screen.
  • the screen may or may not be a touch screen.
  • the screen can be a light emitting diode (LED) screen, an OLED screen, a liquid crystal display (LCD) screen, a plasma screen, or any other type of screen.
  • the display device can be configured to display a graphical user interface (GUI).
  • the GUI can display an image that can allow the user to control the actions of the UAV.
  • a user can select a target from the image.
  • the target can be a stationary target or a moving target.
  • the user can select the direction of travel from the image.
  • a user may select a portion of the image (eg, a point, region, and/or object) to define the target and/or direction.
  • the user can select the target and/or direction by directly touching the screen (eg, a touch screen).
  • the user can touch a part of the screen.
  • the user can touch that portion of the screen by touching a point on the screen.
  • the user may select an area from a pre-existing set of regions on the screen, or may draw a border for one region, or specify a portion of the plane in any other manner.
  • Users can select the portion of the image, to select the target and/or direction, by means of user interaction devices (e.g., a mouse, joystick, keyboard, trackball, touch panel, buttons, verbal commands, gesture recognition, attitude sensors, thermal sensors, touch capacitive sensors, or any other device).
  • the touch screen can be configured to detect a user's touch location, touch duration, touch pressure, and/or touch motion, wherein each of the above-described touch modes can indicate a particular input command of the user.
  • the image on the display device can show a view collected by means of the payload of the movable object.
  • an image collected by the imaging device can be displayed on the display device.
  • This can be considered a first-person view (FPV).
  • A single imaging device can be provided, and a single FPV can be provided.
  • a plurality of imaging devices having different fields of view may be provided.
  • The view can be switched among the plurality of FPVs, or the plurality of FPVs can be displayed simultaneously.
  • the plurality of FPVs may correspond to (or be generated by) different imaging devices that may have different fields of view.
  • the user at the user terminal can select a portion of the image collected by the imaging device to specify the target and/or direction of motion of the movable object.
  • the image on the display device can display a map that can be generated by means of information from the payload of the movable object.
  • This map may optionally be generated by means of a plurality of imaging devices (eg, a right camera, a left camera, or more), which may utilize stereo mapping techniques. In some cases, this map may be generated based on location information about the UAV relative to the environment, the imaging device relative to the environment, and/or the UAV relative to the imaging device.
  • The location information may include attitude information, spatial location information, angular velocity, linear velocity, angular acceleration, and/or linear acceleration.
  • Such a map may alternatively be generated by means of one or more additional sensors, such as described in more detail elsewhere herein.
  • Such a map can be a two-dimensional map or a three-dimensional map. The view can be switched between the two-dimensional map and the three-dimensional map, or the two can be displayed simultaneously.
  • the user at the user terminal can select a portion of this map to specify the target and/or direction of motion of the movable object.
  • The view may be switched between one or more FPVs and one or more map views, or the one or more FPVs and one or more map views may be displayed simultaneously.
  • the user can use either of these videos to select a target or direction.
  • the portion selected by the user may include the target and/or direction.
  • the user can select this portion using any of the selection techniques described.
  • the image may be provided in a 3D virtual environment displayed on a user terminal (eg, a virtual reality system or an augmented reality system).
  • the 3D virtual environment can optionally correspond to a 3D map.
  • the virtual environment can include a plurality of points or objects that can be manipulated by a user. The user can manipulate these points or objects through a variety of different actions in the virtual environment. Examples of such actions may include selecting one or more points or objects, dragging and dropping, panning, rotating, spinning, pushing, pulling, zooming in, zooming out, and the like. Any type of moving action on these points or objects in a three dimensional virtual space is conceivable.
  • a user at the user terminal can manipulate these points or objects in the virtual environment to control the flight path of the UAV and/or the motion characteristics of the UAV.
  • the handling device 140 can communicate with the drone 110 wirelessly for remote manipulation of the drone 110.
  • The steering device 140 can be, for example, a remote controller, or a user terminal, such as a laptop, equipped with an application (English: Application, abbreviation: APP) that controls the drone. Because such a terminal device is configured with a touch screen, the user can output flight control instructions or camera instructions to the drone through the touch screen of the terminal device.
  • The user's input is received by the steering device, and the drone can be controlled through input devices such as a dial wheel, buttons, a joystick, or a user interface (UI) on the user terminal.
  • the execution body of the flight control method of the present invention may be a drone in an unmanned flight system or a ground control device in an unmanned flight system, and is not limited herein.
  • FIG. 2 is a flowchart of a flight control method according to an embodiment of the present invention. As shown in FIG. 2, the method in this embodiment may include:
  • the image may be, for example, an image displayed in an interactive interface, and the specified location may be determined by an operation of the interactive interface.
  • the image shows an obstacle surface such as a ground or a ceiling.
  • When the user wants to control the drone to fly toward a certain point on the ground or in a certain direction on the ceiling, the user performs a contact operation, through the interactive interface, on that position on the ground or on the ceiling in the image; accordingly, the position corresponding to the contact operation is taken as the pointing position in this embodiment.
  • The specified location may be obtained based on selected points in one or more images. These images may be captured by the imaging device on the drone at the current location. When the user selects one or more points in the image on the display, at least a portion of the specified location displayed in the image can be selected. In some cases, selecting the one or more points may cause a selection indicator to be displayed at the specified location in the image.
  • the selected point in the one or more images may be associated with a set of image coordinates.
  • The target can be located at a second target location associated with a set of world coordinates.
  • a transformation from the set of image coordinates to the set of world coordinates can be generated.
  • a direction vector from the current location to the second target location can be calculated.
  • a path for controlling the flight of the drone can be generated.
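The transformation from image coordinates to a world-frame direction vector described in these steps can be sketched with a pinhole camera model. The intrinsics (fx, fy, cx, cy) and the camera-to-world rotation matrix are hypothetical inputs, since the publication does not specify a camera model:

```python
import numpy as np

def pixel_to_world_ray(u, v, fx, fy, cx, cy, R_cam_to_world):
    """Convert a selected pixel (u, v) into a unit direction vector in world
    coordinates, using pinhole intrinsics and the 3x3 camera-to-world
    rotation matrix."""
    ray_cam = np.array([(u - cx) / fx, (v - cy) / fy, 1.0])  # ray in the camera frame
    ray_world = R_cam_to_world @ ray_cam                     # rotate into the world frame
    return ray_world / np.linalg.norm(ray_world)             # normalize to unit length
```

A pixel at the principal point with an identity rotation yields the camera's optical axis as the world direction.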
  • selected points in the initialization image can be received from the user.
  • The initialization image can be included within the one or more images. Multiple target candidates may be provided for the user to choose from; each of the target candidates can be indicated using a bounding box. When the user selects the bounding box associated with a target candidate, the selected target candidate can be received as the target.
  • a projection transformation of the first object in the one or more images may be obtained based on state information of the imaging device.
  • the status information of the imaging device may be determined based on position and orientation information of the drone and posture information of the imaging device.
  • Determining the first target according to the specified position in the image specifically means determining the position of the first target in the real world (i.e., its world coordinates), or, alternatively, determining the orientation of the first target relative to the drone in the real world.
  • When determining the position of the first target in the real world, a single imaging device or a plurality of imaging devices may be used.
  • the imaging device can be translated in a lateral direction relative to the target and perpendicular to the direction from the imaging device to the first target (by moving the movable object).
  • the imaging device can capture a plurality of images during this lateral translation.
  • The plurality of images may be provided to the image analyzer, which then calculates the distance from the first target to the movable object based on: (1) the change of the first target across the plurality of images, and (2) the travel distance of the movable object during the lateral translation.
  • the distance covered during the lateral translation can be recorded by the imaging device and/or the IMU on the movable object.
  • the distance covered during the lateral translation can be obtained from one or more Global Navigation Satellite Systems (GNSS).
  • The GNSS receiver on the imaging device and/or the movable object can determine the estimated position, velocity, and time (PVT) by processing the signals broadcast by the satellites.
  • the PVT information can be used to calculate the distance covered during the lateral translation.
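The distance calculation from the lateral translation reduces to motion parallax: as the camera moves sideways, the target's image position shifts, and the shift is inversely proportional to distance. A minimal sketch (the variable names are assumptions):

```python
def depth_from_lateral_motion(baseline_m, focal_px, pixel_shift):
    """Estimate the distance to the target from the parallax observed during
    a lateral translation: the camera travels `baseline_m` metres
    perpendicular to the viewing direction while the target's image position
    shifts by `pixel_shift` pixels (focal length given in pixels)."""
    if pixel_shift == 0:
        raise ValueError("no parallax: target at infinity or no motion")
    return focal_px * baseline_m / abs(pixel_shift)
```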
  • the IMU may be an electronic device configured to measure and report the speed, orientation, and gravity of the UAV using a combination of multiple accelerometers and multiple gyroscopes.
  • a magnetometer can optionally be included.
  • the IMU can use one or more accelerometers to detect the current rate of acceleration and one or more gyroscopes to detect changes in rotational properties (like pitch, roll, and yaw).
  • a magnetometer can be included to assist in calibrating for orientation deviation.
  • A single imaging device can be used to determine the first target; the imaging device may be, for example, a time-of-flight (TOF) camera.
  • the first target can be determined without moving the TOF camera.
  • A time-of-flight camera is a range imaging camera system that resolves distance based on the known speed of light, by measuring the time of flight of an optical signal between the camera and the object for each point of the image. In some cases, tracking accuracy can be improved with a TOF camera.
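The TOF ranging principle mentioned here is simply half the round-trip travel time of the optical signal multiplied by the speed of light:

```python
def tof_distance(round_trip_s, c=299_792_458.0):
    """Distance from a time-of-flight measurement: the optical signal
    travels to the object and back, so halve the round trip."""
    return c * round_trip_s / 2.0
```

A round trip of one microsecond corresponds to a range of roughly 150 metres.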
  • FIG. 3 illustrates an example in which a plurality of imaging devices can be used to determine a first target.
  • a first imaging device 304 and a second imaging device 306 can be provided.
  • the first imaging device and the second imaging device may be arranged at different locations.
  • the first imaging device can be a payload carried by the movable object 302, and the second imaging device can be located on or within the movable object.
  • the first imaging device can be a camera and the second imaging device can be a binocular vision sensor.
  • the first imaging device and the second imaging device can be part of the same binocular camera.
  • the first IMU may be arranged on a payload, such as on the first imaging device itself, or on a carrier that couples the payload to the movable object.
  • the second IMU can be located within the body of the movable object.
  • the first imaging device and the second imaging device can have different optical axes.
  • the first imaging device can have a first optical axis 305 and the second imaging device can have a second optical axis 307.
  • the first imaging device and the second imaging device may belong to different inertial frame of reference that move independently of each other.
  • the first imaging device and the second imaging device may belong to the same inertial frame of reference.
  • the first imaging device can be configured to capture an image 310 that is displayed on an output device of the user terminal.
  • the second imaging device can be configured to capture a binocular image 314 comprising a left eye image 314-1 and a right eye image 314-2.
  • the first imaging device and the second imaging device can capture a plurality of images of one target 308.
  • the position of the first target in the captured images may be different because the first imaging device and the second imaging device are at different locations.
  • the location 308' of the target in image 310 can be located at the bottom right corner of the image.
  • the position 308-1' of the target in the left eye image 314-1 and the position 308-2' of the target in the right eye image 314-2 may be located in the left portion of the corresponding left and right eye images.
  • the locations 308-1' and 308-2' in the left and right eye images may also be slightly different due to binocular vision.
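The slight difference between positions 308-1' and 308-2' in the left and right eye images is the stereo disparity, from which depth follows for a rectified binocular pair. This is the standard triangulation formula, not one spelled out in the publication:

```python
def stereo_depth(focal_px, baseline_m, x_left_px, x_right_px):
    """Depth of a point seen by a rectified binocular pair: disparity is the
    horizontal offset of the point between the left and right images, and
    depth = focal_length * baseline / disparity."""
    disparity = x_left_px - x_right_px
    if disparity <= 0:
        raise ValueError("non-positive disparity: point cannot be triangulated")
    return focal_px * baseline_m / disparity
```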
  • the difference in position between the first imaging device and the second imaging device may be determined based on real-time position information obtained from the first IMU and the second IMU.
  • the real-time location information from the first IMU may indicate the actual location of the first imaging device because the first IMU is mounted on the payload.
  • The real-time location information from the second IMU may indicate the actual position of the second imaging device, because the second IMU is mounted on the body of the movable object where the second imaging device is located.
  • the flight controller may adjust the pose of the movable object and/or payload based on the calculated position difference.
  • the image analyzer can be configured to associate the images obtained by the second imaging device with the images obtained by the first imaging device based on the calculated position differences.
  • the first target may be determined based on an association between the images of the first and second imaging devices and a difference in position of the first and second imaging devices at different times.
  • the actual location of the first target is not known. Tracking can be based primarily on the size and/or location of the first target in the image.
  • the movable object can be configured to move toward the target until the size of the first target within the image reaches a predetermined threshold.
  • The imaging device of the movable object may zoom the lens in on the first target, without the movable object moving, until the size of the first target within the image reaches a predetermined threshold.
  • the imaging device can zoom in and the movable object can move toward the target object at the same time until the size of the target reaches a predetermined threshold within the image.
  • the actual location of the first target may be known.
  • the size of the first target within the image includes a feature length of the first target within the image.
  • The feature length of the first target within the image may be based on the most significant size scale of the first target.
  • the most significant size scale for the target may be represented by the length, width, height, thickness, radians, and/or circumference of the salient portion of the first target.
  • the predetermined threshold may be defined based on the width of the image.
  • the movable object can be configured to move toward the first target and/or can actuate the imaging device until a first target within the image is displayed in the target area.
  • the target area can be the central portion of the image, as well as any other portion of the image.
  • Such actuation of the imaging device in n degrees of freedom may be accomplished using a carrier (eg, a pan/tilt).
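The approach-until-threshold behaviour described above can be sketched as a simple predicate on the target's feature length relative to the image width; the threshold fraction is an illustrative assumption:

```python
def should_keep_approaching(feature_len_px, image_width_px, threshold_frac=0.25):
    """True while the target's feature length (its most significant size scale
    in the image) is still below the predetermined threshold, defined here as
    a fraction of the image width."""
    return feature_len_px < threshold_frac * image_width_px
```

The movable object moves toward the target (and/or the imaging device zooms in) while this predicate remains true.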
  • the movable object can be configured to move from the first position to the second position along the path.
  • the surrounding environment may include an obstacle in the path between the movable object and the first target. These obstacles may be stationary, mobile, or in motion.
  • Information about the external environment is necessary for the movable object to circumvent such obstacles by re-planning its path in real time.
  • information about the external environment may be provided as a 3D map based on one or more images captured by one or more imaging devices. The flight path of the movable object may be generated by using the 3D map.
  • FIG. 4 illustrates the angle between the horizontal plane and the line connecting the first target and the current position of the drone, provided by an embodiment of the present invention.
  • An obstacle avoidance range is preset in front of the drone. When the pointing position is within the obstacle avoidance range, the drone flies toward the pointing position according to the pointing flight mode; when the pointing position exceeds this obstacle avoidance range, the drone cannot fly toward the pointing position.
  • The first preset angle is determined according to the obstacle avoidance range of the drone, such that when the angle between the horizontal plane and the line connecting the first target and the current position of the drone is greater than the first preset angle, the first target does not fall within the obstacle avoidance range of the drone.
  • When the angle is greater than the first preset angle, the flight mode of the drone is determined according to the size of the angle, and the drone is controlled to fly, according to the determined flight mode, to the second target, where the distance between the second target and the first target is not less than a preset distance.
  • one way is to calculate the coordinates of the second target, generate a path from the current position to the second target according to the coordinates of the current position and the coordinates of the second target, and then control the drone to fly to the second target according to the path.
  • the geographic coordinates of the pointing position in the geographic environment can be calculated according to the geographical environment (three-dimensional environment) in the image.
  • the direction vector of the pointing position in the image is acquired, and the intersection between the direction vector and the obstacle surface (for example, the ground or the ceiling) in the image is determined, and the geographical coordinates of the intersection are taken as the geographical coordinates of the designated position.
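Finding the intersection between the pointing direction vector and a horizontal obstacle surface (ground or ceiling) is a ray-plane intersection. A minimal sketch, assuming the surface is the horizontal plane z = plane_z:

```python
import numpy as np

def intersect_ray_with_plane(origin, direction, plane_z=0.0):
    """Intersect the pointing ray (from the drone, along the direction vector
    of the selected pixel) with a horizontal surface at altitude `plane_z`.
    Returns the intersection's world coordinates, or None if the ray is
    parallel to the surface or points away from it."""
    origin = np.asarray(origin, dtype=float)
    direction = np.asarray(direction, dtype=float)
    if abs(direction[2]) < 1e-12:
        return None  # ray parallel to the surface: no intersection
    t = (plane_z - origin[2]) / direction[2]
    if t <= 0:
        return None  # the surface lies behind the drone along this ray
    return origin + t * direction
```

The returned coordinates then serve as the geographic coordinates of the specified position.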
  • one way is: a target direction for the drone to move is determined based on the specified position in the image; when the drone, flying along the target direction, comes within the preset distance of the obstacle surface (i.e., the plane where the first target is located), the flight direction is changed until the drone reaches the second target, which may be a target located above the first target.
  • the drone finally flies to a second target that is not less than the preset distance from the first target, so that the drone does not easily touch an obstacle, ensuring the flight safety of the drone.
  • the target direction of the drone can be dynamically adjusted such that the drone evades one or more obstacles in the target direction.
  • the attitude of the imaging device and/or UAV can be adjusted to maintain the first target within the field of view of the imaging device when the drone circumvents the one or more obstacles.
  • the yaw angle movement and translational movement of the drone can be controlled to maintain the first target within the field of view.
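Keeping the first target within the field of view by adjusting the yaw angle can be sketched as follows; the half field-of-view value and the position frame are illustrative assumptions, not values from the patent:

```python
import math

def yaw_correction(current_yaw, drone_pos, target_pos, half_fov):
    """Yaw adjustment (radians) that keeps the target inside a
    horizontal field of view of +/- `half_fov` around the heading;
    returns 0.0 when no adjustment is needed."""
    desired = math.atan2(target_pos[1] - drone_pos[1],
                         target_pos[0] - drone_pos[0])
    # Wrap the heading error into (-pi, pi].
    error = math.atan2(math.sin(desired - current_yaw),
                       math.cos(desired - current_yaw))
    return 0.0 if abs(error) <= half_fov else error

# After a sidestep the target sits 90 degrees to the left of the heading.
adjust = yaw_correction(0.0, (0.0, 0.0, 10.0), (0.0, 20.0, 10.0),
                        half_fov=math.radians(35))
```

A real controller would feed this correction to the flight controller or gimbal incrementally rather than applying it in one step.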
  • when the target (which may be the first target or the second target) no longer appears in the one or more images and/or within the field of view of the imaging device, it may be determined that the flight toward the target has failed.
  • the position and orientation of the movable object and/or the pose of the imaging device can be adjusted to recapture the target in one or more subsequent images.
  • the one or more subsequent images can be analyzed to detect the target and, once detected, can fly toward the target.
  • the distance and/or speed of the target relative to the drone can be obtained.
  • the drone may fly toward the target based on the distance and/or speed of the target relative to the UAV.
  • the flight path of the drone may be an optimized route between the current location (associated with the drone) and the target (associated with the first target or the second target).
  • the path may be optimized based on one or more parameters including flight distance, time of flight, energy consumption, altitude, weather effects including wind direction and wind speed, and/or tracking of the target (eg, rate and direction of the target).
  • the path can also be optimized to cause the drone to circumvent one or more obstacles between the current location and the target.
  • the path can include multiple lines and/or multiple curves.
  • the path can be configured to minimize the energy consumption of the drone when the drone moves from the current location to the target.
  • the path can be configured to minimize the impact of weather on drone movement.
  • This path can be optimized based on wind speed and wind direction.
  • the path can be configured to reduce the drone's upwind movement.
  • the path can be configured to account for changes in altitude and pressure as the drone moves toward the target.
  • the path may be configured based on a surrounding landscape between the current location and the second target.
  • the path can be configured to take into account the man-made structures and natural terrain that are present in the surrounding landscape.
  • the path may be configured to pass around/over/under the obstacles in the path between the current location and the second target, such as man-made structures and natural terrain.
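One way to combine the optimization parameters above is a weighted cost per path segment; the weight values, and the treatment of wind as an upwind penalty, are illustrative assumptions rather than the patent's formula:

```python
import math

def path_cost(segments, wind_dir, wind_speed, w_dist=1.0, w_wind=0.5):
    """Weighted score of a candidate path (lower is better): flight
    distance plus a penalty for upwind travel.  `segments` is a list of
    (dx, dy) displacements; `wind_dir` is a unit (x, y) vector pointing
    where the wind blows toward."""
    total = 0.0
    for dx, dy in segments:
        length = math.hypot(dx, dy)
        if length == 0.0:
            continue
        # Fraction of the segment flown against the wind.
        upwind = max(0.0, -(dx * wind_dir[0] + dy * wind_dir[1]) / length)
        total += w_dist * length + w_wind * wind_speed * upwind * length
    return total

# 100 m straight into a 5 m/s wind vs. the same distance crosswind.
into_wind = path_cost([(100.0, 0.0)], wind_dir=(-1.0, 0.0), wind_speed=5.0)
crosswind = path_cost([(0.0, 100.0)], wind_dir=(-1.0, 0.0), wind_speed=5.0)
```

A planner would evaluate this cost over candidate paths and keep the cheapest; further terms (time, energy, altitude change) can be added in the same weighted-sum form.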
  • a 3D model of the surrounding landscape can be obtained based on: (1) one or more images captured by one or more imaging devices on the drone, and (2) a topographic map obtained from Global Positioning System (GPS) data.
  • the GPS data can be provided from a server to a user terminal for controlling the drone.
  • the path can be configured such that when the drone moves from the current position to the target, the point of interest is maintained within the field of view of the imaging device of the drone, wherein the point of interest may be the target and/or another object.
  • when the first target falls within the obstacle avoidance range of the drone, the flight mode of the drone is determined to be the pointing flight mode according to the prior-art scheme, and the drone flies toward the first target according to the pointing flight mode.
  • this embodiment can control the drone to fly toward the second target, so that the drone flies to a second target at a preset distance from the first target; the drone therefore does not easily touch an obstacle, which ensures the flight safety of the drone and at the same time expands the range of target positions for the drone's pointing flight.
  • the drone flies to the second target in the following manner: a target direction for the drone to move is determined based on the specified position in the image; when the drone, flying along the target direction, comes within the preset distance of the obstacle surface (i.e., the plane where the first target is located), the flight direction is changed until the drone reaches the second target. The flight path of the drone can take multiple forms; examples follow.
  • the second target is located on a first horizontal plane
  • the first horizontal plane is a horizontal plane that is at a preset distance from the first target.
  • an implementation manner of determining the flight mode of the drone according to the size of the angle is: when the angle is greater than the first preset angle (as shown in FIG. 5), the flight mode of the drone is determined to be: flying from the current position to the first horizontal plane, and then flying along the first horizontal plane to the second target; for example, as shown in FIG.
  • a feasible implementation manner of the foregoing S203 is: controlling the drone to fly from the current position to the first horizontal plane, the vertical speed of the drone being 0 when it reaches the first horizontal plane, and then controlling the drone to fly along the first horizontal plane to the second target; when the drone reaches the second target, its horizontal speed also drops to zero.
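This "fly to the first horizontal plane, then level flight to the second target" mode can be sketched as a two-leg waypoint list; choosing the plane entry point directly above the current position is a simplifying assumption (the embodiments also allow other entry points), and all coordinates are hypothetical:

```python
def two_leg_path(current, first_target, preset_distance):
    """Waypoints for the mode 'fly to the first horizontal plane, then
    level flight to the second target': the plane sits `preset_distance`
    above the first target, and the second target lies directly above
    the first target on that plane."""
    plane_z = first_target[2] + preset_distance  # first horizontal plane
    second_target = (first_target[0], first_target[1], plane_z)
    entry = (current[0], current[1], plane_z)    # reach the plane first
    return [current, entry, second_target]

path = two_leg_path(current=(0.0, 0.0, 100.0),
                    first_target=(60.0, 0.0, 0.0), preset_distance=20.0)
```

The first leg ends with zero vertical speed at the plane; the second leg is flown level along the plane.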
  • the second target is located on a first horizontal plane
  • the first horizontal plane is a horizontal plane that is at a preset distance from the first target.
  • an implementation manner of determining the flight mode of the drone according to the size of the angle is: when the angle is greater than the first preset angle (as shown in FIG. 5), the flight mode of the drone is determined to be: flying from the current position toward the first target to a first position, the first position being located on the side of the first horizontal plane facing away from the first target, and then flying from the first position to the second target along an arc trajectory; for example, as shown in FIG.
  • a feasible implementation manner of the foregoing S203 is: controlling the drone to fly from the current position toward the first target to the first position, wherein the distance between the first position and the first target in the vertical direction is greater than the distance between the first horizontal plane and the first target in the vertical direction, and then controlling the drone to fly from the first position to the second target along the curved trajectory.
  • the second target is located on a first horizontal plane
  • the first horizontal plane is a horizontal plane that is at a preset distance from the first target.
  • an implementation manner of determining the flight mode of the drone according to the size of the angle is: when the angle is greater than the first preset angle and less than a second preset angle, the second preset angle being greater than the first preset angle (as shown in FIG. 8), the flight mode of the drone is determined to be: flying from the current position to the first horizontal plane, and then flying along the first horizontal plane to the second target; for example, see FIG.
  • the second target is located on a first horizontal plane
  • the first horizontal plane is a horizontal plane that is at a preset distance from the first target.
  • an implementation manner of determining the flight mode of the drone according to the size of the angle is: when the angle is greater than the first preset angle and less than a second preset angle, the second preset angle being greater than the first preset angle (as shown in FIG. 8), the flight mode of the drone is determined to be: flying from the current position toward the first target to a first position, the first position being located on the side of the first horizontal plane facing away from the first target, and then flying from the first position to the second target along an arc trajectory; for example, as shown in FIG.
  • the flight of the drone toward the first target may be along the line connecting the current position to the first target, or may deviate from that line; as long as the flight process brings the drone closer to the first target, it belongs to the solution of the embodiment of the present invention.
  • flying from the current position to the first horizontal plane includes: flying from the current position to a second position on the first horizontal plane, the second position being the intersection of the line connecting the first target and the current position with the first horizontal plane.
  • that is, the drone flies toward the first horizontal plane along the direction of the line connecting the current position and the first target, and the position at which it reaches the first horizontal plane is the second position, i.e., the intersection of that line with the first horizontal plane; when controlling the drone to fly to the second position on the first horizontal plane, the vertical speed of the drone drops to zero.
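The second position, being the intersection of the line through the first target and the current position with the first horizontal plane, can be computed by linear interpolation along that line; the coordinates below are hypothetical:

```python
def second_position(current, first_target, preset_distance):
    """Intersection of the line connecting the first target and the
    current position with the first horizontal plane (`preset_distance`
    above the first target); None if the drone is not above that plane."""
    denom = current[2] - first_target[2]
    if denom <= preset_distance:
        return None  # current position is on or below the plane
    t = preset_distance / denom  # parameter along first target -> current
    return tuple(ft + t * (c - ft) for ft, c in zip(first_target, current))

pos = second_position(current=(0.0, 0.0, 100.0),
                      first_target=(80.0, 0.0, 0.0), preset_distance=25.0)
```

The drone descends along the line to this point, arriving with zero vertical speed, and then continues level to the second target.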
  • flying from the current position to the first horizontal plane may also include: flying from the current position toward the first target to a third position, the third position being located on the side of the first horizontal plane facing away from the first target; and then flying from the third position to the first horizontal plane along an arc trajectory.
  • in FIG. 10, the drone flies toward the first horizontal plane along the direction of the line between the current position and the first target until it reaches the third position; however, the embodiment is not limited to flying along the line between the current position and the first target, and the drone may also fly toward the first horizontal plane along the direction of the line between the current position and the second target.
  • the distance between the third position and the first target in the vertical direction is greater than the distance between the first horizontal plane and the first target in the vertical direction; the drone is then controlled to fly to the first horizontal plane along the curved trajectory, and when it reaches the first horizontal plane, its vertical speed drops to zero.
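An arc that flattens out onto the first horizontal plane, so that the vertical speed is zero on arrival, can be sampled as a quarter ellipse; the horizontal reach and step count below are illustrative assumptions:

```python
import math

def arc_to_plane(third_pos, plane_z, horizontal_reach, steps=8):
    """Quarter-ellipse arc from the third position down onto the first
    horizontal plane: motion is vertical at the start and horizontal on
    arrival, so the vertical speed is zero when the plane is reached."""
    x0, y0, z0 = third_pos
    drop = z0 - plane_z
    points = []
    for i in range(steps + 1):
        theta = (math.pi / 2) * i / steps
        points.append((x0 + horizontal_reach * (1.0 - math.cos(theta)), y0,
                       plane_z + drop * (1.0 - math.sin(theta))))
    return points

arc = arc_to_plane(third_pos=(0.0, 0.0, 40.0), plane_z=25.0,
                   horizontal_reach=15.0)
```

The sampled points can be fed to a trajectory follower; the ellipse shape is one of many curves with the required zero-vertical-tangent arrival.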
  • another implementation manner of determining the flight mode of the drone according to the size of the angle is: when the angle is greater than or equal to a second preset angle (as shown in FIG. 11), the second preset angle being greater than the first preset angle, the flight mode of the drone is determined to be: flying from the current position to the second target along the horizontal plane where the current position is located, the second target being at the same horizontal plane as the current position, with the line connecting the second target and the first target perpendicular to the horizontal plane; for example, as shown in FIG.
  • a feasible implementation manner of the foregoing S203 is: controlling the drone to fly from the current position (at which point the vertical speed of the drone has been reduced to 0) to the second target along the horizontal plane where the current position is located (upon arrival, the horizontal speed of the drone is reduced to 0); the second target is located directly above the first target, and the vertical distance between the second target and the first target is equal to the vertical distance between the current position and the first target.
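The three cases above (at most the first preset angle, between the two preset angles, at least the second preset angle) can be summarized in one dispatch function; the threshold values are hypothetical:

```python
def choose_flight_mode(angle_deg, first_preset=30.0, second_preset=60.0):
    """Dispatch on the included angle: at or below the first threshold
    the ordinary pointing flight mode is kept; between the thresholds
    the drone goes via the first horizontal plane (directly or along an
    arc); at or above the second threshold it flies level to a second
    target straight above the first target."""
    if angle_deg <= first_preset:
        return "pointing_flight"
    if angle_deg < second_preset:
        return "via_first_horizontal_plane"
    return "level_flight_to_point_above_target"

modes = [choose_flight_mode(a) for a in (20.0, 45.0, 75.0)]
```

In practice the thresholds would be derived from the drone's obstacle avoidance range, as the embodiments describe for the first preset angle.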
  • when the execution body of the method is a ground control device, determining the specified position in the image as the first target comprises: acquiring a frame-selection operation through the interaction interface; and, when the object selected by the frame-selection operation does not belong to a preset type, determining that the position selected by the frame is the first target. For example, as shown in FIG. 13, the image captured by the drone through the photographing device is displayed through the interactive interface.
  • the user can perform a frame-selection operation on an object through the interactive interface.
  • the ground control device of this embodiment acquires the frame-selection operation through the interactive interface, obtains the object in the image selected by the frame, and determines whether the object belongs to a preset type (for example, a person, a car, etc.). When the object does not belong to the preset type, the position of the object in the image selected by the frame (i.e., the specified position) is determined to be the first target; then, when the angle between the line connecting the first target and the current position of the drone and the horizontal plane is greater than the first preset angle, the solution shown in the above S202 and S203 is performed; when that angle is less than or equal to the first preset angle, the drone's flight is controlled according to the pointing flight mode. When the object in the image is a preset follower object, the drone's flight is controlled according to the tracking flight mode.
  • when the object in the image selected by the frame-selection operation belongs to a preset type, the object is determined to be a target-following object, and the drone is controlled to follow the object in flight.
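The frame-selection branch (preset type leads to tracking flight; anything else makes the selected position the first target for pointing flight) can be sketched as follows; the class names and preset-type list are hypothetical:

```python
def handle_frame_selection(detected_class, box_center,
                           preset_types=("person", "car")):
    """Decision after a frame-selection operation: objects of a preset
    type become a follow target (tracking flight mode); anything else
    makes the selected position the first target (pointing flight)."""
    if detected_class in preset_types:
        return ("tracking_mode", detected_class)
    return ("pointing_mode", box_center)

follow = handle_frame_selection("person", (320, 240))
point = handle_frame_selection("rooftop", (100, 80))
```

In the pointing case the box position would then be projected into geographic coordinates before the angle test of S202 is applied.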
  • when the execution body of the method is a ground control device, the method further comprises: displaying a preset icon at the specified position in the image; and, after the flight mode of the drone is determined according to the size of the angle, moving the preset icon displayed at the specified position in the image to the position in the image corresponding to the second target.
  • a preset icon is displayed at the specified position in the image, as shown in FIG. 14, to indicate to the user that the position in the image has been successfully specified.
  • the preset icon displayed at the specified position in the image is moved from the specified position to the position corresponding to the second target in the image, to indicate that the drone will fly to the second target. As shown in FIG. 15, this indicates that although the first target is at the specified position, the drone is controlled to fly to the second target, so as to avoid touching an obstacle and ensure flight safety.
  • FIG. 16 is a schematic structural diagram of a flight control apparatus according to Embodiment 1 of the present invention.
  • the flight control apparatus 400 of this embodiment may include: a target determining module 401, an airplane mode determining module 402, and a control module 403.
  • a target determining module 401 configured to determine a first target according to a specified position in the image
  • the flight mode determining module 402 is configured to determine the flight mode of the drone according to the size of the angle when the angle between the line connecting the first target and the current position of the drone and the horizontal plane is greater than the first preset angle;
  • the control module 403 is configured to control the drone to fly to the second target according to the determined flight mode, wherein a distance between the second target and the first target is not less than a preset distance.
  • the flight mode determining module 402 is configured to: when the angle is greater than the first preset angle, determine that the flight mode of the drone is: flying from the current position to the first horizontal plane, and then flying along the first horizontal plane to the second target; or determine that the flight mode of the drone is: flying from the current position toward the first target to a first position, the first position being located on the side of the first horizontal plane facing away from the first target, and then flying from the first position to the second target along an arc trajectory;
  • the second target is located on a first horizontal plane, and the first horizontal plane is a horizontal plane whose distance from the first target is the preset distance.
  • the flight mode determining module 402 is configured to: when the angle is greater than the first preset angle and less than the second preset angle, the second preset angle being greater than the first preset angle, determine that the flight mode of the drone is: flying from the current position to the first horizontal plane, and then flying along the first horizontal plane to the second target; or determine that the flight mode of the drone is: flying from the current position toward the first target to a first position, the first position being located on the side of the first horizontal plane facing away from the first target, and then flying from the first position to the second target along an arc trajectory;
  • the second target is located on a first horizontal plane, and the first horizontal plane is a horizontal plane whose distance from the first target is the preset distance.
  • when the flight mode determining module 402 determines that the flight mode of the drone is flying from the current position to the first horizontal plane and then along the first horizontal plane to the second target, flying from the current position to the first horizontal plane includes: flying from the current position to a second position on the first horizontal plane, the second position being the intersection of the line between the first target and the current position and the first horizontal plane.
  • when the flight mode determining module 402 determines that the flight mode of the drone is flying from the current position to the first horizontal plane and then along the first horizontal plane to the second target, flying from the current position to the first horizontal plane may also include: flying from the current position toward the first target to a third position, the third position being located on the side of the first horizontal plane facing away from the first target; and flying from the third position to the first horizontal plane along an arc trajectory.
  • the flight mode determining module 402 is configured to: when the angle is not less than the second preset angle, determine that the flight mode of the drone is: flying along the horizontal plane where the current position is located from the current position to the second target, the second target being at the same level as the current position, with the line connecting the second target and the first target perpendicular to the horizontal plane.
  • the target determining module 401 is specifically configured to: obtain a frame-selection operation through the interaction interface; and, when the object in the image selected by the frame-selection operation does not belong to a preset type, determine that the position selected by the frame is the first target.
  • the target determining module 401 is further configured to: when the object in the image selected by the frame operation frame belongs to a preset type, determine that the object is a target following object;
  • the control module 403 is further configured to, when the object is a target-following object, control the drone to fly following the object.
  • the flight control device 400 of the embodiment further includes: a display module 404.
  • a display module 404 configured to display a preset icon at the specified position in the image; and after the flight mode determining module 402 determines the flight mode of the drone according to the size of the angle, A preset icon displayed at the pointing position in the image is moved to a position in the image corresponding to the second target.
  • the device in this embodiment may be used to implement the technical solutions of the foregoing method embodiments of the present invention, and the implementation principles and technical effects thereof are similar, and details are not described herein again.
  • FIG. 17 is a schematic structural diagram of a flight control apparatus according to Embodiment 2 of the present invention.
  • the flight control apparatus 500 of this embodiment may include: a memory 501 and a processor 502.
  • the memory 501 is coupled to the processor 502 via a bus.
  • the processor 502 may be a central processing unit (CPU); the processor may also be another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, etc.
  • the general-purpose processor may be a microprocessor, or the processor may be any conventional processor or the like.
  • the memory 501 is configured to store code for executing a flight control method
  • the processor 502 is configured to invoke the code stored in the memory 501 and execute: determining a first target according to a specified position in the image; when the angle between the line connecting the first target and the current position of the drone and the horizontal plane is greater than the first preset angle, determining a flight mode of the drone according to the size of the angle; and controlling the drone to fly to a second target according to the determined flight mode, wherein the distance between the second target and the first target is not less than a preset distance.
  • the processor 502 is configured to: when the angle is greater than the first preset angle, determine that the flight mode of the drone is: flying from the current position to the first horizontal plane, and then flying along the first horizontal plane to the second target; or determine that the flight mode of the drone is: flying from the current position toward the first target to a first position, the first position being located on the side of the first horizontal plane facing away from the first target, and then flying from the first position to the second target along an arc trajectory;
  • the second target is located on a first horizontal plane, and the first horizontal plane is a horizontal plane whose distance from the first target is the preset distance.
  • the processor 502 is configured to: when the angle is greater than the first preset angle and less than the second preset angle, the second preset angle being greater than the first preset angle, determine that the flight mode of the drone is: flying from the current position to the first horizontal plane, and then flying along the first horizontal plane to the second target; or determine that the flight mode of the drone is: flying from the current position toward the first target to a first position, the first position being located on the side of the first horizontal plane facing away from the first target, and then flying from the first position to the second target along an arc trajectory;
  • the second target is located on a first horizontal plane, and the first horizontal plane is a horizontal plane whose distance from the first target is the preset distance.
  • flying from the current position to the first horizontal plane includes: flying from the current position to a second position on the first horizontal plane, the second position being the intersection of the line between the first target and the current position and the first horizontal plane.
  • flying from the current position to the first horizontal plane may also include: flying from the current position toward the first target to a third position, the third position being located on the side of the first horizontal plane facing away from the first target; and flying from the third position to the first horizontal plane along an arc trajectory.
  • the processor 502 is configured to: when the angle is not less than the second preset angle, determine that the flight mode of the drone is: along a horizontal plane where the current position is located Flying from the current position to the second target, the second target is at the same level as the current position, and the line connecting the second target and the first target is perpendicular to a horizontal plane.
  • the flight control device 500 in the above embodiment may be a drone or may be a ground control device.
  • the flight control device 500 of the embodiment is a ground control device, and the flight control device 500 further includes: an interaction interface 503.
  • the interactive interface 503 is coupled to the processor 502 via a bus.
  • the interaction interface 503 is configured to detect a frame-selection operation.
  • the processor 502 is specifically configured to: acquire the frame-selection operation through the interaction interface 503; and, when the object in the image selected by the frame-selection operation does not belong to a preset type, determine that the position selected by the frame is the first target.
  • the processor 502 is further configured to: when the object in the image selected by the frame-selection operation belongs to a preset type, determine that the object is a target-following object, and control the drone to fly following the object.
  • the interaction interface 503 is configured to display a preset icon at the specified position in the image; and, after the processor 502 determines the flight mode of the drone according to the size of the angle, move the preset icon displayed at the specified position in the image to the position in the image corresponding to the second target.
  • the device in this embodiment may be used to implement the technical solutions of the foregoing method embodiments of the present invention, and the implementation principles and technical effects thereof are similar, and details are not described herein again.
  • FIG. 18 is a schematic structural diagram of a flight control system of a drone according to an embodiment of the present invention.
  • the flight control system 800 of the drone of the present embodiment includes: a flight control device 600 and a drone 700.
  • the flight control device 600 can adopt the structure of the device embodiment shown in FIG. 16 or FIG. 17, and can correspondingly implement the technical solutions of the foregoing method embodiments of the present invention; the implementation principles and technical effects are similar and are not described here again.

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The present invention relates to a flight control method and apparatus (400). The method comprises: determining a designated position in an image as a first target (S201); when an included angle formed by the line connecting the first target and the current position of an unmanned aerial vehicle with the horizontal plane is greater than a first preset angle, determining a flight mode of the unmanned aerial vehicle according to the size of the included angle (S202); and, according to the determined flight mode, controlling the unmanned aerial vehicle to fly toward a second target, the distance between the second target and the first target being greater than or equal to a preset distance (S203). Even if the included angle formed by the line connecting the first target and the current position of the unmanned aerial vehicle with the horizontal plane is greater than the first preset angle, the unmanned aerial vehicle can still be controlled to fly toward the second target. This means that the unmanned aerial vehicle flies to a second target at a preset distance from the first target, so that the unmanned aerial vehicle does not easily collide with obstacles, which guarantees the flight safety of the unmanned aerial vehicle and also extends the range of target positions to which the unmanned aerial vehicle can fly.
PCT/CN2016/111564 2016-12-22 2016-12-22 Procédé de commande de vol et appareil WO2018112848A1 (fr)

Priority Applications (3)

Application Number Priority Date Filing Date Title
CN202110169187.8A CN112987782A (zh) 2016-12-22 2016-12-22 飞行控制方法和装置
CN201680076224.8A CN108450032B (zh) 2016-12-22 2016-12-22 飞行控制方法和装置
PCT/CN2016/111564 WO2018112848A1 (fr) 2016-12-22 2016-12-22 Procédé de commande de vol et appareil

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2016/111564 WO2018112848A1 (fr) 2016-12-22 2016-12-22 Procédé de commande de vol et appareil

Publications (1)

Publication Number Publication Date
WO2018112848A1 true WO2018112848A1 (fr) 2018-06-28

Family

ID=62624251

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2016/111564 WO2018112848A1 (fr) 2016-12-22 2016-12-22 Procédé de commande de vol et appareil

Country Status (2)

Country Link
CN (2) CN112987782A (fr)
WO (1) WO2018112848A1 (fr)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109540834A (zh) * 2018-12-13 2019-03-29 深圳市太赫兹科技创新研究院 一种电缆老化监测方法及系统
CN109947096B (zh) * 2019-02-25 2022-06-21 广州极飞科技股份有限公司 受控对象的控制方法及装置、无人驾驶系统
CN110673642B (zh) * 2019-10-28 2022-10-28 深圳市赛为智能股份有限公司 无人机着陆控制方法、装置、计算机设备及存储介质
CN113759985A (zh) * 2021-08-03 2021-12-07 华南理工大学 一种无人机飞行控制方法、系统、装置及存储介质
CN114115351A (zh) * 2021-12-06 2022-03-01 歌尔科技有限公司 飞行器的避障方法、飞行器以及计算机可读存储介质

Citations (8)

Publication number Priority date Publication date Assignee Title
CN201804119U (zh) * 2010-08-19 2011-04-20 中国测绘科学研究院 一种机载gps航摄导航控制系统
CN103019250A (zh) * 2012-12-03 2013-04-03 华北电力大学 巡检飞行机器人斜面起飞控制方法
US20150197335A1 (en) * 2012-09-23 2015-07-16 Israel Aerospace Industries Ltd. System, a method and a computer program product for maneuvering of an air vehicle
CN105278543A (zh) * 2015-09-28 2016-01-27 小米科技有限责任公司 提升飞行安全性的方法及装置、电子设备
CN105700550A (zh) * 2016-01-26 2016-06-22 深圳市大疆创新科技有限公司 无人机及其飞行控制方法与系统
CN105867400A (zh) * 2016-04-20 2016-08-17 北京博瑞爱飞科技发展有限公司 无人机的飞行控制方法和装置
CN105955292A (zh) * 2016-05-20 2016-09-21 腾讯科技(深圳)有限公司 一种控制飞行器飞行的方法、移动终端、飞行器及系统
CN105955304A (zh) * 2016-07-06 2016-09-21 零度智控(北京)智能科技有限公司 一种避障方法、避障装置及无人飞行器

Family Cites Families (12)

Publication number Priority date Publication date Assignee Title
US8149210B2 (en) * 2007-12-31 2012-04-03 Microsoft International Holdings B.V. Pointing device and method
TWI408568B (zh) * 2010-06-24 2013-09-11 Hon Hai Prec Ind Co Ltd Handheld device and method for controlling an unmanned aerial vehicle using the same
CN102707724B (zh) * 2012-06-05 2015-01-14 清华大学 Visual positioning and obstacle avoidance method and system for an unmanned aerial vehicle
CN102854886B (zh) * 2012-08-29 2016-01-20 深圳一电科技有限公司 Method and device for flight route editing and control
KR101483058B1 (ko) * 2014-01-21 2015-01-15 엘아이지넥스원 주식회사 Ground control system for unmanned aerial vehicle collision avoidance
US9881021B2 (en) * 2014-05-20 2018-01-30 Verizon Patent And Licensing Inc. Utilization of third party networks and third party unmanned aerial vehicle platforms
CN107577247B (zh) * 2014-07-30 2021-06-25 深圳市大疆创新科技有限公司 Target tracking system and method
CN105517666B (zh) * 2014-09-05 2019-08-27 深圳市大疆创新科技有限公司 Context-based flight mode selection
CN104991563B (zh) * 2015-05-12 2023-10-03 零度智控(北京)智能科技有限公司 Method and system for hierarchical operation of an unmanned aerial vehicle
CN105141851B (zh) * 2015-09-29 2019-04-26 杨珊珊 Control system for an unmanned aerial vehicle, unmanned aerial vehicle, and control method
CN106022274B (zh) * 2016-05-24 2024-01-12 零度智控(北京)智能科技有限公司 Obstacle avoidance method, obstacle avoidance device, and unmanned machine
CN105955298B (zh) * 2016-06-03 2018-09-07 腾讯科技(深圳)有限公司 Automatic obstacle avoidance method and device for an aircraft

Also Published As

Publication number Publication date
CN108450032A (zh) 2018-08-24
CN112987782A (zh) 2021-06-18
CN108450032B (zh) 2021-03-02

Similar Documents

Publication Publication Date Title
US11644832B2 (en) User interaction paradigms for a flying digital assistant
US11914370B2 (en) System and method for providing easy-to-use release and auto-positioning for drone applications
US10860040B2 (en) Systems and methods for UAV path planning and control
US11635775B2 (en) Systems and methods for UAV interactive instructions and control
EP3387507B1 (fr) Systems and methods for unmanned aerial vehicle (UAV) flight control
US11467179B2 (en) Wind estimation system, wind estimation method, and program
US20200346753A1 (en) UAV control method, device and UAV
WO2018112848A1 (fr) Flight control method and apparatus
WO2018098784A1 (fr) Unmanned aerial vehicle control method, device, equipment, and system
KR20180068411A (ko) Method for controlling operation of an unmanned flying electronic device and electronic device supporting the same
JP2022554248A (ja) Structure scanning using an unmanned aerial vehicle
CN109564434B (zh) System and method for positioning a movable object
WO2022094808A1 (fr) Photographing control method and apparatus, unmanned aerial vehicle, device, and readable storage medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16924468

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16924468

Country of ref document: EP

Kind code of ref document: A1